[JIRA] (OVIRT-2552) Remove 4.1 build artifacts jobs before removing the nightly publisher of 4.1
by Ehud Yonasi (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-2552?page=com.atlassian.jir... ]
Ehud Yonasi reassigned OVIRT-2552:
----------------------------------
Assignee: Ehud Yonasi (was: infra)
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100094)
6 years, 2 months
[JIRA] (OVIRT-2552) Remove 4.1 build artifacts jobs before removing the nightly publisher of 4.1
by Ehud Yonasi (oVirt JIRA)
Ehud Yonasi created OVIRT-2552:
----------------------------------
Summary: Remove 4.1 build artifacts jobs before removing the nightly publisher of 4.1
Key: OVIRT-2552
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2552
Project: oVirt - virtualization made easy
Issue Type: Task
Reporter: Ehud Yonasi
Assignee: infra
Priority: Low
This is the list of build-artifacts jobs to remove from the yaml file:
job otopi_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-vmconsole_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-log-collector_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-cli_3.6_build-artifacts-el7-x86_64 still exists
job ovirt-engine-extension-aaa-jdbc_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-extension-aaa-ldap_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-extension-aaa-misc_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-extension-logger-log4j_4.1_build-artifacts-el7-x86_64 still exists
job vdsm-jsonrpc-java_4.1_build-artifacts-el7-x86_64 still exists
job python-ovirt-engine-sdk4_4.1_build-artifacts-el7-x86_64 still exists
job python-ovirt-engine-sdk4_4.1_build-artifacts-el7-ppc64le still exists
job ovirt-engine-sdk-java_4.1_build-artifacts-el7-x86_64 still exists
job java-ovirt-engine-sdk4_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-scheduler-proxy_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-optimizer_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-dashboard_4.1_build-artifacts-el7-x86_64 still exists
job ioprocess_4.1_build-artifacts-el7-x86_64 still exists
job ioprocess_4.1_build-artifacts-el7-ppc64le still exists
job vdsm_4.1_build-artifacts-el7-x86_64 still exists
job vdsm_4.1_build-artifacts-el7-ppc64le still exists
job ovirt-hosted-engine-ha_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-hosted-engine-setup_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-sdk_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine_4.1_build-artifacts-el7-x86_64 still exists
job mom_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-wgt_4.1_build-artifacts-el7-x86_64 still exists
job cockpit-ovirt_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-sdk-ruby_4.1_build-artifacts-el7-x86_64 still exists
job ovirt-engine-sdk-ruby_4.1_build-artifacts-el7-ppc64le still exists
job ovirt-guest-agent_4.1_build-artifacts-el6-x86_64 still exists
job ovirt-guest-agent_4.1_build-artifacts-el7-x86_64 still exists
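For illustration only (this helper is not part of the ticket): the job names above follow a regular `<project>_<version>_build-artifacts-<distro>-<arch>` pattern, so a small parser can turn each name into the fields needed to locate the matching entry in the JJB yaml:

```python
# Hypothetical helper, assumed naming convention taken from the list above:
# <project>_<version>_build-artifacts-<distro>-<arch>
def parse(job):
    # split off everything after the fixed "_build-artifacts-" marker
    name, rest = job.split("_build-artifacts-", 1)
    # project names may contain underscores/dashes, so split version from the right
    project, version = name.rsplit("_", 1)
    distro, arch = rest.split("-", 1)
    return project, version, distro, arch

print(parse("otopi_4.1_build-artifacts-el7-x86_64"))
# -> ('otopi', '4.1', 'el7', 'x86_64')
```

Filtering the parsed tuples for version '4.1' would then yield the projects whose build-artifacts entries need editing.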
[JIRA] (OVIRT-2552) Remove 4.1 build artifacts jobs before removing the nightly publisher of 4.1
by Ehud Yonasi (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-2552?page=com.atlassian.jir... ]
Ehud Yonasi updated OVIRT-2552:
-------------------------------
Epic Link: OVIRT-2044
[oVirt Jenkins] ovirt-system-tests_hc-basic-suite-master - Build #795 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/795/
Build Number: 795
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #795
[Galit] Update the config for poll u/s sources
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 002_bootstrap.add_hosts
Error Message:
Host lago-hc-basic-suite-master-host-1 is in non operational state
-------------------- >> begin captured logging << --------------------
ovirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fd37e46a230>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 258, in _host_is_up_4
raise RuntimeError('Host %s is in non operational state' % api_host.name)
RuntimeError: Host lago-hc-basic-suite-master-host-1 is in non operational state
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 228, in add_hosts
add_hosts_4(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 275, in add_hosts_4
testlib.assert_true_within(_host_is_up_4, timeout=15*60)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 278, in assert_true_within
assert_equals_within(func, True, timeout, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 258, in _host_is_up_4
raise RuntimeError('Host %s is in non operational state' % api_host.name)
RuntimeError: Host lago-hc-basic-suite-master-host-1 is in non operational state
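The failing assertion comes from ovirtlago's poll-until-true helper (`testlib.assert_true_within(_host_is_up_4, timeout=15*60)`). A minimal sketch of that pattern, with assumed names and not ovirtlago's actual implementation:

```python
import time

def assert_true_within(func, timeout, interval=3):
    """Poll func() until it returns truthy, or fail after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if func():
            return
        time.sleep(interval)  # back off between polls
    # mirrors the suite's behavior: the condition never held, so the test fails
    raise AssertionError('condition not met within %s seconds' % timeout)
```

In the failed build the host never left the non-operational state, so the 15-minute deadline expired and the last exception from the polled function was reported.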
[JIRA] (OVIRT-2551) failure to remove journal logs
by Daniel Belenky (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-2551?page=com.atlassian.jir... ]
Daniel Belenky reassigned OVIRT-2551:
-------------------------------------
Assignee: Evgheni Dereveanchin (was: infra)
[JIRA] (OVIRT-2551) failure to remove journal logs
by Daniel Belenky (oVirt JIRA)
Daniel Belenky created OVIRT-2551:
-------------------------------------
Summary: failure to remove journal logs
Key: OVIRT-2551
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2551
Project: oVirt - virtualization made easy
Issue Type: Bug
Reporter: Daniel Belenky
Assignee: infra
Lately we've seen an issue where the systemd-journal logs directory belongs to the root group, so automation (slave_cleanup.sh) fails to remove those logs and eventually fails the entire job. I can think of two reasons to remove journals: saving disk space, and possibly giving every job a clean journal (though that makes debugging harder, since we lose the old logs). Either way, I think the way we remove journal logs is improper - maybe back when David wrote this in 2015 it was the only way, but today we can ask journalctl to handle log rotation for us:
--vacuum-size=, --vacuum-time=, --vacuum-files=
Removes archived journal files until the disk space they use falls below the specified size (specified with the usual "K", "M", "G" and "T" suffixes), or all archived journal
files contain no data older than the specified timespan (specified with the usual "s", "m", "h", "days", "months", "weeks" and "years" suffixes), or no more than the
specified number of separate journal files remain. Note that running --vacuum-size= has only an indirect effect on the output shown by --disk-usage, as the latter includes
active journal files, while the vacuuming operation only operates on archived journal files. Similarly, --vacuum-files= might not actually reduce the number of journal files
to below the specified number, as it will not remove active journal files. --vacuum-size=, --vacuum-time= and --vacuum-files= may be combined in a single invocation to
enforce any combination of a size, a time and a number of files limit on the archived journal files. Specifying any of these three parameters as zero is equivalent to not
enforcing the specific limit, and is thus redundant.
I think that we should use --vacuum-time and keep journal logs for 10-15 days.
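A dry-run sketch of that suggestion (an assumption, not the actual slave_cleanup.sh change; the 14-day window is one choice within the suggested 10-15 day range):

```python
# Build the journalctl invocations the cleanup script could run instead of
# rm'ing root-owned journal files directly. Printed here as a dry run, since
# the real commands need root and a systemd host.
vacuum_cmds = [
    ["journalctl", "--rotate"],             # archive the active journal first
    ["journalctl", "--vacuum-time=14days"], # drop archives older than 14 days
]
for cmd in vacuum_cmds:
    # a real run would use subprocess.check_call(cmd) as root
    print("would run:", " ".join(cmd))
```

Because --vacuum-time= only touches archived journal files, rotating first ensures the current journal is also eligible for vacuuming.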