[oVirt Jenkins] ovirt-system-tests_he-node-ng-suite-4.2 - Build # 459 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-4.2/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-4.2/459/
Build Number: 459
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #459
[Martin Perina] Fix notifier check in master and 4.2
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 012_local_maintenance_sdk.local_maintenance
Error Message:
143
-------------------- >> begin captured logging << --------------------
root: INFO: * Waiting For System Stability...
cli: DEBUG: signal 15 was caught
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 79, in wrapper
prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-node-ng-suite-4.2/ovirt-system-tests/he-node-ng-suite-4.2/test-scenarios/012_local_maintenance_sdk.py", line 42, in local_maintenance
time.sleep(wait_value)
File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 922, in exit_handler
sys.exit(128 + signum)
'143\n-------------------- >> begin captured logging << --------------------\nroot: INFO: * Waiting For System Stability...\ncli: DEBUG: signal 15 was caught\n--------------------- >> end captured logging << ---------------------'
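The "143" error message and the "signal 15 was caught" debug line above fit the usual POSIX convention that lago's `exit_handler` applies with `sys.exit(128 + signum)`: a run terminated by signal N exits with status 128 + N. A minimal sketch of that convention (the helper name here is illustrative, not part of lago):

```python
import signal

# Illustrative helper: a process terminated by signal N conventionally
# reports exit status 128 + N, mirroring lago's sys.exit(128 + signum).
def exit_status_for_signal(signum):
    return 128 + int(signum)

# SIGTERM is signal 15 on POSIX, so a SIGTERM'd suite run reports 143,
# matching the error message captured in the reports above.
print(exit_status_for_signal(signal.SIGTERM))
```

In other words, the suite did not fail an assertion here; the job was sent SIGTERM (most likely by a watchdog or job timeout) while the test was sleeping.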
[oVirt Jenkins] ovirt-system-tests_hc-basic-suite-master - Build # 843 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/843/
Build Number: 843
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #843
[Martin Perina] Fix notifier check in master and 4.2
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 002_bootstrap.add_hosts
Error Message:
143
-------------------- >> begin captured logging << --------------------
cli: DEBUG: signal 15 was caught
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 228, in add_hosts
add_hosts_4(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 275, in add_hosts_4
testlib.assert_true_within(_host_is_up_4, timeout=15*60)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 278, in assert_true_within
assert_equals_within(func, True, timeout, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 253, in _host_is_up_4
host_obj = host_service.get()
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 38639, in get
return self._internal_get(headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 54, in wait
response = self._connection.wait(self._context)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 497, in wait
return self.__wait(context, failed_auth)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 533, in __wait
self._multi.select(1.0)
File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 922, in exit_handler
sys.exit(128 + signum)
'143\n-------------------- >> begin captured logging << --------------------\ncli: DEBUG: signal 15 was caught\n--------------------- >> end captured logging << ---------------------'
[oVirt Jenkins] ovirt-system-tests_he-node-ng-suite-master - Build # 524 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/524/
Build Number: 524
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #524
[Martin Perina] Fix notifier check in master and 4.2
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 012_local_maintenance_sdk.local_maintenance
Error Message:
143
-------------------- >> begin captured logging << --------------------
root: INFO: * Waiting For System Stability...
cli: DEBUG: signal 15 was caught
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 79, in wrapper
prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-node-ng-suite-master/ovirt-system-tests/he-node-ng-suite-master/test-scenarios/012_local_maintenance_sdk.py", line 42, in local_maintenance
time.sleep(wait_value)
File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 922, in exit_handler
sys.exit(128 + signum)
'143\n-------------------- >> begin captured logging << --------------------\nroot: INFO: * Waiting For System Stability...\ncli: DEBUG: signal 15 was caught\n--------------------- >> end captured logging << ---------------------'
[oVirt Jenkins] ovirt-system-tests_he-basic-ansible-suite-master - Build # 778 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-ma...
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-ma...
Build Number: 778
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #778
[Martin Perina] Fix notifier check in master and 4.2
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 008_restart_he_vm.clear_global_maintenance
Error Message:
could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\nhost-id=2\nscore=0\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin
-------------------- >> begin captured logging << --------------------
root: INFO: * Waiting For System Stability...
lago.ssh: DEBUG: start task:0c0547e8-8d97-46b2-8baf-324de01e613a:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:0c0547e8-8d97-46b2-8baf-324de01e613a:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running 99dca43c on lago-he-basic-ansible-suite-master-host-0: hosted-engine --set-maintenance --mode=none
lago.ssh: DEBUG: Command 99dca43c on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: start task:ed30e092-2a34-45b3-a264-33553f979604:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:ed30e092-2a34-45b3-a264-33553f979604:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running 9a68a338 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 9a68a338 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command 9a68a338 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3255 (Fri Nov 2 19:04:03 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=3255 (Fri Nov 2 19:04:03 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "9f6ed413", "local_conf_timestamp": 3255, "host-ts": 3255}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3253 (Fri Nov 2 19:04:01 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=3253 (Fri Nov 2 19:04:01 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "ae89ee36", "local_conf_timestamp": 3253, "host-ts": 3253}, "global_maintenance": false}
lago.ssh: DEBUG: start task:1804e051-e73b-494a-8813-35750dc4ad59:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:1804e051-e73b-494a-8813-35750dc4ad59:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running 9b2796b2 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 9b2796b2 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command 9b2796b2 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3255 (Fri Nov 2 19:04:03 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=3255 (Fri Nov 2 19:04:03 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "9f6ed413", "local_conf_timestamp": 3255, "host-ts": 3255}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3253 (Fri Nov 2 19:04:01 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=3253 (Fri Nov 2 19:04:01 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "ae89ee36", "local_conf_timestamp": 3253, "host-ts": 3253}, "global_maintenance": false}
lago.ssh: DEBUG: start task:9f4e78c0-cc0d-49a0-ba9b-55240f32537b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:9f4e78c0-cc0d-49a0-ba9b-55240f32537b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running a1a70400 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a1a70400 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command a1a70400 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3265 (Fri Nov 2 19:04:13 2018)\nhost-id=1\nscore=0\nvm_conf_refresh_time=3266 (Fri Nov 2 19:04:13 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "0a390014", "local_conf_timestamp": 3266, "host-ts": 3265}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3263 (Fri Nov 2 19:04:11 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=3263 (Fri Nov 2 19:04:11 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "b68d1fb1", "local_conf_timestamp": 3263, "host-ts": 3263}, "global_maintenance": false}
lago.ssh: DEBUG: start task:ea0f0ccd-77fe-4ff8-aba9-b8a99449aa74:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:ea0f0ccd-77fe-4ff8-aba9-b8a99449aa74:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running a3f05c48 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a3f05c48 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command a3f05c48 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\nhost-id=2\nscore=0\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "271d3e26", "local_conf_timestamp": 3274, "host-ts": 3273}, "global_maintenance": false}
lago.ssh: DEBUG: start task:97401d7d-20a1-460b-b997-60e93c926d9b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:97401d7d-20a1-460b-b997-60e93c926d9b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running a63bf228 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a63bf228 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command a63bf228 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\nhost-id=2\nscore=0\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "271d3e26", "local_conf_timestamp": 3274, "host-ts": 3273}, "global_maintenance": false}
lago.ssh: DEBUG: start task:643e3d97-fb3d-49ad-8f92-3caf6158aa25:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: end task:643e3d97-fb3d-49ad-8f92-3caf6158aa25:Get ssh client for lago-he-basic-ansible-suite-master-host-0:
lago.ssh: DEBUG: Running a88867b4 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a88867b4 on lago-he-basic-ansible-suite-master-host-0 returned with 0
lago.ssh: DEBUG: Command a88867b4 on lago-he-basic-ansible-suite-master-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\nhost-id=2\nscore=0\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin
ovirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7f7cd1ef4050>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 85, in <lambda>
lambda: _is_state_maintenance(host, "GlobalMaintenance") is False
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 58, in _is_state_maintenance
status = _get_he_status(host)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
RuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\nhost-id=2\nscore=0\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 85, in clear_global_maintenance
lambda: _is_state_maintenance(host, "GlobalMaintenance") is False
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 286, in assert_true_within_long
assert_equals_within_long(func, True, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 273, in assert_equals_within_long
func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 85, in <lambda>
lambda: _is_state_maintenance(host, "GlobalMaintenance") is False
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 58, in _is_state_maintenance
status = _get_he_status(host)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
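Unlike the three SIGTERM failures above, this report is a parse error: the `hosted-engine --vm-status --json` output captured in the logging block is cut off mid-stream, so the test helper raised RuntimeError and surfaced the raw text. A minimal sketch of that pattern (the function name is hypothetical; this is not the actual `_get_he_status` implementation):

```python
import json

# Sketch of the failure mode: try to parse command output as JSON and,
# when it is invalid (e.g. truncated mid-stream), raise RuntimeError
# carrying the raw text, as the stack trace above shows.
def parse_vm_status(raw):
    try:
        return json.loads(raw)
    except ValueError:  # json.JSONDecodeError subclasses ValueError
        raise RuntimeError('could not parse JSON: %s' % raw)
```

The truncated output suggests the failure is in capturing the command's stdout rather than in `hosted-engine` itself.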
'could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\\nhost-id=2\\nscore=0\\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin\n-------------------- >> begin captured logging << --------------------\nroot: INFO: * Waiting For System Stability...\nlago.ssh: DEBUG: start task:0c0547e8-8d97-46b2-8baf-324de01e613a:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:0c0547e8-8d97-46b2-8baf-324de01e613a:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running 99dca43c on lago-he-basic-ansible-suite-master-host-0: hosted-engine --set-maintenance --mode=none\nlago.ssh: DEBUG: Command 99dca43c on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: start task:ed30e092-2a34-45b3-a264-33553f979604:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:ed30e092-2a34-45b3-a264-33553f979604:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running 9a68a338 on lago-he-basic-ansible-suite-master-host-0: 
hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 9a68a338 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command 9a68a338 on lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3255 (Fri Nov 2 19:04:03 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=3255 (Fri Nov 2 19:04:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "9f6ed413", "local_conf_timestamp": 3255, "host-ts": 3255}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3253 (Fri Nov 2 19:04:01 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=3253 (Fri Nov 2 19:04:01 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "ae89ee36", "local_conf_timestamp": 3253, "host-ts": 3253}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:1804e051-e73b-494a-8813-35750dc4ad59:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:1804e051-e73b-494a-8813-35750dc4ad59:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running 9b2796b2 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 9b2796b2 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command 9b2796b2 on 
lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3255 (Fri Nov 2 19:04:03 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=3255 (Fri Nov 2 19:04:03 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "9f6ed413", "local_conf_timestamp": 3255, "host-ts": 3255}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3253 (Fri Nov 2 19:04:01 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=3253 (Fri Nov 2 19:04:01 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "ae89ee36", "local_conf_timestamp": 3253, "host-ts": 3253}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:9f4e78c0-cc0d-49a0-ba9b-55240f32537b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:9f4e78c0-cc0d-49a0-ba9b-55240f32537b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running a1a70400 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a1a70400 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command a1a70400 on lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": 
"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3265 (Fri Nov 2 19:04:13 2018)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=3266 (Fri Nov 2 19:04:13 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "0a390014", "local_conf_timestamp": 3266, "host-ts": 3265}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3263 (Fri Nov 2 19:04:11 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=3263 (Fri Nov 2 19:04:11 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "b68d1fb1", "local_conf_timestamp": 3263, "host-ts": 3263}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:ea0f0ccd-77fe-4ff8-aba9-b8a99449aa74:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:ea0f0ccd-77fe-4ff8-aba9-b8a99449aa74:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running a3f05c48 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a3f05c48 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command a3f05c48 on lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 
2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\\nhost-id=2\\nscore=0\\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "271d3e26", "local_conf_timestamp": 3274, "host-ts": 3273}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:97401d7d-20a1-460b-b997-60e93c926d9b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:97401d7d-20a1-460b-b997-60e93c926d9b:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running a63bf228 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a63bf228 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command a63bf228 on lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": 
{"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\\nhost-id=2\\nscore=0\\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "271d3e26", "local_conf_timestamp": 3274, "host-ts": 3273}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:643e3d97-fb3d-49ad-8f92-3caf6158aa25:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: end task:643e3d97-fb3d-49ad-8f92-3caf6158aa25:Get ssh client for lago-he-basic-ansible-suite-master-host-0:\nlago.ssh: DEBUG: Running a88867b4 on lago-he-basic-ansible-suite-master-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a88867b4 on lago-he-basic-ansible-suite-master-host-0 returned with 0\nlago.ssh: DEBUG: Command a88867b4 on lago-he-basic-ansible-suite-master-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": 
{"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\\nhost-id=2\\nscore=0\\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin\novirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7f7cd1ef4050>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 85, in <lambda>\n lambda: _is_state_maintenance(host, "GlobalMaintenance") is False\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 58, in _is_state_maintenance\n status = _get_he_status(host)\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-master/ovirt-system-tests/he-basic-ansible-suite-master/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status\n raise RuntimeError(\'could not parse JSON: %s\' % ret.out)\nRuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3275 (Fri Nov 2 19:04:22 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=3275 (Fri Nov 2 19:04:23 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": 
"0e550e4f", "local_conf_timestamp": 3275, "host-ts": 3275}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=3273 (Fri Nov 2 19:04:21 2018)\\nhost-id=2\\nscore=0\\nvm_conf_refresh_time=3274 (Fri Nov 2 19:04:21 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-master-host-1", "host-id": 2, "engine-status": {"reason": "vm not runnin\n--------------------- >> end captured logging << ---------------------'
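A note on the failure above: `hosted-engine --vm-status --json` returned exit code 0, but its captured output was cut off mid-stream (`"vm not runnin`), so the test's JSON parse raised. A minimal sketch of that defensive parse, with a hypothetical helper name (the real check is `_get_he_status` in `008_restart_he_vm.py`):

```python
import json

def parse_he_status(raw):
    """Parse `hosted-engine --vm-status --json` output, attaching the
    raw text to the error when it is not valid JSON (e.g. truncated
    mid-stream, as in the captured log above)."""
    try:
        return json.loads(raw)
    except ValueError:  # JSONDecodeError subclasses ValueError
        raise RuntimeError('could not parse JSON: %s' % raw)

# A well-formed payload parses normally:
ok = parse_he_status('{"global_maintenance": false}')
print(ok['global_maintenance'])  # False

# A payload truncated like the one in the log is rejected:
try:
    parse_he_status('{"1": {"engine-status": {"reason": "vm not runnin')
except RuntimeError:
    print('truncated output rejected')
```

Raising with the raw text attached is what makes the truncation visible in the Jenkins log instead of a bare decode error.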
[oVirt Jenkins] ovirt-system-tests_hc-basic-suite-4.2 - Build # 619
- Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-4.2/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-4.2/619/
Build Number: 619
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #619
[Galit Rosenthal] added hooks to poll-upstream-sources
[Barak Korren] Make STDCI V2 agnostic to wither you say 'please'
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 002_bootstrap.add_hosts
Error Message:
Failed to read response: [(<pycurl.Curl object at 0x7fcf4b8b9050>, 7, 'TCP connection reset by peer')]
-------------------- >> begin captured logging << --------------------
ovirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fcf4b85c578>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4
host_obj = host_service.get()
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 37093, in get
return self._internal_get(headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 54, in wait
response = self._connection.wait(self._context)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 496, in wait
return self.__wait(context, failed_auth)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 510, in __wait
raise Error("Failed to read response: {}".format(err_list))
Error: Failed to read response: [(<pycurl.Curl object at 0x7fcf4b8b9050>, 7, 'TCP connection reset by peer')]
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 229, in add_hosts
add_hosts_4(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 273, in add_hosts_4
testlib.assert_true_within(_host_is_up_4, timeout=15*60)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 278, in assert_true_within
assert_equals_within(func, True, timeout, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4
host_obj = host_service.get()
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 37093, in get
return self._internal_get(headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 54, in wait
response = self._connection.wait(self._context)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 496, in wait
return self.__wait(context, failed_auth)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 510, in __wait
raise Error("Failed to read response: {}".format(err_list))
'Failed to read response: [(<pycurl.Curl object at 0x7fcf4b8b9050>, 7, \'TCP connection reset by peer\')]\n-------------------- >> begin captured logging << --------------------\novirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fcf4b85c578>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4\n host_obj = host_service.get()\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 37093, in get\n return self._internal_get(headers, query, wait)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get\n return future.wait() if wait else future\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 54, in wait\n response = self._connection.wait(self._context)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 496, in wait\n return self.__wait(context, failed_auth)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 510, in __wait\n raise Error("Failed to read response: {}".format(err_list))\nError: Failed to read response: [(<pycurl.Curl object at 0x7fcf4b8b9050>, 7, \'TCP connection reset by peer\')]\n--------------------- >> end captured logging << ---------------------'
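In this failure the SDK's `Error` ("TCP connection reset by peer") escaped from inside the `_host_is_up_4` poll: `assert_equals_within` only swallows exceptions listed in `allowed_exceptions`, so a transient curl error aborts the whole test. A generic retry wrapper for such transient failures might look like this (a hypothetical helper, not part of ovirtlago; `ConnectionError` stands in for the pycurl read error):

```python
import time

def retry_transient(func, attempts=3, delay=1.0,
                    transient=(ConnectionError,)):
    """Call func(), retrying on exception types considered transient;
    the last attempt re-raises so real outages still fail loudly."""
    for attempt in range(attempts):
        try:
            return func()
        except transient:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

calls = []
def flaky_get():
    # Fails twice with a reset, then succeeds, imitating a host
    # whose API endpoint drops connections while it comes up.
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError('TCP connection reset by peer')
    return 'up'

print(retry_transient(flaky_get, delay=0))  # 'up' on the third attempt
```

The alternative used elsewhere in these suites is to add the SDK error type to `allowed_exceptions` so the existing timeout loop absorbs it.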
[oVirt Jenkins] ovirt-system-tests_performance-suite-master - Build
# 823 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_performance-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_performance-suite-master/...
Build Number: 823
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #823
[Martin Perina] Fix notifier check in master and 4.2
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 040_add_hosts_vms.add_master_storage_domain
Error Message:
Fault reason is "Operation Failed". Fault detail is "[Cannot add storage server connection when Host status is not up]". HTTP response code is 409.
Stack Trace:
Traceback (most recent call last):
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_performance-suite-master/ovirt-system-tests/performance-suite-master/test-scenarios/040_add_hosts_vms.py", line 340, in add_master_storage_domain
add_nfs_storage_domain(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_performance-suite-master/ovirt-system-tests/performance-suite-master/test-scenarios/040_add_hosts_vms.py", line 344, in add_nfs_storage_domain
add_generic_nfs_storage_domain_4(prefix, SD_NFS_NAME, SD_NFS_HOST_NAME, SD_NFS_PATH, nfs_version='v4_2')
File "/home/jenkins/workspace/ovirt-system-tests_performance-suite-master/ovirt-system-tests/performance-suite-master/test-scenarios/040_add_hosts_vms.py", line 386, in add_generic_nfs_storage_domain_4
_add_storage_domain_4(api, p)
File "/home/jenkins/workspace/ovirt-system-tests_performance-suite-master/ovirt-system-tests/performance-suite-master/test-scenarios/040_add_hosts_vms.py", line 314, in _add_storage_domain_4
sd = sds_service.add(p)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 25064, in add
return self._internal_add(storage_domain, headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 232, in _internal_add
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait
return self._code(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 229, in callback
self._check_fault(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in _check_fault
self._raise_error(response, body)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in _raise_error
raise error
Error: Fault reason is "Operation Failed". Fault detail is "[Cannot add storage server connection when Host status is not up]". HTTP response code is 409.
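The 409 above means the engine rejected the storage-server connection because the host had not reached status Up before the suite tried to add the NFS domain; the suites normally guard against this by polling with `testlib.assert_true_within`. A standalone sketch of that polling pattern, under the assumption of a fake status source (the real predicate queries `host_service.get().status`):

```python
import time

def assert_true_within(func, timeout=900, interval=2):
    """Poll func() until it returns True or the timeout expires,
    mirroring the ovirtlago testlib helper used by _host_is_up_4."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if func():
            return
        time.sleep(interval)
    raise AssertionError('condition not met within %ss' % timeout)

# Fake host that moves through install states before reaching "up":
statuses = iter(['installing', 'initializing', 'up'])
current = {'status': 'installing'}

def host_is_up():
    current['status'] = next(statuses, current['status'])
    return current['status'] == 'up'

assert_true_within(host_is_up, timeout=10, interval=0)
print(current['status'])  # 'up'
```

Gating `sds_service.add(...)` behind such a wait is what keeps the add-storage-domain call from racing host deployment.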
[oVirt Jenkins] ovirt-system-tests_he-node-ng-suite-master - Build
# 519 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/519/
Build Number: 519
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #519
[Ales Musil] master: Update master suite compatibility version
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 012_local_maintenance_sdk.local_maintenance
Error Message:
143
-------------------- >> begin captured logging << --------------------
root: INFO: * Waiting For System Stability...
root: INFO: * Performing Deactivation...
root: INFO: * Performing Activation...
root: INFO: * Waiting For System Stability...
cli: DEBUG: signal 15 was caught
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 79, in wrapper
prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-node-ng-suite-master/ovirt-system-tests/he-node-ng-suite-master/test-scenarios/012_local_maintenance_sdk.py", line 79, in local_maintenance
time.sleep(wait_value)
File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 922, in exit_handler
sys.exit(128 + signum)
'143\n-------------------- >> begin captured logging << --------------------\nroot: INFO: * Waiting For System Stability...\nroot: INFO: * Performing Deactivation...\nroot: INFO: * Performing Activation...\nroot: INFO: * Waiting For System Stability...\ncli: DEBUG: signal 15 was caught\n--------------------- >> end captured logging << ---------------------'
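The bare "143" error message in both he-node-ng failures is not a test assertion at all: lago installs a signal handler whose `exit_handler` calls `sys.exit(128 + signum)`, so the SIGTERM sent when the Jenkins job is aborted ("signal 15 was caught" in the log) surfaces as exit status 128 + 15 = 143. The convention can be checked directly (assuming a Linux host, where SIGTERM is 15):

```python
import signal

# Shell/CI convention: a process killed by signal N reports status 128 + N.
# lago's exit_handler follows it with sys.exit(128 + signum), so SIGTERM
# during time.sleep() in the test becomes exit status 143.
print(128 + int(signal.SIGTERM))  # 143
```

Seeing 143 while the test sits in `time.sleep(wait_value)` therefore points at the job timeout or an external abort, not at the maintenance logic itself.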