Build failed in Jenkins: system-sync_mirrors-centos-updates-el7-x86_64 #1752
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f487c13bea81cf075c9878a65152927ace35b3e3 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f f487c13bea81cf075c9878a65152927ace35b3e3
Commit message: "ioprocess: Enable standard CI v2"
> git rev-list --no-walk f487c13bea81cf075c9878a65152927ace35b3e3 # timeout=10
[system-sync_mirrors-centos-updates-el7-x86_64] $ /bin/bash -xe /tmp/jenkins1363124906769883359.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror centos-updates-el7 x86_64 jenkins/data/mirrors-reposync.conf
Checking if mirror needs a resync
Traceback (most recent call last):
File "/usr/bin/reposync", line 343, in <module>
main()
File "/usr/bin/reposync", line 175, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1474, in _commonLoadRepoXML
if self._latestRepoXML(local):
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1443, in _latestRepoXML
oxml = self._saveOldRepoXML(local)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1300, in _saveOldRepoXML
shutil.copy2(local, old_local)
File "/usr/lib64/python2.7/shutil.py", line 131, in copy2
copystat(src, dst)
File "/usr/lib64/python2.7/shutil.py", line 98, in copystat
os.utime(dst, (st.st_atime, st.st_mtime))
OSError: [Errno 2] No such file or directory: '/home/jenkins/mirrors_cache/centos-extras-el7/repomd.xml.old.tmp'
Build step 'Execute shell' marked build as failure
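Diagnostic note: the traceback ends in shutil.copystat() on repomd.xml.old.tmp, i.e. the file vanished between the copy and the stat inside yum's _saveOldRepoXML. That is the usual symptom of two reposync runs sharing one metadata cache (here /home/jenkins/mirrors_cache; note the failing repo is centos-extras-el7 while this job syncs centos-updates-el7, so a sibling mirror job is the likely second writer). A minimal sketch of serializing the cache, assuming the jobs can agree on a lock file next to it (the lock path and wrapper are illustrative, not part of mirror_mgr.sh):

    import fcntl
    import subprocess

    LOCK_PATH = '/home/jenkins/mirrors_cache/.reposync.lock'  # illustrative

    def run_reposync_locked(reposync_args):
        # Hold an exclusive lock for the whole run so concurrent
        # timer-triggered mirror jobs cannot race on repomd.xml.old.tmp.
        with open(LOCK_PATH, 'w') as lock:
            fcntl.flock(lock, fcntl.LOCK_EX)  # blocks until the peer finishes
            return subprocess.call(['reposync'] + list(reposync_args))

The lock is released automatically when the file is closed at the end of the with block.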
[oVirt Jenkins] ovirt-system-tests_he-basic-ansible-suite-4.2 - Build # 374 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-4.2/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-4....
Build Number: 374
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #374
[Gal Ben Haim] Adding 1GB of ram to hosts in HE suites
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 010_local_mainentance.local_maintenance
Error Message:
could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason
-------------------- >> begin captured logging << --------------------
lago.ssh: DEBUG: start task:20e58b83-75f8-4b3d-8f26-1c5b014b4eb7:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:20e58b83-75f8-4b3d-8f26-1c5b014b4eb7:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2383 (Sat Jul 14 04:04:53 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2383 (Sat Jul 14 04:04:54 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "f2c58547", "local_conf_timestamp": 2383, "host-ts": 2383}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2385 (Sat Jul 14 04:04:56 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2385 (Sat Jul 14 04:04:56 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "920e0845", "local_conf_timestamp": 2385, "host-ts": 2385}, "global_maintenance": false}
lago.ssh: DEBUG: start task:82905888-0e48-4556-aa63-8bf44d0ab81a:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:82905888-0e48-4556-aa63-8bf44d0ab81a:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 9dd5f336 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --set-maintenance --mode=local
lago.ssh: DEBUG: Command 9dd5f336 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
root: INFO: * Waiting for engine to migrate...
lago.ssh: DEBUG: start task:8d6f88dd-96c6-407e-98a2-f970dd23dbde:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:8d6f88dd-96c6-407e-98a2-f970dd23dbde:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2393 (Sat Jul 14 04:05:04 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2393 (Sat Jul 14 04:05:04 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineUp\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "a8c95a47", "local_conf_timestamp": 2393, "host-ts": 2393}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2385 (Sat Jul 14 04:04:56 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2385 (Sat Jul 14 04:04:56 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "920e0845", "local_conf_timestamp": 2385, "host-ts": 2385}, "global_maintenance": false}
lago.ssh: DEBUG: start task:6c85f2c8-fdb8-4ba0-844a-f0f3cea8338d:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:6c85f2c8-fdb8-4ba0-844a-f0f3cea8338d:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running a5189694 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a5189694 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command a5189694 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2403 (Sat Jul 14 04:05:13 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2403 (Sat Jul 14 04:05:14 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "eda598e3", "local_conf_timestamp": 2403, "host-ts": 2403}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2395 (Sat Jul 14 04:05:06 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2396 (Sat Jul 14 04:05:07 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "cd8c1d29", "local_conf_timestamp": 2396, "host-ts": 2395}, "global_maintenance": false}
lago.ssh: DEBUG: start task:6fef0ac9-ea00-409b-a8e3-53e4a88f8fb9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:6fef0ac9-ea00-409b-a8e3-53e4a88f8fb9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running a77272c0 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a77272c0 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command a77272c0 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2403 (Sat Jul 14 04:05:13 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2403 (Sat Jul 14 04:05:14 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "eda598e3", "local_conf_timestamp": 2403, "host-ts": 2403}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2395 (Sat Jul 14 04:05:06 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2396 (Sat Jul 14 04:05:07 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "cd8c1d29", "local_conf_timestamp": 2396, "host-ts": 2395}, "global_maintenance": false}
lago.ssh: DEBUG: start task:49db91cd-c792-4946-a91c-2fc015efcdbd:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:49db91cd-c792-4946-a91c-2fc015efcdbd:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "94066a85", "local_conf_timestamp": 2406, "host-ts": 2406}, "global_maintenance": false}
lago.ssh: DEBUG: start task:ac48cebc-2a4e-4580-9aee-13bf372d88f5:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:ac48cebc-2a4e-4580-9aee-13bf372d88f5:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason
ovirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7f1ba9da3f50>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 35, in <lambda>
testlib.assert_true_within_long(lambda: _get_he_status(host)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 94, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
RuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineMigratingAway\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 118, in local_maintenance
_wait_for_engine_migration(host, he_index, "bad", "Migration Destination")
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 35, in _wait_for_engine_migration
testlib.assert_true_within_long(lambda: _get_he_status(host)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 271, in assert_true_within_long
assert_equals_within_long(func, True, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 258, in assert_equals_within_long
func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 35, in <lambda>
testlib.assert_true_within_long(lambda: _get_he_status(host)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 94, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
'could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason\n-------------------- >> begin captured logging << --------------------\nlago.ssh: DEBUG: start task:20e58b83-75f8-4b3d-8f26-1c5b014b4eb7:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:20e58b83-75f8-4b3d-8f26-1c5b014b4eb7:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 9d5832ca on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2383 (Sat Jul 14 04:04:53 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2383 (Sat Jul 14 04:04:54 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "f2c58547", "local_conf_timestamp": 2383, "host-ts": 2383}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2385 (Sat Jul 14 04:04:56 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2385 (Sat Jul 14 04:04:56 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "920e0845", "local_conf_timestamp": 2385, "host-ts": 2385}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:82905888-0e48-4556-aa63-8bf44d0ab81a:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:82905888-0e48-4556-aa63-8bf44d0ab81a:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 9dd5f336 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --set-maintenance --mode=local\nlago.ssh: DEBUG: Command 9dd5f336 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nroot: INFO: * Waiting for engine to migrate...\nlago.ssh: DEBUG: start task:8d6f88dd-96c6-407e-98a2-f970dd23dbde:Get ssh client for 
lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:8d6f88dd-96c6-407e-98a2-f970dd23dbde:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 9e567c18 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2393 (Sat Jul 14 04:05:04 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2393 (Sat Jul 14 04:05:04 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUp\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "a8c95a47", "local_conf_timestamp": 2393, "host-ts": 2393}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2385 (Sat Jul 14 04:04:56 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2385 (Sat Jul 14 04:04:56 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "920e0845", "local_conf_timestamp": 2385, "host-ts": 2385}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:6c85f2c8-fdb8-4ba0-844a-f0f3cea8338d:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:6c85f2c8-fdb8-4ba0-844a-f0f3cea8338d:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running a5189694 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a5189694 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command a5189694 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2403 (Sat Jul 14 04:05:13 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2403 (Sat Jul 14 04:05:14 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "eda598e3", "local_conf_timestamp": 2403, "host-ts": 2403}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2395 (Sat Jul 14 04:05:06 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2396 (Sat Jul 14 04:05:07 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "cd8c1d29", "local_conf_timestamp": 2396, "host-ts": 2395}, "global_maintenance": 
false}\n\nlago.ssh: DEBUG: start task:6fef0ac9-ea00-409b-a8e3-53e4a88f8fb9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:6fef0ac9-ea00-409b-a8e3-53e4a88f8fb9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running a77272c0 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a77272c0 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command a77272c0 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2403 (Sat Jul 14 04:05:13 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2403 (Sat Jul 14 04:05:14 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "eda598e3", "local_conf_timestamp": 2403, "host-ts": 2403}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2395 (Sat Jul 14 04:05:06 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2396 (Sat Jul 14 04:05:07 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "cd8c1d29", "local_conf_timestamp": 2396, "host-ts": 2395}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:49db91cd-c792-4946-a91c-2fc015efcdbd:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:49db91cd-c792-4946-a91c-2fc015efcdbd:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command a9bd1d8c on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down_unexpected", "detail": "unknown"}, "score": 3400, "stopped": false, 
"maintenance": false, "crc32": "94066a85", "local_conf_timestamp": 2406, "host-ts": 2406}, "global_maintenance": false}\n\nlago.ssh: DEBUG: start task:ac48cebc-2a4e-4580-9aee-13bf372d88f5:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:ac48cebc-2a4e-4580-9aee-13bf372d88f5:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command ac15ac20 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason\novirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7f1ba9da3f50>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 35, in <lambda>\n testlib.assert_true_within_long(lambda: _get_he_status(host)\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/010_local_mainentance.py", line 94, in _get_he_status\n raise RuntimeError(\'could not parse JSON: %s\' % ret.out)\nRuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2413 (Sat Jul 14 04:05:24 2018)\\nhost-id=1\\nscore=3000\\nvm_conf_refresh_time=2413 (Sat Jul 14 04:05:24 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineMigratingAway\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Migration Source"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "fbe19d82", "local_conf_timestamp": 2413, "host-ts": 2413}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2406 (Sat Jul 14 04:05:17 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2406 (Sat Jul 14 04:05:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineDown\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason\n--------------------- 
>> end captured logging << ---------------------'
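Diagnostic note: the status blob in the error is cut off mid-object ("engine-status": {"reason), so json.loads() inside _get_he_status raises and the test turns that into the RuntimeError above, even though the migration itself was still progressing (state=EngineMigratingAway, score 3000). A hedged sketch of tolerating a transiently unparsable status instead of failing on the first bad read (this retry wrapper is an assumption, not the suite's current code):

    import json
    import time

    def get_he_status(run_vm_status, attempts=5, delay=3):
        # run_vm_status() returns the raw stdout of
        # 'hosted-engine --vm-status --json'.
        for i in range(attempts):
            out = run_vm_status()
            try:
                return json.loads(out)
            except ValueError:  # truncated or mid-rewrite output
                if i == attempts - 1:
                    raise RuntimeError('could not parse JSON: %s' % out)
                time.sleep(delay)

Since the HA agent rewrites its metadata every few seconds, a re-read a moment later usually returns a complete document.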
[oVirt Jenkins] ovirt-system-tests_hc-basic-suite-4.2 - Build # 350 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-4.2/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-4.2/350/
Build Number: 350
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #350
[Gal Ben Haim] Adding 1GB of ram to hosts in HE suites
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 002_bootstrap.add_hosts
Error Message:
The response content type 'text/html; charset=iso-8859-1' isn't the expected XML
-------------------- >> begin captured logging << --------------------
ovirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fec792725f0>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4
host_obj = host_service.get()
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 36846, in get
return self._internal_get(headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait
return self._code(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 208, in callback
self._check_fault(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 130, in _check_fault
body = self._internal_read_body(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 316, in _internal_read_body
self._connection.check_xml_content_type(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 849, in check_xml_content_type
response.headers
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 889, in _check_content_type
raise Error(msg)
Error: The response content type 'text/html; charset=iso-8859-1' isn't the expected XML
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 229, in add_hosts
add_hosts_4(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 273, in add_hosts_4
testlib.assert_true_within(_host_is_up_4, timeout=15*60)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 263, in assert_true_within
assert_equals_within(func, True, timeout, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4
host_obj = host_service.get()
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 36846, in get
return self._internal_get(headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait
return self._code(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 208, in callback
self._check_fault(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 130, in _check_fault
body = self._internal_read_body(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 316, in _internal_read_body
self._connection.check_xml_content_type(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 849, in check_xml_content_type
response.headers
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 889, in _check_content_type
raise Error(msg)
'The response content type \'text/html; charset=iso-8859-1\' isn\'t the expected XML\n-------------------- >> begin captured logging << --------------------\novirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fec792725f0>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-4.2/ovirt-system-tests/hc-basic-suite-4.2/test-scenarios/002_bootstrap.py", line 254, in _host_is_up_4\n host_obj = host_service.get()\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 36846, in get\n return self._internal_get(headers, query, wait)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, in _internal_get\n return future.wait() if wait else future\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait\n return self._code(response)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 208, in callback\n self._check_fault(response)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 130, in _check_fault\n body = self._internal_read_body(response)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 316, in _internal_read_body\n self._connection.check_xml_content_type(response)\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 849, in check_xml_content_type\n response.headers\n File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 889, in _check_content_type\n raise Error(msg)\nError: The response content type \'text/html; charset=iso-8859-1\' isn\'t the expected XML\n--------------------- >> end captured logging << ---------------------'
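Diagnostic note: a 'text/html; charset=iso-8859-1' body where XML was expected is typically an Apache error page (engine restarting, or the proxy returning a 50x) rather than a real API answer. The traceback shows the poll helper already threads an allowed_exceptions list into assert_equals_within; a sketch of using it so a transient HTML answer does not abort the 15-minute wait (assuming assert_true_within forwards the keyword the way the trace suggests):

    import ovirtsdk4

    # Sketch: let the poll loop swallow transient SDK errors such as
    # an HTML error page, retrying until the timeout instead of failing.
    testlib.assert_true_within(
        _host_is_up_4,
        timeout=15 * 60,
        allowed_exceptions=[ovirtsdk4.Error],
    )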
[oVirt Jenkins] ovirt-system-tests_he-node-ng-suite-master - Build # 164 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-node-ng-suite-master/164/
Build Number: 164
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #164
[Marcin Mirecki] network: Add ping check for networks connected via multiple routers
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 004_basic_sanity.vm_run
Error Message:
status: 409
reason: Conflict
detail: Cannot run VM. There is no host that satisfies current scheduling constraints. See below for details:, The host lago-he-node-ng-suite-master-host-0 did not satisfy internal filter Memory because its available memory is too low (656 MB) to run the VM.
Stack Trace:
Traceback (most recent call last):
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-node-ng-suite-master/ovirt-system-tests/he-node-ng-suite-master/test-scenarios/004_basic_sanity.py", line 265, in vm_run
api.vms.get(VM0_NAME).start(start_params)
File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/brokers.py", line 31193, in start
headers={"Correlation-Id":correlation_id}
File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/proxy.py", line 122, in request
persistent_auth=self.__persistent_auth
File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/connectionspool.py", line 79, in do_request
persistent_auth)
File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/connectionspool.py", line 162, in __do_request
raise errors.RequestError(response_code, response_reason, response_body)
RequestError:
status: 409
reason: Conflict
detail: Cannot run VM. There is no host that satisfies current scheduling constraints. See below for details:, The host lago-he-node-ng-suite-master-host-0 did not satisfy internal filter Memory because its available memory is too low (656 MB) to run the VM.
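Diagnostic note: the 409 is the scheduler's Memory filter doing its job: host-0 had only 656 MB available, less than the VM's guaranteed memory. This is exactly what the change in this build ("Adding 1GB of ram to hosts in HE suites") addresses for the HE suites. A hedged pre-check sketch against the v3 SDK used in the traceback (statistic names follow the v3 API; the helper itself is illustrative):

    def host_free_memory_mb(api, host_name):
        # Read the 'memory.free' host statistic via the v3 SDK.
        host = api.hosts.get(host_name)
        for stat in host.statistics.list():
            if stat.get_name() == 'memory.free':
                datum = stat.get_values().get_value()[0].get_datum()
                return int(datum) // (1024 * 1024)
        return None

Failing fast with the measured number gives a clearer signal than the scheduler's generic Conflict message.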
Build failed in Jenkins: deploy-to_ovirt-4.1_tested #1487
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/deploy-to_ovirt-4.1_tested/1487/display/redi...>
------------------------------------------
Started by upstream project "ovirt-release_4.1_build-artifacts-el7-x86_64" build number 644
originally caused by:
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on vm0156.workers-phx.ovirt.org (phx fc28 nested) in workspace <http://jenkins.ovirt.org/job/deploy-to_ovirt-4.1_tested/ws/>
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-UNdjSQSz4DPu/agent.29258
SSH_AGENT_PID=29262
[ssh-agent] Started.
$ ssh-add <http://jenkins.ovirt.org/job/deploy-to_ovirt-4.1_tested/ws/@tmp/private_k...>
Identity added: <http://jenkins.ovirt.org/job/deploy-to_ovirt-4.1_tested/ws/@tmp/private_k...> (<http://jenkins.ovirt.org/job/deploy-to_ovirt-4.1_tested/ws/@tmp/private_k...>)
[ssh-agent] Using credentials deploy-ovirt-experimental (SSH key for deploying to the tested repo)
[deploy-to_ovirt-4.1_tested] $ /bin/bash -xe /tmp/jenkins889326999217138486.sh
+ [[ http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64... == '' ]]
+ queue_name=ovirt-4.1
+ echo repo-extra-dir:4.1
+ echo http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
+ ssh -o StrictHostKeyChecking=no deploy-ovirt-experimental@resources.ovirt.org
Pseudo-terminal will not be allocated because stdin is not a terminal.
+ BASE_DIR=/srv/resources/repos/ovirt/tested
+ PUBLISH_MD_COPIES=50
+ main
+ local tmp_dir
+ mkdir -p /srv/resources/repos/ovirt/tested
++ mktemp -d /srv/resources/repos/ovirt/tested/.deploy.XXXXXXXXXX
Collecting packages
+ tmp_dir=/srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
+ trap 'rm -rf '\''/srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8'\''' EXIT HUP
+ echo 'Collecting packages'
+ collect_packages /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
+ local repoman_dst=/srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
+ repoman --temp-dir generate-in-repo --option main.allowed_repo_paths=/srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8 --option main.on_empty_source=warn /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8 add conf:stdin
2018-07-12 03:00:07,369::INFO::repoman.cmd::
2018-07-12 03:00:07,400::INFO::repoman.cmd::Adding artifacts to the repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
2018-07-12 03:00:07,400::INFO::repoman.common.repo::Adding repo extra dir 4.1
2018-07-12 03:00:07,405::INFO::repoman.common.stores.RPM::Loading repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:07,405::INFO::repoman.common.stores.RPM::Repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1 loaded
2018-07-12 03:00:07,407::INFO::repoman.common.stores.iso::Loading repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:07,408::INFO::repoman.common.stores.iso::Repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1 loaded
2018-07-12 03:00:07,424::INFO::repoman.common.repo::Resolving artifact source http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,016::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,019::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,019::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,020::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,020::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,021::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,021::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64...
2018-07-12 03:00:08,023::INFO::root:: Done
2018-07-12 03:00:08,075::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 9K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[=====================================================================================================]
2018-07-12 03:00:08,771::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-node-ng-image-update-placeholder-4.1.9-1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:08,803::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 329K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-12 03:00:09,051::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-release-host-node-4.1.9-1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,080::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 15K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-12 03:00:09,206::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-release41-4.1.9-1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,257::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 405K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-12 03:00:09,482::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-release41-4.1.9-1.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,514::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 15K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-12 03:00:09,699::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-release41-pre-4.1.9-1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,728::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-release_4.1_build-artifacts-el7-x86_64..., length 13K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-12 03:00:09,855::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J/ovirt-release41-snapshot-4.1.9-1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,856::INFO::repoman.cmd::
2018-07-12 03:00:09,856::INFO::repoman.common.stores.RPM::Saving new added rpms into /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:09,857::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/noarch/ovirt-node-ng-image-update-placeholder-4.1.9-1.el7.noarch.rpm
2018-07-12 03:00:09,863::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/noarch/ovirt-release-host-node-4.1.9-1.el7.noarch.rpm
2018-07-12 03:00:09,863::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/noarch/ovirt-release41-4.1.9-1.el7.noarch.rpm
2018-07-12 03:00:09,864::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/SRPMS/ovirt-release41-4.1.9-1.el7.src.rpm
2018-07-12 03:00:09,864::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/noarch/ovirt-release41-pre-4.1.9-1.el7.noarch.rpm
2018-07-12 03:00:09,865::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1/rpm/el7/noarch/ovirt-release41-snapshot-4.1.9-1.el7.noarch.rpm
2018-07-12 03:00:09,866::INFO::repoman.common.stores.RPM::
2018-07-12 03:00:09,866::INFO::repoman.common.stores.RPM::Updating metadata
2018-07-12 03:00:09,866::INFO::repoman.common.stores.RPM:: Creating metadata for el7
2018-07-12 03:00:11,986::INFO::repoman.common.stores.RPM::
2018-07-12 03:00:11,987::INFO::repoman.common.stores.RPM::Creating symlinks
2018-07-12 03:00:11,988::INFO::repoman.common.stores.RPM::
2018-07-12 03:00:11,988::INFO::repoman.common.stores.RPM::Saved /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:11,989::INFO::repoman.common.stores.iso::Saving new added isos into /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:11,989::INFO::repoman.common.stores.iso::
2018-07-12 03:00:11,989::INFO::repoman.common.stores.iso::Saved /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/4.1
2018-07-12 03:00:11,993::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF/tmpaTzE6J
2018-07-12 03:00:11,996::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8/.lago_tmp/tmpLN8WcF
Publishing to repo
+ echo 'Publishing to repo'
+ push_to_tested /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8 /srv/resources/repos/ovirt/tested
+ local pkg_src=/srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
+ local pkg_dst=/srv/resources/repos/ovirt/tested
+ cd /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
+ find . -type d '!' -name repodata
+ tac
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/noarch
+ find ./4.1/rpm/el7/noarch -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/noarch
+ [[ -d ./4.1/rpm/el7/noarch/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS
+ find ./4.1/rpm/el7/SRPMS -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS
+ [[ -d ./4.1/rpm/el7/SRPMS/repodata ]]
+ xargs -P 8 -d ' '
++ date +%Y-%m-%d-%H:%M
+ comm -23 /dev/fd/63 /dev/fd/62
++ sort
++ find /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS -name '*.rpm' -type f -mtime +14
++ tr / _
++ echo /srv/resources/repos/ovirt/tested
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS
++ sort
+ createrepo_c --update --retain-old-md 50 --workers 8 /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS
Directory walk started
Directory walk done - 410 packages
Loaded information about 410 packages
Temporary output repo path: /srv/resources/repos/ovirt/tested/./4.1/rpm/el7/SRPMS/.repodata/
Preparing sqlite DBs
Pool started (with 8 workers)
Pool finished
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./4.1/rpm/el7
+ find ./4.1/rpm/el7 -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./4.1/rpm/el7
+ [[ -d ./4.1/rpm/el7/repodata ]]
+ xargs -P 8 -d ' '
+ comm -23 /dev/fd/63 /dev/fd/62
++ date +%Y-%m-%d-%H:%M
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./4.1/rpm/el7
++ sort
++ sort
++ find /srv/resources/repos/ovirt/tested/./4.1/rpm/el7 -name '*.rpm' -type f -mtime +14
++ echo /srv/resources/repos/ovirt/tested
++ tr / _
xargs: argument line too long
+ rm -rf /srv/resources/repos/ovirt/tested/.deploy.Ap4BPkt4b8
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 29262 killed;
[ssh-agent] Stopped.
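Diagnostic note: the step that actually failed appears to be the cleanup pipeline near the end. comm -23 emits one path per line, but 'xargs -P 8 -d ' '' treats only the space character as a delimiter, and since the paths contain no spaces, xargs sees the whole multi-line listing as a single oversized argument, hence 'xargs: argument line too long' (the same pipeline passed earlier for the much smaller SRPMS directory). Splitting on newlines (xargs -d '\n', or the NUL-delimited find/xargs -0 pattern the same script already uses for the copy step) lets xargs batch the calls under ARG_MAX. The same batching in a hedged Python sketch (names illustrative; the command consuming the list is not shown in the log):

    import subprocess

    def remove_in_batches(listing, batch=500):
        # listing: newline-separated paths, as comm -23 emits them.
        # Exec in fixed-size batches so no single call exceeds ARG_MAX.
        paths = [p for p in listing.splitlines() if p]
        for i in range(0, len(paths), batch):
            subprocess.check_call(['rm', '-f'] + paths[i:i + batch])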
[oVirt Jenkins] ovirt-system-tests_hc-basic-suite-master - Build # 568 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_hc-basic-suite-master/568/
Build Number: 568
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #568
[Gal Ben Haim] Adding 1GB of ram to hosts in HE suites
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 002_bootstrap.add_hosts
Error Message:
Host lago-hc-basic-suite-master-host-1 is in non responsive state
-------------------- >> begin captured logging << --------------------
ovirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fb5836c4c08>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 262, in _host_is_up_4
raise RuntimeError('Host %s is in non responsive state' % api_host.name)
RuntimeError: Host lago-hc-basic-suite-master-host-1 is in non responsive state
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 228, in add_hosts
add_hosts_4(prefix)
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 275, in add_hosts_4
testlib.assert_true_within(_host_is_up_4, timeout=15*60)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 263, in assert_true_within
assert_equals_within(func, True, timeout, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 262, in _host_is_up_4
raise RuntimeError('Host %s is in non responsive state' % api_host.name)
'Host lago-hc-basic-suite-master-host-1 is in non responsive state\n-------------------- >> begin captured logging << --------------------\novirtlago.testlib: ERROR: * Unhandled exception in <function _host_is_up_4 at 0x7fb5836c4c08>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 219, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_hc-basic-suite-master/ovirt-system-tests/hc-basic-suite-master/test-scenarios/002_bootstrap.py", line 262, in _host_is_up_4\n raise RuntimeError(\'Host %s is in non responsive state\' % api_host.name)\nRuntimeError: Host lago-hc-basic-suite-master-host-1 is in non responsive state\n--------------------- >> end captured logging << ---------------------'
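Diagnostic note: here _host_is_up_4 raises as soon as the host reports non-responsive, which aborts the poll immediately even though hosts routinely pass through that state while vdsm restarts during deployment. A sketch of treating it as "not up yet" within the same 15-minute window (names mirror the test's helper; the relaxed handling is an assumption, not the suite's current behavior):

    import ovirtsdk4.types as types

    def _host_is_up(host_service):
        host = host_service.get()
        if host.status == types.HostStatus.UP:
            return True
        if host.status == types.HostStatus.INSTALL_FAILED:
            raise RuntimeError('Host %s failed to install' % host.name)
        # NON_RESPONSIVE and other transient states: keep polling
        return False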