Build failed in Jenkins: deploy-to_ovirt-master_tested #5186
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/5186/display/r...>
------------------------------------------
Started by upstream project "ovirt-node-ng-image_master_build-artifacts-el7-x86_64" build number 75
originally caused by:
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on vm0061.workers-phx.ovirt.org (libvirt phx fc28 nested) in workspace <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/>
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-uXUKavRadFgh/agent.781
SSH_AGENT_PID=783
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/privat...> (<http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/privat...)>
[ssh-agent] Using credentials deploy-ovirt-experimental (SSH key for deploying to the tested repo)
[deploy-to_ovirt-master_tested] $ /bin/bash -xe /tmp/jenkins3376543587781164570.sh
+ [[ http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e... == '' ]]
+ queue_name=ovirt-master
+ echo repo-extra-dir:master
+ ssh -o StrictHostKeyChecking=no deploy-ovirt-experimental@resources.ovirt.org
+ echo http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
Pseudo-terminal will not be allocated because stdin is not a terminal.
+ BASE_DIR=/srv/resources/repos/ovirt/tested
+ PUBLISH_MD_COPIES=50
+ main
+ local tmp_dir
+ mkdir -p /srv/resources/repos/ovirt/tested
++ mktemp -d /srv/resources/repos/ovirt/tested/.deploy.XXXXXXXXXX
Collecting packages
+ tmp_dir=/srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
+ trap 'rm -rf '\''/srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As'\''' EXIT HUP
+ echo 'Collecting packages'
+ collect_packages /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
+ local repoman_dst=/srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
+ repoman --temp-dir generate-in-repo --option main.allowed_repo_paths=/srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As --option main.on_empty_source=warn /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As add conf:stdin
2018-10-06 08:03:55,983::INFO::repoman.cmd::
2018-10-06 08:03:55,984::INFO::repoman.cmd::Adding artifacts to the repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
2018-10-06 08:03:55,984::INFO::repoman.common.repo::Adding repo extra dir master
2018-10-06 08:03:55,989::INFO::repoman.common.stores.RPM::Loading repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:03:55,989::INFO::repoman.common.stores.RPM::Repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master loaded
2018-10-06 08:03:55,993::INFO::repoman.common.stores.iso::Loading repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:03:55,994::INFO::repoman.common.stores.iso::Repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master loaded
2018-10-06 08:03:56,011::INFO::repoman.common.repo::Resolving artifact source http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
2018-10-06 08:03:56,223::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
2018-10-06 08:03:56,228::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
2018-10-06 08:03:56,229::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
2018-10-06 08:03:56,230::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e...
2018-10-06 08:03:56,232::INFO::root:: Done
2018-10-06 08:03:56,260::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e..., length 646M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-06 08:04:15,945::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/.lago_tmp/tmpb3GMr1/tmp4CVgm0/ovirt-node-ng-image-update-4.3.0-0.1.master.20181006000054.git13297cd.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:04:16,003::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e..., length 646M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-06 08:04:33,978::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/.lago_tmp/tmpb3GMr1/tmp4CVgm0/ovirt-node-ng-image-update-4.3.0-0.1.master.20181006000054.git13297cd.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:04:34,009::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-node-ng-image_master_build-artifacts-e..., length 1206M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-06 08:05:05,459::INFO::repoman.common.stores.iso::Adding iso /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/.lago_tmp/tmpb3GMr1/tmp4CVgm0/ovirt-node-ng-installer-master-el7-2018100607.iso to repo /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:05:05,460::INFO::repoman.cmd::
2018-10-06 08:05:05,461::INFO::repoman.common.stores.RPM::Saving new added rpms into /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:05:05,461::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master/rpm/el7/noarch/ovirt-node-ng-image-update-4.3.0-0.1.master.20181006000054.git13297cd.el7.noarch.rpm
2018-10-06 08:05:05,463::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master/rpm/el7/SRPMS/ovirt-node-ng-image-update-4.3.0-0.1.master.20181006000054.git13297cd.el7.src.rpm
2018-10-06 08:05:05,464::INFO::repoman.common.stores.RPM::
2018-10-06 08:05:05,464::INFO::repoman.common.stores.RPM::Updating metadata
2018-10-06 08:05:05,464::INFO::repoman.common.stores.RPM:: Creating metadata for el7
2018-10-06 08:05:22,398::INFO::repoman.common.stores.RPM::
2018-10-06 08:05:22,400::INFO::repoman.common.stores.RPM::Creating symlinks
2018-10-06 08:05:22,402::INFO::repoman.common.stores.RPM::
2018-10-06 08:05:22,402::INFO::repoman.common.stores.RPM::Saved /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:05:22,402::INFO::repoman.common.stores.iso::Saving new added isos into /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:05:22,404::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master/iso/ovirt-node-ng-installer-master-e/7-2018100607/ovirt-node-ng-installer-master-e-7-2018100607.iso
2018-10-06 08:05:22,432::INFO::repoman.common.stores.iso::
2018-10-06 08:05:22,432::INFO::repoman.common.stores.iso::Saved /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/master
2018-10-06 08:05:22,437::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/.lago_tmp/tmpb3GMr1/tmp4CVgm0
2018-10-06 08:05:22,438::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As/.lago_tmp/tmpb3GMr1
Publishing to repo
+ echo 'Publishing to repo'
+ push_to_tested /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As /srv/resources/repos/ovirt/tested
+ local pkg_src=/srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
+ local pkg_dst=/srv/resources/repos/ovirt/tested
+ cd /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
+ find . -type d '!' -name repodata
+ tac
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./.lago_tmp
+ find ./.lago_tmp -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./.lago_tmp
+ [[ -d ./.lago_tmp/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/iso/ovirt-node-ng-installer-master-e/7-2018100607
+ find ./master/iso/ovirt-node-ng-installer-master-e/7-2018100607 -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/iso/ovirt-node-ng-installer-master-e/7-2018100607
+ [[ -d ./master/iso/ovirt-node-ng-installer-master-e/7-2018100607/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/iso/ovirt-node-ng-installer-master-e
+ find ./master/iso/ovirt-node-ng-installer-master-e -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/iso/ovirt-node-ng-installer-master-e
+ [[ -d ./master/iso/ovirt-node-ng-installer-master-e/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/iso
+ find ./master/iso -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/iso
+ [[ -d ./master/iso/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ find ./master/rpm/el7/noarch -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ [[ -d ./master/rpm/el7/noarch/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ find ./master/rpm/el7/SRPMS -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ [[ -d ./master/rpm/el7/SRPMS/repodata ]]
+ xargs -L 1 -P 8 -r rm -f
+ comm -23 /dev/fd/63 /dev/fd/62
++ sort
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
++ sort
++ find /srv/resources/repos/ovirt/tested/master/rpm/el7/SRPMS -name '*.rpm' -type f -mtime +14
+ createrepo_c --update --retain-old-md 50 --workers 8 /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
Directory walk started
Directory walk done - 185 packages
Loaded information about 184 packages
Temporary output repo path: /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS/.repodata/
Preparing sqlite DBs
Pool started (with 8 workers)
Pool finished
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7
+ find ./master/rpm/el7 -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7
+ [[ -d ./master/rpm/el7/repodata ]]
+ xargs -L 1 -P 8 -r rm -f
+ comm -23 /dev/fd/63 /dev/fd/62
++ sort
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./master/rpm/el7
++ find /srv/resources/repos/ovirt/tested/master/rpm/el7 -name '*.rpm' -type f -mtime +14
++ sort
+ createrepo_c --update --retain-old-md 50 --workers 8 /srv/resources/repos/ovirt/tested/./master/rpm/el7
Temporary repodata directory /srv/resources/repos/ovirt/tested/./master/rpm/el7/.repodata/ already exists! (Another createrepo process is running?)
+ rm -rf /srv/resources/repos/ovirt/tested/.deploy.YEMrLGq5As
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 783 killed;
[ssh-agent] Stopped.
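The build failed at the final createrepo_c call: a leftover .repodata/ directory under master/rpm/el7 makes createrepo_c assume another process is still generating metadata, which usually means an earlier run was interrupted or a concurrent deploy is in flight. A minimal sketch of a pre-check that could clear such a stale lock directory before retrying, written in Python for illustration (the one-hour age threshold and the hard-coded repo path are assumptions, not part of the deploy script):

    # Hypothetical guard: remove a .repodata/ left behind by an interrupted
    # createrepo_c run, but only if it looks abandoned (older than max_age).
    import os
    import shutil
    import time

    def clear_stale_repodata(repo_dir, max_age=3600):
        lock_dir = os.path.join(repo_dir, '.repodata')
        if os.path.isdir(lock_dir) and time.time() - os.path.getmtime(lock_dir) > max_age:
            shutil.rmtree(lock_dir)

    clear_stale_repodata('/srv/resources/repos/ovirt/tested/master/rpm/el7')

If the directory is fresh, the safer interpretation is that another deploy job really is running and the retry should simply wait.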
Build failed in Jenkins: system-sync_mirrors-epel-el6-x86_64 #1929
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-epel-el6-x86_64/1929/dis...>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-epel-el6-x86_64/ws/>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 10a52f406c12d470350ec407f75fe5f4ff71650e (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 10a52f406c12d470350ec407f75fe5f4ff71650e
Commit message: "add ovirt-web-ui to stdci v2"
> git rev-list --no-walk 10a52f406c12d470350ec407f75fe5f4ff71650e # timeout=10
[system-sync_mirrors-epel-el6-x86_64] $ /bin/bash -xe /tmp/jenkins800939038759870143.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror epel-el6 x86_64 jenkins/data/mirrors-reposync.conf
Checking if mirror needs a resync
Traceback (most recent call last):
File "/usr/bin/reposync", line 343, in <module>
main()
File "/usr/bin/reposync", line 175, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1474, in _commonLoadRepoXML
if self._latestRepoXML(local):
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1443, in _latestRepoXML
oxml = self._saveOldRepoXML(local)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1300, in _saveOldRepoXML
shutil.copy2(local, old_local)
File "/usr/lib64/python2.7/shutil.py", line 131, in copy2
copystat(src, dst)
File "/usr/lib64/python2.7/shutil.py", line 98, in copystat
os.utime(dst, (st.st_atime, st.st_mtime))
OSError: [Errno 2] No such file or directory: '/home/jenkins/mirrors_cache/centos-extras-el7/repomd.xml.old.tmp'
Build step 'Execute shell' marked build as failure
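Here reposync died inside yum's cache rotation: _saveOldRepoXML copies repomd.xml to repomd.xml.old.tmp, and the copy2/copystat step found the file already gone under /home/jenkins/mirrors_cache/centos-extras-el7, which points at a corrupted or concurrently modified cache rather than at the upstream mirror. A minimal sketch of the kind of cleanup one could run before retrying, assuming the cache layout visible in the traceback (this helper is illustrative and not part of mirror_mgr.sh):

    # Hypothetical cleanup: drop the yum cache directory of the repo that
    # tripped the OSError so the next reposync run rebuilds it from scratch.
    import os
    import shutil

    CACHE_ROOT = '/home/jenkins/mirrors_cache'  # path taken from the traceback above

    def reset_repo_cache(repo_id):
        cache_dir = os.path.join(CACHE_ROOT, repo_id)
        if os.path.isdir(cache_dir):
            shutil.rmtree(cache_dir)

    reset_repo_cache('centos-extras-el7')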
[oVirt Jenkins] ovirt-system-tests_he-basic-ansible-suite-4.2 - Build # 630 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-4.2/
Build: http://jenkins.ovirt.org/job/ovirt-system-tests_he-basic-ansible-suite-4....
Build Number: 630
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #630
[Gal Ben Haim] Remove stale url to the internal repo
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 008_restart_he_vm.restart_he_vm
Error Message:
could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2
-------------------- >> begin captured logging << --------------------
lago.ssh: DEBUG: start task:6d60779b-e79e-47e9-998c-39cfc96cd6e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:6d60779b-e79e-47e9-998c-39cfc96cd6e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running ddf43558 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command ddf43558 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command ddf43558 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2528 (Fri Oct 5 20:07:50 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2528 (Fri Oct 5 20:07:50 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "0ae6d9ce", "local_conf_timestamp": 2528, "host-ts": 2528}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2525 (Fri Oct 5 20:07:48 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2525 (Fri Oct 5 20:07:48 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "94893d0b", "local_conf_timestamp": 2525, "host-ts": 2525}, "global_maintenance": true}
root: INFO: * Shutting down HE VM on host: lago-he-basic-ansible-suite-4-2-host-0
lago.ssh: DEBUG: start task:7b60163a-89bd-45bf-892b-6646944ca73c:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:7b60163a-89bd-45bf-892b-6646944ca73c:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running de6f6eee on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-shutdown
lago.ssh: DEBUG: Command de6f6eee on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
root: INFO: * Command succeeded
root: INFO: * Waiting for VM to be down...
lago.ssh: DEBUG: start task:79648c0a-2c58-4b9f-9378-35b1d2928ff2:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:79648c0a-2c58-4b9f-9378-35b1d2928ff2:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running dfced7d4 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command dfced7d4 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command dfced7d4 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2528 (Fri Oct 5 20:07:50 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2528 (Fri Oct 5 20:07:50 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"health": "good", "vm": "up", "detail": "Up"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "0ae6d9ce", "local_conf_timestamp": 2528, "host-ts": 2528}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2525 (Fri Oct 5 20:07:48 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2525 (Fri Oct 5 20:07:48 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "94893d0b", "local_conf_timestamp": 2525, "host-ts": 2525}, "global_maintenance": true}
lago.ssh: DEBUG: start task:a41bee7a-3df1-42b7-8f4e-8fc96f15783e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:a41bee7a-3df1-42b7-8f4e-8fc96f15783e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running e645a99e on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command e645a99e on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command e645a99e on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2538 (Fri Oct 5 20:08:00 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2538 (Fri Oct 5 20:08:01 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "failed liveliness check", "health": "bad", "vm": "up", "detail": "Powering down"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "4d8f1a0d", "local_conf_timestamp": 2538, "host-ts": 2538}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2535 (Fri Oct 5 20:07:58 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2535 (Fri Oct 5 20:07:58 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "62e7f5d6", "local_conf_timestamp": 2535, "host-ts": 2535}, "global_maintenance": true}
lago.ssh: DEBUG: start task:f94c5d2f-8f54-460d-a98d-e33e771daa29:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:f94c5d2f-8f54-460d-a98d-e33e771daa29:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running e894511e on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command e894511e on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command e894511e on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2538 (Fri Oct 5 20:08:00 2018)\nhost-id=1\nscore=3000\nvm_conf_refresh_time=2538 (Fri Oct 5 20:08:01 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "failed liveliness check", "health": "bad", "vm": "up", "detail": "Powering down"}, "score": 3000, "stopped": false, "maintenance": false, "crc32": "4d8f1a0d", "local_conf_timestamp": 2538, "host-ts": 2538}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2535 (Fri Oct 5 20:07:58 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2535 (Fri Oct 5 20:07:58 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=EngineDown\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "62e7f5d6", "local_conf_timestamp": 2535, "host-ts": 2535}, "global_maintenance": true}
lago.ssh: DEBUG: start task:fe6c5374-0d07-415c-86d2-d062e00b3dd8:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:fe6c5374-0d07-415c-86d2-d062e00b3dd8:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running eae90770 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command eae90770 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command eae90770 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2548 (Fri Oct 5 20:08:11 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2548 (Fri Oct 5 20:08:11 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "down_unexpected", "detail": "Down"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "7e8da80c", "local_conf_timestamp": 2548, "host-ts": 2548}, "2": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2545 (Fri Oct 5 20:08:07 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2545 (Fri Oct 5 20:08:08 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "cc7c8506", "local_conf_timestamp": 2545, "host-ts": 2545}, "global_maintenance": true}
root: INFO: * VM is down.
root: INFO: * Stopping services...
lago.ssh: DEBUG: start task:8bee4b95-9d41-4a07-8b01-22a28b0b68e1:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:8bee4b95-9d41-4a07-8b01-22a28b0b68e1:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running eb5d2844 on lago-he-basic-ansible-suite-4-2-host-0: systemctl stop vdsmd ovirt-ha-broker ovirt-ha-agent
lago.ssh: DEBUG: Command eb5d2844 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
root: INFO: * Starting services...
lago.ssh: DEBUG: start task:b7d48c4f-78d1-4522-b25f-631ceeba11ed:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:b7d48c4f-78d1-4522-b25f-631ceeba11ed:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running f30b9404 on lago-he-basic-ansible-suite-4-2-host-0: systemctl start vdsmd ovirt-ha-broker ovirt-ha-agent
lago.ssh: DEBUG: Command f30b9404 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
root: INFO: * Waiting for agent to be ready...
lago.ssh: DEBUG: start task:a906a664-8725-4391-986a-075cbeceb847:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:a906a664-8725-4391-986a-075cbeceb847:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running f5a174b8 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command f5a174b8 on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command f5a174b8 on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:6d97ee02-a35e-45dc-b9f6-d08eebcfdb73:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:6d97ee02-a35e-45dc-b9f6-d08eebcfdb73:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running fbfa66c6 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command fbfa66c6 on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command fbfa66c6 on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:81a47930-f5a7-4d9a-bf95-c516a4d20119:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:81a47930-f5a7-4d9a-bf95-c516a4d20119:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running fe1c7976 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command fe1c7976 on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command fe1c7976 on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:a09e616e-557f-46cf-96da-4c1d08e6c20c:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:a09e616e-557f-46cf-96da-4c1d08e6c20c:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 005dceec on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 005dceec on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command 005dceec on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:680788ce-4a96-456e-939e-4187ba02ecf9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:680788ce-4a96-456e-939e-4187ba02ecf9:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 0289b564 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 0289b564 on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command 0289b564 on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:d677c081-7afd-49ed-83bc-9e42703f5926:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:d677c081-7afd-49ed-83bc-9e42703f5926:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 04b2c16e on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 04b2c16e on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command 04b2c16e on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:3ec69661-9088-4107-a61a-78d064f23194:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:3ec69661-9088-4107-a61a-78d064f23194:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 06dafbb4 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 06dafbb4 on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command 06dafbb4 on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:a7313617-2000-4cba-b1f3-2b0430d3d031:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:a7313617-2000-4cba-b1f3-2b0430d3d031:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 0909032c on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 0909032c on lago-he-basic-ansible-suite-4-2-host-0 returned with 1
lago.ssh: DEBUG: Command 0909032c on lago-he-basic-ansible-suite-4-2-host-0 output:
The hosted engine configuration has not been retrieved from shared storage. Please ensure that ovirt-ha-agent is running and the storage server is reachable.
lago.ssh: DEBUG: start task:b749c52f-0363-47af-8b07-12d48fbc0f89:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:b749c52f-0363-47af-8b07-12d48fbc0f89:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 0b3262ec on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status
lago.ssh: DEBUG: Command 0b3262ec on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 0b3262ec on lago-he-basic-ansible-suite-4-2-host-0 output:
!! Cluster is in GLOBAL MAINTENANCE mode !!
--== Host 1 status ==--
conf_on_shared_storage : True
Status up-to-date : True
Hostname : lago-he-basic-ansible-suite-4-2-host-0
Host ID : 1
Engine status : {"reason": "bad vm status", "health": "bad", "vm": "down", "detail": "Down"}
Score : 0
stopped : False
Local maintenance : False
crc32 : 6d4d94fa
local_conf_timestamp : 2604
Host timestamp : 2604
Extra metadata (valid at timestamp):
metadata_parse_version=1
metadata_feature_version=1
timestamp=2604 (Fri Oct 5 20:09:06 2018)
host-id=1
score=0
vm_conf_refresh_time=2604 (Fri Oct 5 20:09:06 2018)
conf_on_shared_storage=True
maintenance=False
state=ReinitializeFSM
stopped=False
--== Host 2 status ==--
conf_on_shared_storage : True
Status up-to-date : False
Hostname : lago-he-basic-ansible-suite-4-2-host-1
Host ID : 2
Engine status : unknown stale-data
Score : 3400
stopped : False
Local maintenance : False
crc32 : 8bf93dc1
local_conf_timestamp : 2595
Host timestamp : 2595
Extra metadata (valid at timestamp):
metadata_parse_version=1
metadata_feature_version=1
timestamp=2595 (Fri Oct 5 20:08:58 2018)
host-id=2
score=3400
vm_conf_refresh_time=2595 (Fri Oct 5 20:08:58 2018)
conf_on_shared_storage=True
maintenance=False
state=GlobalMaintenance
stopped=False
!! Cluster is in GLOBAL MAINTENANCE mode !!
root: INFO: * Agent is ready.
root: INFO: * Starting VM...
lago.ssh: DEBUG: start task:ca0f6687-e4a5-4a45-a33b-851a4b0a27fb:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:ca0f6687-e4a5-4a45-a33b-851a4b0a27fb:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-start
lago.ssh: DEBUG: Command 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0 output:
VM exists and is down, cleaning up and restarting
root: INFO: * Command succeeded
root: INFO: * Waiting for VM to be UP...
lago.ssh: DEBUG: start task:13617503-7603-4b90-bb3e-7c88ee81e50e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:13617503-7603-4b90-bb3e-7c88ee81e50e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2604 (Fri Oct 5 20:09:06 2018)\nhost-id=1\nscore=0\nvm_conf_refresh_time=2604 (Fri Oct 5 20:09:06 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=ReinitializeFSM\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "down", "detail": "Down"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "6d4d94fa", "local_conf_timestamp": 2604, "host-ts": 2604}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2595 (Fri Oct 5 20:08:58 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2595 (Fri Oct 5 20:08:58 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "8bf93dc1", "local_conf_timestamp": 2595, "host-ts": 2595}, "global_maintenance": true}
lago.ssh: DEBUG: start task:28442f55-f66b-4ae8-8be6-dc95642b20e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:28442f55-f66b-4ae8-8be6-dc95642b20e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 137469e6 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 137469e6 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 137469e6 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "71e9ed52", "local_conf_timestamp": 2605, "host-ts": 2605}, "global_maintenance": true}
root: INFO: * VM is UP.
root: INFO: * Waiting for engine to start...
lago.ssh: DEBUG: start task:7c749b65-99a0-4890-b105-d81df4cd8b2e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: end task:7c749b65-99a0-4890-b105-d81df4cd8b2e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:
lago.ssh: DEBUG: Running 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json
lago.ssh: DEBUG: Command 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0
lago.ssh: DEBUG: Command 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0 output:
{"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2
ovirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7feaff4389b0>
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 187, in <lambda>
for k, v in _get_he_status(host).items()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
RuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\nhost-id=1\nscore=3400\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\nmetadata_feature_version=1\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\nhost-id=2\nscore=3400\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\nconf_on_shared_storage=True\nmaintenance=False\nstate=GlobalMaintenance\nstopped=False\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2
--------------------- >> end captured logging << ---------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 53, in restart_he_vm
_wait_for_engine_health(host)
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 185, in _wait_for_engine_health
testlib.assert_true_within_long(lambda: any(
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 286, in assert_true_within_long
assert_equals_within_long(func, True, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 273, in assert_equals_within_long
func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within
res = func()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 187, in <lambda>
for k, v in _get_he_status(host).items()
File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status
raise RuntimeError('could not parse JSON: %s' % ret.out)
'could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2
Cluster is in GLOBAL MAINTENANCE mode !!\n\n\nroot: INFO: * Agent is ready.\nroot: INFO: * Starting VM...\nlago.ssh: DEBUG: start task:ca0f6687-e4a5-4a45-a33b-851a4b0a27fb:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:ca0f6687-e4a5-4a45-a33b-851a4b0a27fb:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-start\nlago.ssh: DEBUG: Command 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 0bbc3922 on lago-he-basic-ansible-suite-4-2-host-0 output:\n VM exists and is down, cleaning up and restarting\n\nroot: INFO: * Command succeeded\nroot: INFO: * Waiting for VM to be UP...\nlago.ssh: DEBUG: start task:13617503-7603-4b90-bb3e-7c88ee81e50e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:13617503-7603-4b90-bb3e-7c88ee81e50e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 0cf57dbc on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2604 (Fri Oct 5 20:09:06 2018)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=2604 (Fri Oct 5 20:09:06 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=ReinitializeFSM\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "down", "detail": "Down"}, "score": 0, "stopped": false, "maintenance": false, "crc32": "6d4d94fa", "local_conf_timestamp": 2604, "host-ts": 2604}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2595 (Fri Oct 5 20:08:58 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2595 (Fri Oct 5 20:08:58 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "8bf93dc1", "local_conf_timestamp": 2595, "host-ts": 2595}, "global_maintenance": true}\n\nlago.ssh: DEBUG: start task:28442f55-f66b-4ae8-8be6-dc95642b20e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:28442f55-f66b-4ae8-8be6-dc95642b20e4:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 137469e6 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 137469e6 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 137469e6 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, 
"engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2, "engine-status": {"reason": "vm not running on this host", "health": "bad", "vm": "down", "detail": "unknown"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "71e9ed52", "local_conf_timestamp": 2605, "host-ts": 2605}, "global_maintenance": true}\n\nroot: INFO: * VM is UP.\nroot: INFO: * Waiting for engine to start...\nlago.ssh: DEBUG: start task:7c749b65-99a0-4890-b105-d81df4cd8b2e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: end task:7c749b65-99a0-4890-b105-d81df4cd8b2e:Get ssh client for lago-he-basic-ansible-suite-4-2-host-0:\nlago.ssh: DEBUG: Running 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0: hosted-engine --vm-status --json\nlago.ssh: DEBUG: Command 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0 returned with 0\nlago.ssh: DEBUG: Command 13faa7d6 on lago-he-basic-ansible-suite-4-2-host-0 output:\n {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2\novirtlago.testlib: ERROR: * Unhandled exception in <function <lambda> at 0x7feaff4389b0>\nTraceback (most recent call last):\n File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 234, in assert_equals_within\n res = func()\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 187, in <lambda>\n for k, v in _get_he_status(host).items()\n File "/home/jenkins/workspace/ovirt-system-tests_he-basic-ansible-suite-4.2/ovirt-system-tests/he-basic-ansible-suite-4.2/test-scenarios/008_restart_he_vm.py", line 128, in _get_he_status\n raise RuntimeError(\'could not parse JSON: %s\' % ret.out)\nRuntimeError: could not parse JSON: {"1": {"conf_on_shared_storage": true, "live-data": true, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2614 (Fri Oct 5 20:09:16 2018)\\nhost-id=1\\nscore=3400\\nvm_conf_refresh_time=2614 (Fri Oct 5 20:09:17 
2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-0", "host-id": 1, "engine-status": {"reason": "bad vm status", "health": "bad", "vm": "up", "detail": "Powering up"}, "score": 3400, "stopped": false, "maintenance": false, "crc32": "e0beb128", "local_conf_timestamp": 2614, "host-ts": 2614}, "2": {"conf_on_shared_storage": true, "live-data": false, "extra": "metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=2605 (Fri Oct 5 20:09:08 2018)\\nhost-id=2\\nscore=3400\\nvm_conf_refresh_time=2605 (Fri Oct 5 20:09:08 2018)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=GlobalMaintenance\\nstopped=False\\n", "hostname": "lago-he-basic-ansible-suite-4-2-host-1", "host-id": 2\n--------------------- >> end captured logging << ---------------------'
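
The traceback captured above comes from the _get_he_status helper in 008_restart_he_vm.py, which feeds the output of "hosted-engine --vm-status --json" to a JSON parser and raises RuntimeError when decoding fails; in this run the output was cut off right after "host-id": 2, so it could not be parsed. A minimal sketch, with a hypothetical run_on_host() helper standing in for the suite's SSH wrapper, of a status poller that retries on such a short read instead of failing on the first bad parse:

    import json
    import subprocess
    import time


    def run_on_host(host, cmd):
        """Hypothetical stand-in for the suite's SSH wrapper: run cmd on
        host over plain ssh and return (exit_code, stdout)."""
        proc = subprocess.Popen(['ssh', host] + cmd,
                                stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE)
        out, _ = proc.communicate()
        return proc.returncode, out


    def get_he_status(host, attempts=5, delay=3):
        """Poll 'hosted-engine --vm-status --json' until it parses as JSON.

        A truncated read (as in the captured log above) makes json.loads
        raise ValueError; retrying a few times avoids aborting the whole
        scenario on a single short read."""
        last_out = None
        for _ in range(attempts):
            code, out = run_on_host(host,
                                    ['hosted-engine', '--vm-status', '--json'])
            last_out = out
            if code == 0:
                try:
                    return json.loads(out)
                except ValueError:
                    pass
            time.sleep(delay)
        raise RuntimeError('could not parse JSON after %d attempts: %s'
                           % (attempts, last_out))
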
Build failed in Jenkins: system-mk_mirrors_index-yum #39408
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-mk_mirrors_index-yum/39408/display/re...>
------------------------------------------
Started by upstream project "system-sync_mirrors-fedora-updates-fc28-x86_64" build number 473
originally caused by:
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-mk_mirrors_index-yum/ws/>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from http://gerrit.ovirt.org/jenkins.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1798)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune" returned status code 128:
stdout:
stderr: fatal: unable to access 'http://gerrit.ovirt.org/jenkins.git/': The requested URL returned error: 503
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2016)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:1735)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.access$300(CliGitAPIImpl.java:72)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:420)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
at hudson.remoting.UserRequest.perform(UserRequest.java:212)
at hudson.remoting.UserRequest.perform(UserRequest.java:54)
at hudson.remoting.Request$2.run(Request.java:369)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to mirrors.phx.ovirt.org
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
at hudson.remoting.Channel.call(Channel.java:955)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:146)
at sun.reflect.GeneratedMethodAccessor278.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:132)
at com.sun.proxy.$Proxy82.execute(Unknown Source)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:886)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1798)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
ERROR: Error fetching remote repo 'origin'
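
The fetch above failed because gerrit.ovirt.org answered with HTTP 503, so git exited with status 128 and the Jenkins git plugin gave up after a single attempt. A rough sketch, reusing the exact fetch arguments shown in the log (the attempt count and delays are arbitrary placeholders, not values used by this job), of a retry-with-backoff wrapper that could ride out a short gerrit outage:

    import subprocess
    import time


    def fetch_with_retry(url, refspec='+refs/heads/*:refs/remotes/origin/*',
                         attempts=4, base_delay=30):
        """Retry git fetch with a growing delay between attempts.

        A transient 503 from the gerrit frontend surfaces as git exit code
        128 ("unable to access ..."), indistinguishable here from a permanent
        error, so the wrapper simply retries a bounded number of times."""
        for attempt in range(1, attempts + 1):
            rc = subprocess.call(['git', 'fetch', '--tags', '--progress',
                                  url, refspec, '--prune'])
            if rc == 0:
                return
            if attempt < attempts:
                time.sleep(base_delay * attempt)
        raise RuntimeError('git fetch failed %d times for %s' % (attempts, url))


    if __name__ == '__main__':
        fetch_with_retry('http://gerrit.ovirt.org/jenkins.git')
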
Build failed in Jenkins: deploy-to_ovirt-master_tested #5181
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/5181/display/r...>
------------------------------------------
[...truncated 192.20 KB...]
2018-10-05 13:52:49,276::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm_master_build-artifacts-fc28-x86_64/204/..., length 1M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:49,384::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-tests-4.30.0-619.git9252d21fb.fc28.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:49,409::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm_master_build-artifacts-fc28-x86_64/204/..., length 38K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:49,485::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-yajsonrpc-4.30.0-619.git9252d21fb.fc28.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:49,486::INFO::repoman.common.repo::Resolving artifact source jenkins:http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artif...
2018-10-05 13:52:49,577::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7...
2018-10-05 13:52:49,580::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7...
2018-10-05 13:52:49,580::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7...
2018-10-05 13:52:49,581::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7...
2018-10-05 13:52:49,581::INFO::root:: Done
2018-10-05 13:52:49,607::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7..., length 123K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:49,709::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-1.4.15-2.20181005112302.gitc3f5fa5.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:49,759::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7..., length 150K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:49,842::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-1.4.15-2.20181005112302.gitc3f5fa5.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:49,868::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-el7..., length 119K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:49,950::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-javadoc-1.4.15-2.20181005112302.gitc3f5fa5.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:49,951::INFO::repoman.common.repo::Resolving artifact source jenkins:http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artif...
2018-10-05 13:52:50,035::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2...
2018-10-05 13:52:50,037::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2...
2018-10-05 13:52:50,037::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2...
2018-10-05 13:52:50,038::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2...
2018-10-05 13:52:50,038::INFO::root:: Done
2018-10-05 13:52:50,062::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2..., length 128K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:50,139::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-1.4.15-2.20181005112644.gitc3f5fa5.fc28.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:50,195::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2..., length 162K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:50,279::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-1.4.15-2.20181005112644.gitc3f5fa5.fc28.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:50,303::INFO::root::Downloading http://jenkins.ovirt.org/job/vdsm-jsonrpc-java_master_build-artifacts-fc2..., length 122K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:50,387::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/vdsm-jsonrpc-java-javadoc-1.4.15-2.20181005112644.gitc3f5fa5.fc28.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:50,388::INFO::repoman.common.repo::Resolving artifact source jenkins:http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-se...
2018-10-05 13:52:51,142::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan...
2018-10-05 13:52:51,143::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan...
2018-10-05 13:52:51,143::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan...
2018-10-05 13:52:51,145::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan...
2018-10-05 13:52:51,145::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan...
2018-10-05 13:52:51,146::INFO::root:: Done
2018-10-05 13:52:51,169::INFO::root::Downloading http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan..., length 36K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:51,245::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005120739.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:51,270::INFO::root::Downloading http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan..., length 31K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:51,356::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005120739.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:51,382::INFO::root::Downloading http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan..., length 39K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:51,456::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005121232.fc28.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:51,506::INFO::root::Downloading http://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_stan..., length 36K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-10-05 13:52:51,624::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005121232.fc28.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:51,625::INFO::repoman.cmd::
2018-10-05 13:52:51,625::INFO::repoman.common.stores.RPM::Saving new added rpms into /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:51,627::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/ppc64le/vdsm-4.30.0-619.git9252d21.el7.ppc64le.rpm
2018-10-05 13:52:51,632::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/SRPMS/vdsm-4.30.0-619.git9252d21.el7.src.rpm
2018-10-05 13:52:51,633::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-api-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,634::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-client-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,636::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-common-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,636::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/ppc64le/vdsm-gluster-4.30.0-619.git9252d21.el7.ppc64le.rpm
2018-10-05 13:52:51,637::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-allocate_net-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,637::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-boot_hostdev-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,637::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-checkimages-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,638::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/ppc64le/vdsm-hook-checkips-4.30.0-619.git9252d21.el7.ppc64le.rpm
2018-10-05 13:52:51,638::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-cpuflags-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,638::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-diskunmap-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,638::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-ethtool-options-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,639::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-extnet-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,639::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/ppc64le/vdsm-hook-extra-ipv4-addrs-4.30.0-619.git9252d21.el7.ppc64le.rpm
2018-10-05 13:52:51,639::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-fakevmstats-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,639::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-faqemu-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,640::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-fcoe-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,640::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-fileinject-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,640::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-floppy-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,640::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-httpsisoboot-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,641::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-isolatedprivatevlan-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,641::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-localdisk-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,641::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-macbind-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,641::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-macspoof-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,642::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-nestedvt-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,642::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-noipspoof-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,642::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-numa-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,643::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-openstacknet-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,643::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-pincpu-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,643::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-promisc-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,644::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-qemucmdline-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,644::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-qos-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,644::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-scratchpad-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,644::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-smbios-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,645::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-spiceoptions-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,645::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-vhostmd-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,645::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-vmdisk-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,646::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-hook-vmfex-dev-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,646::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-http-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,646::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-jsonrpc-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,646::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/ppc64le/vdsm-network-4.30.0-619.git9252d21.el7.ppc64le.rpm
2018-10-05 13:52:51,647::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-python-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,647::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-tests-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,647::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-yajsonrpc-4.30.0-619.git9252d21.el7.noarch.rpm
2018-10-05 13:52:51,648::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/x86_64/vdsm-4.30.0-619.git9252d21.el7.x86_64.rpm
2018-10-05 13:52:51,648::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/x86_64/vdsm-gluster-4.30.0-619.git9252d21.el7.x86_64.rpm
2018-10-05 13:52:51,649::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/x86_64/vdsm-hook-checkips-4.30.0-619.git9252d21.el7.x86_64.rpm
2018-10-05 13:52:51,649::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/x86_64/vdsm-hook-extra-ipv4-addrs-4.30.0-619.git9252d21.el7.x86_64.rpm
2018-10-05 13:52:51,651::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/x86_64/vdsm-network-4.30.0-619.git9252d21.el7.x86_64.rpm
2018-10-05 13:52:51,652::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/s390x/vdsm-4.30.0-619.git9252d21fb.fc28.s390x.rpm
2018-10-05 13:52:51,652::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/SRPMS/vdsm-4.30.0-619.git9252d21fb.fc28.src.rpm
2018-10-05 13:52:51,652::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-api-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,653::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-client-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,653::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-common-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,653::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/s390x/vdsm-gluster-4.30.0-619.git9252d21fb.fc28.s390x.rpm
2018-10-05 13:52:51,654::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-allocate_net-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,654::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-boot_hostdev-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,654::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-checkimages-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,654::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/s390x/vdsm-hook-checkips-4.30.0-619.git9252d21fb.fc28.s390x.rpm
2018-10-05 13:52:51,655::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-cpuflags-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,655::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-diskunmap-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,655::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-ethtool-options-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,656::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-extnet-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,656::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/s390x/vdsm-hook-extra-ipv4-addrs-4.30.0-619.git9252d21fb.fc28.s390x.rpm
2018-10-05 13:52:51,656::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-fakevmstats-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,656::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-faqemu-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,657::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-fcoe-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,657::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-fileinject-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,657::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-floppy-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,657::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-httpsisoboot-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,658::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-isolatedprivatevlan-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,658::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-localdisk-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,658::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-macbind-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,658::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-macspoof-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,659::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-nestedvt-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,659::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-noipspoof-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,659::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-numa-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,659::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-openstacknet-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,660::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-pincpu-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,660::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-promisc-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,660::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-qemucmdline-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,660::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-qos-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,660::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-scratchpad-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,661::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-smbios-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,661::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-spiceoptions-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,661::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-vhostmd-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,661::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-vmdisk-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,662::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-hook-vmfex-dev-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,662::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-http-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,662::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-jsonrpc-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,662::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/s390x/vdsm-network-4.30.0-619.git9252d21fb.fc28.s390x.rpm
2018-10-05 13:52:51,663::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-python-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,663::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-tests-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,663::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-yajsonrpc-4.30.0-619.git9252d21fb.fc28.noarch.rpm
2018-10-05 13:52:51,663::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/x86_64/vdsm-4.30.0-619.git9252d21fb.fc28.x86_64.rpm
2018-10-05 13:52:51,664::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/x86_64/vdsm-gluster-4.30.0-619.git9252d21fb.fc28.x86_64.rpm
2018-10-05 13:52:51,664::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/x86_64/vdsm-hook-checkips-4.30.0-619.git9252d21fb.fc28.x86_64.rpm
2018-10-05 13:52:51,665::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/x86_64/vdsm-hook-extra-ipv4-addrs-4.30.0-619.git9252d21fb.fc28.x86_64.rpm
2018-10-05 13:52:51,666::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/x86_64/vdsm-network-4.30.0-619.git9252d21fb.fc28.x86_64.rpm
2018-10-05 13:52:51,667::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-jsonrpc-java-1.4.15-2.20181005112302.gitc3f5fa5.el7.noarch.rpm
2018-10-05 13:52:51,667::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/SRPMS/vdsm-jsonrpc-java-1.4.15-2.20181005112302.gitc3f5fa5.el7.src.rpm
2018-10-05 13:52:51,667::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/vdsm-jsonrpc-java-javadoc-1.4.15-2.20181005112302.gitc3f5fa5.el7.noarch.rpm
2018-10-05 13:52:51,667::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-jsonrpc-java-1.4.15-2.20181005112644.gitc3f5fa5.fc28.noarch.rpm
2018-10-05 13:52:51,667::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/SRPMS/vdsm-jsonrpc-java-1.4.15-2.20181005112644.gitc3f5fa5.fc28.src.rpm
2018-10-05 13:52:51,668::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/vdsm-jsonrpc-java-javadoc-1.4.15-2.20181005112644.gitc3f5fa5.fc28.noarch.rpm
2018-10-05 13:52:51,668::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/noarch/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005120739.el7.noarch.rpm
2018-10-05 13:52:51,668::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/el7/SRPMS/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005120739.el7.src.rpm
2018-10-05 13:52:51,668::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/noarch/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005121232.fc28.noarch.rpm
2018-10-05 13:52:51,669::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master/rpm/fc28/SRPMS/ovirt-ansible-hosted-engine-setup-1.0.2-0.1.master.20181005121232.fc28.src.rpm
2018-10-05 13:52:51,669::INFO::repoman.common.stores.RPM::
2018-10-05 13:52:51,669::INFO::repoman.common.stores.RPM::Updating metadata
2018-10-05 13:52:51,669::INFO::repoman.common.stores.RPM:: Creating metadata for el7
2018-10-05 13:52:51,817::INFO::repoman.common.stores.RPM:: Creating metadata for fc28
2018-10-05 13:52:55,043::INFO::repoman.common.stores.RPM::
2018-10-05 13:52:55,047::INFO::repoman.common.stores.RPM::Creating symlinks
2018-10-05 13:52:55,048::INFO::repoman.common.stores.RPM::
2018-10-05 13:52:55,048::INFO::repoman.common.stores.RPM::Saved /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:55,049::INFO::repoman.common.stores.iso::Saving new added isos into /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:55,049::INFO::repoman.common.stores.iso::
2018-10-05 13:52:55,049::INFO::repoman.common.stores.iso::Saved /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/master
2018-10-05 13:52:55,082::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy/tmpOOfaoD
2018-10-05 13:52:55,084::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL/.lago_tmp/tmpXC3ICy
Publishing to repo
+ echo 'Publishing to repo'
+ push_to_tested /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL /srv/resources/repos/ovirt/tested
+ local pkg_src=/srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL
+ local pkg_dst=/srv/resources/repos/ovirt/tested
+ cd /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL
+ find . -type d '!' -name repodata
+ tac
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./.lago_tmp
+ find ./.lago_tmp -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./.lago_tmp
+ [[ -d ./.lago_tmp/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ find ./master/rpm/el7/noarch -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ [[ -d ./master/rpm/el7/noarch/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/x86_64
+ find ./master/rpm/el7/x86_64 -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/x86_64
+ [[ -d ./master/rpm/el7/x86_64/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ find ./master/rpm/el7/SRPMS -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ [[ -d ./master/rpm/el7/SRPMS/repodata ]]
+ xargs -L 1 -P 8 -r rm -f
+ comm -23 /dev/fd/63 /dev/fd/62
++ sort
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
++ sort
++ find /srv/resources/repos/ovirt/tested/master/rpm/el7/SRPMS -name '*.rpm' -type f -mtime +14
+ createrepo_c --update --retain-old-md 50 --workers 8 /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
Directory walk started
Directory walk done - 188 packages
Loaded information about 185 packages
Temporary output repo path: /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS/.repodata/
Preparing sqlite DBs
Pool started (with 8 workers)
Pool finished
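
The push_to_tested trace above also shows how old packages are pruned before the metadata rebuild: RPMs older than 14 days are listed with find -mtime +14, the builds that repomanage -k1 --new wants to keep are subtracted with comm -23, the remainder is deleted, and createrepo_c --update --retain-old-md 50 --workers 8 regenerates the repodata. A short sketch of that prune step in Python, under the assumption that the comm -23 really does mean "old files minus the files repomanage keeps":

    import os
    import subprocess
    import time


    def prune_old_rpms(repo_dir, keep_days=14):
        """Remove RPMs older than keep_days unless repomanage keeps them.

        Mirrors the assumed semantics of the find / repomanage / comm -23
        pipeline in the trace above; both sets must use the same path form
        for the set difference to make sense."""
        cutoff = time.time() - keep_days * 86400
        old = set()
        for root, _dirs, files in os.walk(repo_dir):
            for name in files:
                path = os.path.join(root, name)
                if name.endswith('.rpm') and os.path.getmtime(path) < cutoff:
                    old.add(path)

        # repomanage -k1 --new is expected to print the newest build of each
        # package, one path per line (flags taken verbatim from the trace).
        keep = set(subprocess.check_output(
            ['repomanage', '-k1', '--new', '-c', repo_dir]).decode().split())

        for path in sorted(old - keep):
            os.remove(path)
        # The deploy script then runs createrepo_c --update on the same
        # directory to regenerate the repository metadata.
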
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/ppc64le
+ find ./master/rpm/el7/ppc64le -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/ppc64le
+ [[ -d ./master/rpm/el7/ppc64le/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7
+ find ./master/rpm/el7 -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7
+ [[ -d ./master/rpm/el7/repodata ]]
+ xargs -L 1 -P 8 -r rm -f
+ comm -23 /dev/fd/63 /dev/fd/62
++ sort
++ find /srv/resources/repos/ovirt/tested/master/rpm/el7 -name '*.rpm' -type f -mtime +14
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./master/rpm/el7
++ sort
+ createrepo_c --update --retain-old-md 50 --workers 8 /srv/resources/repos/ovirt/tested/./master/rpm/el7
/home/deploy-ovirt-experimental/deploy-to-tested.sh: line 33: 53432 Killed createrepo_c --update --retain-old-md "$PUBLISH_MD_COPIES" --workers 8 "$pkg_dst/$dir"
+ rm -rf /srv/resources/repos/ovirt/tested/.deploy.Pd92uy6NaL
Build step 'Execute shell' marked build as failure
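
The actual failure is a few lines up: createrepo_c was Killed while rebuilding master/rpm/el7. A bare "Killed" from the shell means the process received SIGKILL, which on a loaded host is often the kernel's OOM killer; the staging directory was then removed (the rm -rf above) and the build step marked as failed. A small sketch, not taken from deploy-to-tested.sh, of retrying the rebuild with fewer workers when the child dies from a signal:

    import subprocess


    def rebuild_metadata(repo_dir, md_copies=50, workers=(8, 4, 1)):
        """Run createrepo_c, dropping the worker count if the process is killed.

        subprocess.call returns a negative code when the child died from a
        signal (e.g. -9 after an OOM kill); that is the only case retried."""
        rc = None
        for n in workers:
            rc = subprocess.call(['createrepo_c', '--update',
                                  '--retain-old-md', str(md_copies),
                                  '--workers', str(n), repo_dir])
            if rc == 0:
                return
            if rc > 0:
                break  # ordinary failure, not a signal; retrying won't help
        raise RuntimeError('createrepo_c failed on %s (rc=%s)' % (repo_dir, rc))
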
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 19974 killed;
[ssh-agent] Stopped.
[JENKINS] Failed to set up project
kubevirt_kubevirt_standard-check-pr
by jenkins@jenkins.phx.ovirt.org
Failed to run project_setup.sh for:
#2106 kubevirt [check-patch].
It probably means that docker_cleanup.py failed.
This step doesn't fail the job, but we do collect
data about such failures to find the root cause.
Infra owner, ensure that we're not running out of
disk space on ovirt-srv04.phx.ovirt.org-container-5.
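
The notice above asks the infra owner to verify free disk space on the container host. A brief sketch, with an arbitrary threshold and an illustrative path that are not taken from the oVirt CI scripts, of the kind of check that could run alongside docker_cleanup.py:

    import shutil


    def has_free_space(path, min_free_gib=20):
        """Return True if path has at least min_free_gib GiB free.

        The 20 GiB threshold is a placeholder, not a value from the CI code."""
        usage = shutil.disk_usage(path)
        return usage.free / (1024 ** 3) >= min_free_gib


    if __name__ == '__main__':
        # /var/lib/docker is an illustrative path, not taken from the job config.
        if not has_free_space('/var/lib/docker'):
            raise SystemExit('low disk space on /var/lib/docker')
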