[oVirt Jenkins] ovirt-system-tests_basic-suite-master_nightly - Build # 277 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-master_nightly/
Build: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-master_night...
Build Number: 277
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #277
[Marcin Sobczyk] ost_utils: ansible: Sort events by creation date
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: basic-suite-master.test-scenarios.004_basic_sanity_pytest.test_sparsify_disk1
Error Message:
AssertionError: False != True after 600 seconds
Stack Trace:
api_v4 = <ovirtsdk4.Connection object at 0x7f09db47a850>
@order_by(_TEST_LIST)
def test_sparsify_disk1(api_v4):
engine = api_v4.system_service()
disk_service = test_utils.get_disk_service(engine, DISK1_NAME)
with test_utils.TestEvent(engine, 1325): # USER_SPARSIFY_IMAGE_START event
disk_service.sparsify()
with test_utils.TestEvent(engine, 1326): # USER_SPARSIFY_IMAGE_FINISH_SUCCESS
> pass
../basic-suite-master/test-scenarios/004_basic_sanity_pytest.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python2.7/contextlib.py:24: in __exit__
self.gen.next()
../basic-suite-master/test_utils/__init__.py:252: in TestEvent
lambda:
/usr/lib/python2.7/site-packages/ovirtlago/testlib.py:286: in assert_true_within_long
assert_equals_within_long(func, True, allowed_exceptions)
/usr/lib/python2.7/site-packages/ovirtlago/testlib.py:273: in assert_equals_within_long
func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
func = <function <lambda> at 0x7f09db4781b8>, value = True, timeout = 600
allowed_exceptions = [], initial_wait = 0
def assert_equals_within(
func, value, timeout, allowed_exceptions=None, initial_wait=10
):
allowed_exceptions = allowed_exceptions or []
with utils.EggTimer(timeout) as timer:
while not timer.elapsed():
try:
res = func()
if res == value:
return
except Exception as exc:
if _instance_of_any(exc, allowed_exceptions):
time.sleep(3)
continue
LOGGER.exception("Unhandled exception in %s", func)
raise
if initial_wait == 0:
time.sleep(3)
else:
time.sleep(initial_wait)
initial_wait = 0
try:
raise AssertionError(
> '%s != %s after %s seconds' % (res, value, timeout)
)
E AssertionError: False != True after 600 seconds
/usr/lib/python2.7/site-packages/ovirtlago/testlib.py:252: AssertionError
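The traceback shows ovirtlago's polling assertion at work: TestEvent wraps the event check in a lambda and retries it until it returns True or LONG_TIMEOUT (600 s) expires. A minimal sketch of that retry pattern (the name mirrors testlib.py, but this is an illustration only, with the allowed_exceptions handling omitted):

```python
import time


def assert_equals_within(func, value, timeout, interval=3):
    """Poll func() until it returns value; fail after timeout seconds.

    Simplified sketch of the loop in ovirtlago.testlib shown in the
    traceback above, not the library code itself.
    """
    deadline = time.monotonic() + timeout
    res = None
    while time.monotonic() < deadline:
        res = func()
        if res == value:
            return
        time.sleep(interval)
    raise AssertionError('%s != %s after %s seconds' % (res, value, timeout))
```

When the USER_SPARSIFY_IMAGE_FINISH_SUCCESS event (1326) never arrives, the loop exhausts its budget and raises exactly the "False != True after 600 seconds" seen above.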
Build failed in Jenkins: deploy-to_ovirt-master_tested #12534
by jenkins@jenkins.phx.ovirt.org
See <https://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/12534/display...>
Changes:
------------------------------------------
Started by upstream project "ovirt-master_change-queue-tester" build number 25422
originally caused by:
Started by upstream project "ovirt-master_change-queue" build number 69346
originally caused by:
Started by upstream project "ovirt-system-tests_standard-on-merge" build number 1177
originally caused by:
Triggered by Gerrit: https://gerrit.ovirt.org/110617
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on vm0010.workers-phx.ovirt.org (el7 phx nested) in workspace <https://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/>
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-Bq2kYpEEXnfb/agent.28835
SSH_AGENT_PID=28840
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: <https://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/priva...> (<https://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/priva...)>
[ssh-agent] Using credentials deploy-ovirt-experimental (SSH key for deploying to the tested repo)
[deploy-to_ovirt-master_tested] $ /bin/bash -xe /tmp/jenkins4317454223203596458.sh
+ [[ jenkins:https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merg... == '' ]]
+ queue_name=ovirt-master
+ echo repo-extra-dir:master
+ ssh -o StrictHostKeyChecking=no deploy-ovirt-experimental(a)resources.ovirt.org
+ echo jenkins:https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merg...
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added 'resources.ovirt.org,66.187.230.28' (ECDSA) to the list of known hosts.
+ BASE_DIR=/srv/resources/repos/ovirt/tested
+ PUBLISH_MD_COPIES=50
+ main
+ local tmp_dir
+ mkdir -p /srv/resources/repos/ovirt/tested
++ mktemp -d /srv/resources/repos/ovirt/tested/.deploy.XXXXXXXXXX
Collecting packages
+ tmp_dir=/srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
+ trap 'rm -rf '\''/srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1'\''' EXIT HUP
+ echo 'Collecting packages'
+ collect_packages /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
+ local repoman_dst=/srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
+ repoman --temp-dir generate-in-repo --option main.allowed_repo_paths=/srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1 --option main.on_empty_source=warn /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1 add conf:stdin
2020-08-17 11:40:53,560::INFO::repoman.cmd::
2020-08-17 11:40:53,560::INFO::repoman.cmd::Adding artifacts to the repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
2020-08-17 11:40:53,560::INFO::repoman.common.repo::Adding repo extra dir master
2020-08-17 11:40:53,563::INFO::repoman.common.stores.RPM::Loading repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:53,563::INFO::repoman.common.stores.RPM::Repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master loaded
2020-08-17 11:40:53,564::INFO::repoman.common.stores.iso::Loading repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:53,564::INFO::repoman.common.stores.iso::Repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master loaded
2020-08-17 11:40:53,574::INFO::repoman.common.repo::Resolving artifact source jenkins:https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merg...
2020-08-17 11:40:54,130::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merge/1177/
2020-08-17 11:40:54,132::INFO::repoman.common.sources.jenkins:: Got URL: https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merge/1177//...
2020-08-17 11:40:54,132::INFO::repoman.common.sources.jenkins:: Got URL: https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merge/1177//...
2020-08-17 11:40:54,133::INFO::root:: Done
2020-08-17 11:40:54,190::INFO::root::Downloading https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merge/1177//..., length 19K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2020-08-17 11:40:54,304::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/.lago_tmp/tmpcUFtj2/tmpA74IB1/ovirtlib-0.0.1-0.20200817112919.git2082307.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,361::INFO::root::Downloading https://jenkins.ovirt.org/job/ovirt-system-tests_standard-on-merge/1177//..., length 47K ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2020-08-17 11:40:54,469::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/.lago_tmp/tmpcUFtj2/tmpA74IB1/python2-ovirtlib-0.0.1-0.20200817112919.git2082307.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,470::INFO::repoman.cmd::
2020-08-17 11:40:54,470::INFO::repoman.common.stores.RPM::Saving new added rpms into /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,470::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master/rpm/el7/SRPMS/ovirtlib-0.0.1-0.20200817112919.git2082307.el7.src.rpm
2020-08-17 11:40:54,472::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master/rpm/el7/noarch/python2-ovirtlib-0.0.1-0.20200817112919.git2082307.el7.noarch.rpm
2020-08-17 11:40:54,473::INFO::repoman.common.stores.RPM::
2020-08-17 11:40:54,473::INFO::repoman.common.stores.RPM::Updating metadata
2020-08-17 11:40:54,473::INFO::repoman.common.stores.RPM:: Creating metadata for el7
2020-08-17 11:40:54,797::INFO::repoman.common.stores.RPM::
2020-08-17 11:40:54,797::INFO::repoman.common.stores.RPM::Creating symlinks
2020-08-17 11:40:54,798::INFO::repoman.common.stores.RPM::
2020-08-17 11:40:54,798::INFO::repoman.common.stores.RPM::Saved /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,798::INFO::repoman.common.stores.iso::Saving new added isos into /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,798::INFO::repoman.common.stores.iso::
2020-08-17 11:40:54,799::INFO::repoman.common.stores.iso::Saved /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/master
2020-08-17 11:40:54,800::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/.lago_tmp/tmpcUFtj2/tmpA74IB1
2020-08-17 11:40:54,800::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1/.lago_tmp/tmpcUFtj2
Publishing to repo
+ echo 'Publishing to repo'
+ push_to_tested /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1 /srv/resources/repos/ovirt/tested
+ local pkg_src=/srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
+ local pkg_dst=/srv/resources/repos/ovirt/tested
+ iso_cleanup /srv/resources/repos/ovirt/tested
+ local max_file_age=8
+ local minimal_files_allowed=3
++ find /srv/resources/repos/ovirt/tested -mindepth 1 -maxdepth 1 -type d '!' -name '.*'
+ list_of_sub_dirs='/srv/resources/repos/ovirt/tested/master.under_testing
/srv/resources/repos/ovirt/tested/latest
/srv/resources/repos/ovirt/tested/4.3.under_testing
/srv/resources/repos/ovirt/tested/master
/srv/resources/repos/ovirt/tested/4.4
/srv/resources/repos/ovirt/tested/4.2
/srv/resources/repos/ovirt/tested/4.3'
+ test -z '/srv/resources/repos/ovirt/tested/master.under_testing
/srv/resources/repos/ovirt/tested/latest
/srv/resources/repos/ovirt/tested/4.3.under_testing
/srv/resources/repos/ovirt/tested/master
/srv/resources/repos/ovirt/tested/4.4
/srv/resources/repos/ovirt/tested/4.2
/srv/resources/repos/ovirt/tested/4.3'
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer ']'
+ isos_matching_policy=($(find ${full_path} -type f -name "*.iso" -mtime +${max_file_age}))
++ find /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer -type f -name '*.iso' -mtime +8
+ test -z /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer/4.4.0-2020051007/el8/ovirt-node-ng-installer-4.4.0-2020051007.el8.iso
+ all_isos=($(find ${full_path} -type f -name "*.iso"))
++ find /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer -type f -name '*.iso'
there are less than 3 isos under /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer
+ '[' 2 -lt 3 ']'
+ echo 'there are less than 3 isos under /srv/resources/repos/ovirt/tested/master.under_testing/iso/ovirt-node-ng-installer'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/latest/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/latest/iso/ovirt-node-ng-installer ']'
following path does not exist: /srv/resources/repos/ovirt/tested/latest/iso/ovirt-node-ng-installer
+ echo 'following path does not exist: /srv/resources/repos/ovirt/tested/latest/iso/ovirt-node-ng-installer'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer ']'
+ isos_matching_policy=($(find ${full_path} -type f -name "*.iso" -mtime +${max_file_age}))
++ find /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer -type f -name '*.iso' -mtime +8
+ test -z /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer/4.3.10-2020051108/el7/ovirt-node-ng-installer-4.3.10-2020051108.el7.iso
+ all_isos=($(find ${full_path} -type f -name "*.iso"))
++ find /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer -type f -name '*.iso'
there are less than 3 isos under /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer
+ '[' 2 -lt 3 ']'
+ echo 'there are less than 3 isos under /srv/resources/repos/ovirt/tested/4.3.under_testing/iso/ovirt-node-ng-installer'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/master/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/master/iso/ovirt-node-ng-installer ']'
+ isos_matching_policy=($(find ${full_path} -type f -name "*.iso" -mtime +${max_file_age}))
++ find /srv/resources/repos/ovirt/tested/master/iso/ovirt-node-ng-installer -type f -name '*.iso' -mtime +8
no iso files matching policy under /srv/resources/repos/ovirt/tested/master
following path does not exist: /srv/resources/repos/ovirt/tested/4.4/iso/ovirt-node-ng-installer
+ test -z ''
+ echo 'no iso files matching policy under /srv/resources/repos/ovirt/tested/master'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/4.4/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/4.4/iso/ovirt-node-ng-installer ']'
+ echo 'following path does not exist: /srv/resources/repos/ovirt/tested/4.4/iso/ovirt-node-ng-installer'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer ']'
+ isos_matching_policy=($(find ${full_path} -type f -name "*.iso" -mtime +${max_file_age}))
++ find /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer -type f -name '*.iso' -mtime +8
+ test -z /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer/4.2.0-2019091608/el7/ovirt-node-ng-installer-4.2.0-2019091608.el7.iso
+ all_isos=($(find ${full_path} -type f -name "*.iso"))
++ find /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer -type f -name '*.iso'
there are less than 3 isos under /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer
+ '[' 2 -lt 3 ']'
+ echo 'there are less than 3 isos under /srv/resources/repos/ovirt/tested/4.2/iso/ovirt-node-ng-installer'
+ continue
+ for dir in '${list_of_sub_dirs[@]}'
+ full_path=/srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer
+ '[' '!' -d /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer ']'
+ isos_matching_policy=($(find ${full_path} -type f -name "*.iso" -mtime +${max_file_age}))
++ find /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer -type f -name '*.iso' -mtime +8
+ test -z /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso
+ all_isos=($(find ${full_path} -type f -name "*.iso"))
++ find /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer -type f -name '*.iso'
+ '[' 10 -lt 3 ']'
+ isos_for_deletion_str=/srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso
+ all_isos_str='/srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081708/el7/ovirt-node-ng-installer-4.3.11-2020081708.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081208/el7/ovirt-node-ng-installer-4.3.11-2020081208.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081408/el7/ovirt-node-ng-installer-4.3.11-2020081408.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080908/el7/ovirt-node-ng-installer-4.3.11-2020080908.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081108/el7/ovirt-node-ng-installer-4.3.11-2020081108.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081308/el7/ovirt-node-ng-installer-4.3.11-2020081308.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081608/el7/ovirt-node-ng-installer-4.3.11-2020081608.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081008/el7/ovirt-node-ng-installer-4.3.11-2020081008.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081508/el7/ovirt-node-ng-installer-4.3.11-2020081508.el7.iso'
+ '[' /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso == '/srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081708/el7/ovirt-node-ng-installer-4.3.11-2020081708.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081208/el7/ovirt-node-ng-installer-4.3.11-2020081208.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081408/el7/ovirt-node-ng-installer-4.3.11-2020081408.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080908/el7/ovirt-node-ng-installer-4.3.11-2020080908.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081108/el7/ovirt-node-ng-installer-4.3.11-2020081108.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081308/el7/ovirt-node-ng-installer-4.3.11-2020081308.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081608/el7/ovirt-node-ng-installer-4.3.11-2020081608.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081008/el7/ovirt-node-ng-installer-4.3.11-2020081008.el7.iso /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020081508/el7/ovirt-node-ng-installer-4.3.11-2020081508.el7.iso' ']'
+ to_be_deleted=("${isos_matching_policy[@]}")
+ for file in '${to_be_deleted[@]}'
+ rm /srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso
rm: cannot remove ‘/srv/resources/repos/ovirt/tested/4.3/iso/ovirt-node-ng-installer/4.3.11-2020080808/el7/ovirt-node-ng-installer-4.3.11-2020080808.el7.iso’: No such file or directory
+ rm -rf /srv/resources/repos/ovirt/tested/.deploy.Sy5a26gDe1
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 28840 killed;
[ssh-agent] Stopped.
[oVirt Jenkins] ovirt-system-tests_basic-suite-4.3_nightly - Build # 287 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-4.3_nightly/
Build: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-4.3_nightly/...
Build Number: 287
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #287
[Michal Skrivanek] avoid ballooning crash in vm0 after memory hotplug
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 005_network_by_label.assign_labeled_network
Error Message:
False != True after 180 seconds
Stack Trace:
Traceback (most recent call last):
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 60, in wrapper
return func(get_test_prefix(), *args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 79, in wrapper
prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
File "/home/jenkins/agent/workspace/ovirt-system-tests_basic-suite-4.3_nightly/ovirt-system-tests/basic-suite-4.3/test-scenarios/005_network_by_label.py", line 149, in assign_labeled_network
host_service, LABELED_NET_NAME))
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 282, in assert_true_within_short
assert_equals_within_short(func, True, allowed_exceptions)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 266, in assert_equals_within_short
func, value, SHORT_TIMEOUT, allowed_exceptions=allowed_exceptions
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 252, in assert_equals_within
'%s != %s after %s seconds' % (res, value, timeout)
AssertionError: False != True after 180 seconds
[oVirt Jenkins] ovirt-system-tests_basic-suite-4.3_nightly - Build # 279 - Failure!
by jenkins@jenkins.phx.ovirt.org
Project: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-4.3_nightly/
Build: https://jenkins.ovirt.org/job/ovirt-system-tests_basic-suite-4.3_nightly/...
Build Number: 279
Build Status: Failure
Triggered By: Started by timer
-------------------------------------
Changes Since Last Success:
-------------------------------------
Changes for Build #279
[Michal Skrivanek] avoid ballooning crash in vm0 after memory hotplug
-----------------
Failed Tests:
-----------------
1 tests failed.
FAILED: 008_basic_ui_sanity.start_grid
Error Message:
could not get ip address of selenium hub. See previous messages for probable docker failure
-------------------- >> begin captured stdout << ---------------------
executing shell: docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
executing shell: docker rm -f grid_node_chrome grid_node_firefox selenium-hub
Error response from daemon: No such container: grid_node_chrome
Error response from daemon: No such container: grid_node_firefox
Error response from daemon: No such container: selenium-hub
executing shell: docker kill grid_node_chrome grid_node_firefox selenium-hub
Error response from daemon: Cannot kill container: grid_node_chrome: No such container: grid_node_chrome
Error response from daemon: Cannot kill container: grid_node_firefox: No such container: grid_node_firefox
Error response from daemon: Cannot kill container: selenium-hub: No such container: selenium-hub
executing shell: docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
executing shell: docker network rm grid
Error response from daemon: network grid not found
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
creating docker network for grid
executing shell: docker network create grid
4aa37e0f55a57e7fa995d5786c78e60116d2ea099a390774afe8defbab8f4c23
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
4aa37e0f55a5 grid bridge local
bd7765e6670f host host local
245daf9e104c none null local
starting hub
executing shell: docker run -d -p 4444:4444 --net grid --name selenium-hub selenium/hub:3.9.1-actinium
Unable to find image 'selenium/hub:3.9.1-actinium' locally
3.9.1-actinium: Pulling from selenium/hub
1be7f2b886e8: Pulling fs layer
6fbc4a21b806: Pulling fs layer
c71a6f8e1378: Pulling fs layer
4be3072e5a37: Pulling fs layer
06c6d2f59700: Pulling fs layer
edcd5e9f2f91: Pulling fs layer
4be3072e5a37: Waiting
0eeaf787f757: Pulling fs layer
06c6d2f59700: Waiting
c949dee5af7e: Pulling fs layer
df88a49b4162: Pulling fs layer
edcd5e9f2f91: Waiting
ce3c6f42fd24: Pulling fs layer
6c8e191c3abf: Pulling fs layer
df88a49b4162: Waiting
ce3c6f42fd24: Waiting
c9aca8b50247: Pulling fs layer
c9aca8b50247: Waiting
6c8e191c3abf: Waiting
c949dee5af7e: Waiting
0eeaf787f757: Waiting
/usr/bin/docker-current: error pulling image configuration: Get https://production.cloudflare.docker.com/registry-v2/docker/registry/v2/b...: dial tcp: lookup production.cloudflare.docker.com on 172.19.11.6:53: read udp 10.129.5.217:34274->172.19.11.6:53: i/o timeout.
See '/usr/bin/docker-current run --help'.
getting ip of hub
--------------------- >> end captured stdout << ----------------------
Stack Trace:
File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 142, in wrapped_test
test()
File "/home/jenkins/agent/workspace/ovirt-system-tests_basic-suite-4.3_nightly/ovirt-system-tests/basic-suite-4.3/test-scenarios/008_basic_ui_sanity.py", line 189, in start_grid
raise RuntimeError("could not get ip address of selenium hub. See previous messages for probable docker failure")
could not get ip address of selenium hub. See previous messages for probable docker failure
-------------------- >> begin captured stdout << ---------------------
executing shell: docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
executing shell: docker rm -f grid_node_chrome grid_node_firefox selenium-hub
Error response from daemon: No such container: grid_node_chrome
Error response from daemon: No such container: grid_node_firefox
Error response from daemon: No such container: selenium-hub
executing shell: docker kill grid_node_chrome grid_node_firefox selenium-hub
Error response from daemon: Cannot kill container: grid_node_chrome: No such container: grid_node_chrome
Error response from daemon: Cannot kill container: grid_node_firefox: No such container: grid_node_firefox
Error response from daemon: Cannot kill container: selenium-hub: No such container: selenium-hub
executing shell: docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
executing shell: docker network rm grid
Error response from daemon: network grid not found
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
bd7765e6670f host host local
245daf9e104c none null local
creating docker network for grid
executing shell: docker network create grid
4aa37e0f55a57e7fa995d5786c78e60116d2ea099a390774afe8defbab8f4c23
executing shell: docker network ls
NETWORK ID NAME DRIVER SCOPE
336b430c5db4 bridge bridge local
4aa37e0f55a5 grid bridge local
bd7765e6670f host host local
245daf9e104c none null local
starting hub
executing shell: docker run -d -p 4444:4444 --net grid --name selenium-hub selenium/hub:3.9.1-actinium
Unable to find image 'selenium/hub:3.9.1-actinium' locally
3.9.1-actinium: Pulling from selenium/hub
1be7f2b886e8: Pulling fs layer
6fbc4a21b806: Pulling fs layer
c71a6f8e1378: Pulling fs layer
4be3072e5a37: Pulling fs layer
06c6d2f59700: Pulling fs layer
edcd5e9f2f91: Pulling fs layer
4be3072e5a37: Waiting
0eeaf787f757: Pulling fs layer
06c6d2f59700: Waiting
c949dee5af7e: Pulling fs layer
df88a49b4162: Pulling fs layer
edcd5e9f2f91: Waiting
ce3c6f42fd24: Pulling fs layer
6c8e191c3abf: Pulling fs layer
df88a49b4162: Waiting
ce3c6f42fd24: Waiting
c9aca8b50247: Pulling fs layer
c9aca8b50247: Waiting
6c8e191c3abf: Waiting
c949dee5af7e: Waiting
0eeaf787f757: Waiting
/usr/bin/docker-current: error pulling image configuration: Get https://production.cloudflare.docker.com/registry-v2/docker/registry/v2/b...: dial tcp: lookup production.cloudflare.docker.com on 172.19.11.6:53: read udp 10.129.5.217:34274->172.19.11.6:53: i/o timeout.
See '/usr/bin/docker-current run --help'.
getting ip of hub
--------------------- >> end captured stdout << ----------------------
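The root cause here is a transient DNS timeout while pulling `selenium/hub:3.9.1-actinium`, and a single failed pull aborts start_grid. One hardening option would be to retry the pull a few times before giving up; a hypothetical sketch (not part of 008_basic_ui_sanity.py; the `run` parameter is injectable so the retry logic can be exercised without docker):

```python
import subprocess
import time


def pull_with_retries(image, run=None, attempts=3, delay=5):
    """Retry `docker pull` to ride out transient DNS/registry failures
    like the i/o timeout in the log above.

    run(image) must return a process exit code; by default it shells
    out to docker.
    """
    if run is None:
        run = lambda img: subprocess.call(['docker', 'pull', img])
    for attempt in range(attempts):
        if run(image) == 0:
            return True
        if attempt < attempts - 1:
            time.sleep(delay)
    return False
```

Only after a successful (or cached) pull would the test proceed to `docker run ... selenium-hub` and the hub-IP lookup that failed above.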