Build failed in Jenkins: ovirt_3.6_he-system-tests #787
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/787/>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on ovirt-srv22.phx.ovirt.org (fc24 phx integ-tests physical) in workspace <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url git://gerrit.ovirt.org/ovirt-system-tests.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from git://gerrit.ovirt.org/ovirt-system-tests.git
> git --version # timeout=10
> git -c core.askpass=true fetch --tags --progress git://gerrit.ovirt.org/ovirt-system-tests.git refs/heads/master --prune
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from git://gerrit.ovirt.org/ovirt-system-tests.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:766)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1022)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1053)
at org.jenkinsci.plugins.multiplescms.MultiSCM.checkout(MultiSCM.java:129)
at hudson.scm.SCM.checkout(SCM.java:485)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1269)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:607)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
at hudson.model.Run.execute(Run.java:1738)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:410)
Caused by: hudson.plugins.git.GitException: Command "git -c core.askpass=true fetch --tags --progress git://gerrit.ovirt.org/ovirt-system-tests.git refs/heads/master --prune" returned status code 128:
stdout:
stderr: fatal: unable to connect to gerrit.ovirt.org:
gerrit.ovirt.org[0: 107.22.212.69]: errno=Connection refused
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1640)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:1388)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.access$300(CliGitAPIImpl.java:62)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:313)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:152)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:145)
at hudson.remoting.UserRequest.perform(UserRequest.java:152)
at hudson.remoting.UserRequest.perform(UserRequest.java:50)
at hudson.remoting.Request$2.run(Request.java:332)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at ......remote call to ovirt-srv22.phx.ovirt.org(Native Method)
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1416)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:252)
at hudson.remoting.Channel.call(Channel.java:781)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:145)
at sun.reflect.GeneratedMethodAccessor452.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:131)
at com.sun.proxy.$Proxy60.execute(Unknown Source)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:764)
... 12 more
ERROR: null
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_he-system-tests] $ /bin/bash -xe /tmp/hudson6880587810211349529.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/787/artifact/expor...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/787/artifact/expor...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> ]]
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts
Heads up! Influence of our recent Mock/Proxy changes on Lago jobs
by Barak Korren
Hi infra team members!
As you may know, we've recently changed our proxied Mock configuration
so that the 'http_proxy' environment variable gets defined inside the
Mock environment. This was done in an effort to make 'pip', 'curl' and
'wget' commands go through our PHX proxy. As it turns out, this also
has an unforeseen influence on yum tools.
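For reference, defining the variable inside the Mock environment is typically done in the chroot config file; a minimal sketch follows. The file name and proxy URL here are assumptions for illustration, not our actual configuration.

```
# Hypothetical excerpt from a mock chroot config (e.g. /etc/mock/epel-7-x86_64.cfg).
# mock exposes 'config_opts' to its config files; entries in the 'environment'
# dict are injected into the build chroot's environment.
config_opts['environment']['http_proxy'] = 'http://proxy.phx.ovirt.org:3128'
```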
Now, when it comes to yum as it is used inside the mock environment,
the proxied configuration has long hard-wired it to use the proxy by
setting it in "yum.conf". So far, however, yum tools (such as
reposync) that brought their own configuration essentially bypassed
the "yum.conf" file and hence were not using the proxy.
Well, it now turns out that 'yum' and the derived tools also respect
the 'http_proxy' environment variable [1]:
10.2. Configuring Proxy Server Access for a Single User
To enable proxy access for a specific user, add the lines in the
example box below to the user's shell profile. For the default bash
shell, the profile is the file ~/.bash_profile. The settings below
enable yum to use the proxy server mycache.mydomain.com, connecting
to port 3128.
# The Web proxy server used by this account
http_proxy="http://mycache.mydomain.com:3128"
export http_proxy
This is generally a good thing, but it can lead to previously
unexpected consequences.
Case in point: the Lago job reposync failures of last Thursday (Dec 22nd, 2016).
The root cause behind the failures was that the
"ovirt-web-ui-0.1.0-4.el7.centos.x86_64.rpm" file was changed in the
"ovirt-master-snapshot-static" repo. Updating an RPM file without
changing its version or revision numbers breaks YUM's rules and makes
reposync choke. We already knew about this and actually had a
work-around in the Lago code [2].
When I came in on Thursday morning and saw reposync failing in all the
Lago jobs, I just assumed that our work-around had simply failed to
work. My assumption was reinforced by the fact that I was able to
reproduce the issue by running 'reposync' manually on the Lago hosts,
and also managed to rectify it by removing the offending file from the
reposync cache. I spent the next few hours chasing down failing jobs
and cleaning up the caches on the hosts they ran on. It took me a
while to figure out that I was seeing the problem (essentially, the
older version of the package file) reappear on the same hosts over and
over again!
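The manual clean-up I was doing can be sketched roughly as follows. This is a hypothetical illustration of the idea, not the actual Lago work-around; the cache directory and package glob are made up.

```shell
# Hypothetical sketch of the manual fix: drop any cached copies of the
# offending package so the next reposync run fetches it afresh.
# The cache directory and package glob used by callers are illustrative.
clean_reposync_cache() {
    local cache_dir="$1" package_glob="$2"
    # -print logs each file removed; -delete removes the stale copies
    find "$cache_dir" -name "$package_glob" -print -delete
}

# Example (illustrative path):
# clean_reposync_cache /var/cache/reposync 'ovirt-web-ui-*.rpm'
```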
Wondering how that could be, and after ensuring the older package file
was nowhere to be found on any of the repos the jobs were using, Gal
and I took a look at the Lago code to see if it could be causing the
issue. Imagine our puzzlement when we realized the work-around code
was doing _exactly_ what I was doing manually, and still somehow
managed to make the very issue it was designed to solve reappear!
Eventually the problem seemed to disappear on its own. Now, armed with
the knowledge above, I can provide a plausible explanation for what we
were seeing.
The difference between my manual executions of 'reposync' and the way
Lago was running it was that Lago was running within Mock, where
'http_proxy' was defined. What was probably happening is that reposync
kept getting the old RPM file from the proxy's cache while still
getting a newer yum metadata file.
Conclusion: the next time such an issue arises, we must make sure to
clear the PHX proxy cache. There is actually no need to clear the
cache on the Lago hosts themselves, because our work-around will
resolve the issue there. Longer term, we may configure the proxy not
to cache files coming from resources.ovirt.org.
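Assuming the PHX proxy is Squid-like and its ACLs permit the PURGE method (an assumption — I haven't checked its configuration), evicting a single stale object could look something like this; the proxy host and URL are illustrative.

```shell
# Hypothetical: ask a Squid-style proxy to drop its cached copy of one URL.
# Works only if the proxy's ACLs allow the PURGE method for this client;
# the proxy address and package URL passed by callers are illustrative.
purge_from_proxy() {
    local proxy="$1" url="$2"
    curl --silent --show-error --request PURGE --proxy "$proxy" "$url"
}

# Example (illustrative):
# purge_from_proxy http://proxy.phx.ovirt.org:3128 \
#     http://resources.ovirt.org/repos/ovirt/some-package.rpm
```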
[1]: https://www.centos.org/docs/5/html/yum/sn-yum-proxy-server.html
[2]: https://github.com/lago-project/lago/blob/master/ovirtlago/reposetup.py#L...
--
Barak Korren
bkorren@redhat.com
RHCE, RHCi, RHV-DevOps Team
https://ifireball.wordpress.com/
Build failed in Jenkins: ovirt_3.6_he-system-tests #785
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/785/changes>
Changes:
[Yaniv Kaul] Fixes and changes to storage tests
[Sandro Bonazzola] ovirt-appliance: exclude f23 node builders
[Eyal Edri] remove fc23 slaves for lago testing
[Yaniv Bronhaim] Add new vdsm branch ovirt-4.1 to automation checks
[Eyal Edri] std ci: add notification plugin to standard CI
[Sandro Bonazzola] ovirt-appliance: add check and manual jobs
[Eyal Edri] adding missing vdsm builds for ppc64le
[Ryan Barry] Add builders for ovirt-engine-appliance:ovirt-4.1-snapshot
------------------------------------------
[...truncated 931 lines...]
From 192.168.200.4 icmp_seq=99 Destination Host Unreachable
From 192.168.200.4 icmp_seq=100 Destination Host Unreachable
From 192.168.200.4 icmp_seq=101 Destination Host Unreachable
From 192.168.200.4 icmp_seq=102 Destination Host Unreachable
From 192.168.200.4 icmp_seq=103 Destination Host Unreachable
From 192.168.200.4 icmp_seq=104 Destination Host Unreachable
64 bytes from 192.168.200.99: icmp_seq=105 ttl=64 time=1.06 ms
64 bytes from 192.168.200.99: icmp_seq=106 ttl=64 time=0.769 ms
64 bytes from 192.168.200.99: icmp_seq=107 ttl=64 time=0.653 ms
64 bytes from 192.168.200.99: icmp_seq=108 ttl=64 time=0.684 ms
64 bytes from 192.168.200.99: icmp_seq=109 ttl=64 time=5.17 ms
64 bytes from 192.168.200.99: icmp_seq=110 ttl=64 time=1.70 ms
64 bytes from 192.168.200.99: icmp_seq=111 ttl=64 time=8.15 ms
64 bytes from 192.168.200.99: icmp_seq=112 ttl=64 time=0.762 ms
64 bytes from 192.168.200.99: icmp_seq=113 ttl=64 time=0.487 ms
64 bytes from 192.168.200.99: icmp_seq=114 ttl=64 time=0.901 ms
64 bytes from 192.168.200.99: icmp_seq=115 ttl=64 time=0.565 ms
64 bytes from 192.168.200.99: icmp_seq=116 ttl=64 time=7.10 ms
64 bytes from 192.168.200.99: icmp_seq=117 ttl=64 time=3.13 ms
64 bytes from 192.168.200.99: icmp_seq=118 ttl=64 time=7.50 ms
64 bytes from 192.168.200.99: icmp_seq=119 ttl=64 time=7.82 ms
64 bytes from 192.168.200.99: icmp_seq=120 ttl=64 time=3.79 ms
64 bytes from 192.168.200.99: icmp_seq=121 ttl=64 time=2.57 ms
64 bytes from 192.168.200.99: icmp_seq=122 ttl=64 time=8.06 ms
64 bytes from 192.168.200.99: icmp_seq=123 ttl=64 time=7.76 ms
64 bytes from 192.168.200.99: icmp_seq=124 ttl=64 time=12.8 ms
64 bytes from 192.168.200.99: icmp_seq=125 ttl=64 time=0.477 ms
64 bytes from 192.168.200.99: icmp_seq=126 ttl=64 time=7.97 ms
64 bytes from 192.168.200.99: icmp_seq=127 ttl=64 time=7.85 ms
64 bytes from 192.168.200.99: icmp_seq=128 ttl=64 time=0.735 ms
64 bytes from 192.168.200.99: icmp_seq=129 ttl=64 time=0.582 ms
64 bytes from 192.168.200.99: icmp_seq=130 ttl=64 time=4.02 ms
64 bytes from 192.168.200.99: icmp_seq=131 ttl=64 time=0.517 ms
64 bytes from 192.168.200.99: icmp_seq=132 ttl=64 time=11.0 ms
64 bytes from 192.168.200.99: icmp_seq=133 ttl=64 time=7.82 ms
64 bytes from 192.168.200.99: icmp_seq=134 ttl=64 time=8.90 ms
64 bytes from 192.168.200.99: icmp_seq=135 ttl=64 time=0.706 ms
64 bytes from 192.168.200.99: icmp_seq=136 ttl=64 time=4.23 ms
64 bytes from 192.168.200.99: icmp_seq=137 ttl=64 time=7.23 ms
64 bytes from 192.168.200.99: icmp_seq=138 ttl=64 time=0.562 ms
64 bytes from 192.168.200.99: icmp_seq=139 ttl=64 time=22.0 ms
64 bytes from 192.168.200.99: icmp_seq=140 ttl=64 time=7.85 ms
64 bytes from 192.168.200.99: icmp_seq=141 ttl=64 time=7.83 ms
64 bytes from 192.168.200.99: icmp_seq=142 ttl=64 time=1.08 ms
64 bytes from 192.168.200.99: icmp_seq=143 ttl=64 time=1.19 ms
64 bytes from 192.168.200.99: icmp_seq=144 ttl=64 time=0.712 ms
64 bytes from 192.168.200.99: icmp_seq=145 ttl=64 time=0.771 ms
64 bytes from 192.168.200.99: icmp_seq=146 ttl=64 time=0.738 ms
64 bytes from 192.168.200.99: icmp_seq=147 ttl=64 time=0.769 ms
64 bytes from 192.168.200.99: icmp_seq=148 ttl=64 time=8.01 ms
64 bytes from 192.168.200.99: icmp_seq=149 ttl=64 time=1.02 ms
64 bytes from 192.168.200.99: icmp_seq=150 ttl=64 time=0.808 ms
64 bytes from 192.168.200.99: icmp_seq=151 ttl=64 time=0.788 ms
64 bytes from 192.168.200.99: icmp_seq=152 ttl=64 time=0.848 ms
64 bytes from 192.168.200.99: icmp_seq=153 ttl=64 time=0.789 ms
64 bytes from 192.168.200.99: icmp_seq=154 ttl=64 time=0.861 ms
64 bytes from 192.168.200.99: icmp_seq=155 ttl=64 time=0.776 ms
64 bytes from 192.168.200.99: icmp_seq=156 ttl=64 time=2.93 ms
64 bytes from 192.168.200.99: icmp_seq=157 ttl=64 time=0.854 ms
64 bytes from 192.168.200.99: icmp_seq=158 ttl=64 time=0.655 ms
64 bytes from 192.168.200.99: icmp_seq=159 ttl=64 time=0.769 ms
64 bytes from 192.168.200.99: icmp_seq=160 ttl=64 time=0.927 ms
64 bytes from 192.168.200.99: icmp_seq=161 ttl=64 time=0.797 ms
64 bytes from 192.168.200.99: icmp_seq=162 ttl=64 time=0.757 ms
64 bytes from 192.168.200.99: icmp_seq=163 ttl=64 time=0.797 ms
64 bytes from 192.168.200.99: icmp_seq=164 ttl=64 time=0.835 ms
64 bytes from 192.168.200.99: icmp_seq=165 ttl=64 time=0.899 ms
64 bytes from 192.168.200.99: icmp_seq=166 ttl=64 time=0.802 ms
64 bytes from 192.168.200.99: icmp_seq=167 ttl=64 time=0.898 ms
64 bytes from 192.168.200.99: icmp_seq=168 ttl=64 time=0.839 ms
64 bytes from 192.168.200.99: icmp_seq=169 ttl=64 time=1.20 ms
64 bytes from 192.168.200.99: icmp_seq=170 ttl=64 time=0.718 ms
64 bytes from 192.168.200.99: icmp_seq=171 ttl=64 time=0.780 ms
64 bytes from 192.168.200.99: icmp_seq=172 ttl=64 time=0.715 ms
64 bytes from 192.168.200.99: icmp_seq=173 ttl=64 time=0.716 ms
64 bytes from 192.168.200.99: icmp_seq=174 ttl=64 time=0.694 ms
64 bytes from 192.168.200.99: icmp_seq=175 ttl=64 time=0.618 ms
64 bytes from 192.168.200.99: icmp_seq=176 ttl=64 time=0.689 ms
64 bytes from 192.168.200.99: icmp_seq=177 ttl=64 time=0.675 ms
64 bytes from 192.168.200.99: icmp_seq=178 ttl=64 time=0.692 ms
64 bytes from 192.168.200.99: icmp_seq=179 ttl=64 time=0.868 ms
64 bytes from 192.168.200.99: icmp_seq=180 ttl=64 time=0.726 ms
64 bytes from 192.168.200.99: icmp_seq=181 ttl=64 time=0.659 ms
64 bytes from 192.168.200.99: icmp_seq=182 ttl=64 time=0.698 ms
64 bytes from 192.168.200.99: icmp_seq=183 ttl=64 time=0.795 ms
64 bytes from 192.168.200.99: icmp_seq=184 ttl=64 time=0.893 ms
64 bytes from 192.168.200.99: icmp_seq=185 ttl=64 time=0.785 ms
64 bytes from 192.168.200.99: icmp_seq=186 ttl=64 time=0.824 ms
64 bytes from 192.168.200.99: icmp_seq=187 ttl=64 time=0.740 ms
64 bytes from 192.168.200.99: icmp_seq=188 ttl=64 time=0.740 ms
64 bytes from 192.168.200.99: icmp_seq=189 ttl=64 time=0.781 ms
64 bytes from 192.168.200.99: icmp_seq=190 ttl=64 time=0.767 ms
64 bytes from 192.168.200.99: icmp_seq=191 ttl=64 time=0.719 ms
64 bytes from 192.168.200.99: icmp_seq=192 ttl=64 time=0.632 ms
64 bytes from 192.168.200.99: icmp_seq=193 ttl=64 time=0.731 ms
64 bytes from 192.168.200.99: icmp_seq=194 ttl=64 time=0.606 ms
64 bytes from 192.168.200.99: icmp_seq=195 ttl=64 time=0.608 ms
64 bytes from 192.168.200.99: icmp_seq=196 ttl=64 time=0.545 ms
64 bytes from 192.168.200.99: icmp_seq=197 ttl=64 time=0.658 ms
64 bytes from 192.168.200.99: icmp_seq=198 ttl=64 time=0.625 ms
64 bytes from 192.168.200.99: icmp_seq=199 ttl=64 time=0.566 ms
64 bytes from 192.168.200.99: icmp_seq=200 ttl=64 time=0.676 ms
64 bytes from 192.168.200.99: icmp_seq=201 ttl=64 time=2.42 ms
64 bytes from 192.168.200.99: icmp_seq=202 ttl=64 time=0.624 ms
64 bytes from 192.168.200.99: icmp_seq=203 ttl=64 time=0.532 ms
64 bytes from 192.168.200.99: icmp_seq=204 ttl=64 time=0.581 ms
64 bytes from 192.168.200.99: icmp_seq=205 ttl=64 time=0.708 ms
64 bytes from 192.168.200.99: icmp_seq=206 ttl=64 time=0.582 ms
64 bytes from 192.168.200.99: icmp_seq=207 ttl=64 time=0.694 ms
64 bytes from 192.168.200.99: icmp_seq=208 ttl=64 time=0.737 ms
64 bytes from 192.168.200.99: icmp_seq=209 ttl=64 time=0.560 ms
64 bytes from 192.168.200.99: icmp_seq=210 ttl=64 time=0.549 ms
64 bytes from 192.168.200.99: icmp_seq=211 ttl=64 time=0.745 ms
64 bytes from 192.168.200.99: icmp_seq=212 ttl=64 time=0.660 ms
64 bytes from 192.168.200.99: icmp_seq=213 ttl=64 time=0.735 ms
64 bytes from 192.168.200.99: icmp_seq=214 ttl=64 time=0.556 ms
64 bytes from 192.168.200.99: icmp_seq=215 ttl=64 time=0.703 ms
64 bytes from 192.168.200.99: icmp_seq=216 ttl=64 time=0.666 ms
64 bytes from 192.168.200.99: icmp_seq=217 ttl=64 time=0.696 ms
64 bytes from 192.168.200.99: icmp_seq=218 ttl=64 time=0.762 ms
64 bytes from 192.168.200.99: icmp_seq=219 ttl=64 time=0.616 ms
64 bytes from 192.168.200.99: icmp_seq=220 ttl=64 time=0.628 ms
--- 192.168.200.99 ping statistics ---
220 packets transmitted, 116 received, +104 errors, 47% packet loss, time 219089ms
rtt min/avg/max/mdev = 0.477/2.170/22.036/3.284 ms, pipe 4
Warning: Permanently added '192.168.200.99' (ECDSA) to the list of known hosts.
/var/tmp: 30 GiB (32196378624 bytes) trimmed
[ INFO ] Stage: Initializing
[ INFO ] Generating a temporary VNC password.
[ INFO ] Stage: Environment setup
Configuration files: ['/root/hosted-engine-deploy-answers-file.conf']
Log file: /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20161222222459-9yovu9.log
Version: otopi-1.4.3_master (otopi-1.4.3-0.0.master.20160525142447.gitcf9c977.el7)
[ INFO ] Hardware supports virtualization
[ INFO ] Stage: Environment packages setup
[ INFO ] Stage: Programs detection
[ INFO ] Stage: Environment setup
[ INFO ] Waiting for VDSM hardware info
[ INFO ] Generating libvirt-spice certificates
[ INFO ] Stage: Environment customization
--== STORAGE CONFIGURATION ==--
During customization use CTRL-D to abort.
[ INFO ] Installing on additional host
--== SYSTEM CONFIGURATION ==--
[WARNING] A configuration file must be supplied to deploy Hosted Engine on an additional host.
[ INFO ] Answer file successfully loaded
--== NETWORK CONFIGURATION ==--
[ INFO ] Additional host deployment, firewall manager is 'iptables'
The following CPU types are supported by this host:
- model_Haswell-noTSX: Intel Haswell-noTSX Family
- model_SandyBridge: Intel SandyBridge Family
- model_Westmere: Intel Westmere Family
- model_Nehalem: Intel Nehalem Family
- model_Penryn: Intel Penryn Family
- model_Conroe: Intel Conroe Family
--== HOSTED ENGINE CONFIGURATION ==--
[ INFO ] Stage: Setup validation
[WARNING] Failed to resolve lago-he-basic-suite-3-6-host1.lago.local using DNS, it can be resolved only locally
--== CONFIGURATION PREVIEW ==--
Engine FQDN : hosted-engine.lago.local
Bridge name : ovirtmgmt
Host address : lago-he-basic-suite-3-6-host1.lago.local
SSH daemon port : 22
Firewall manager : iptables
Gateway address : 192.168.200.1
Host name for web application : lago-he-basic-suite-3-6-host1
Storage Domain type : nfs3
Host ID : 2
Image size GB : 10
GlusterFS Share Name : hosted_engine_glusterfs
GlusterFS Brick Provisioning : False
Storage connection : lago-he-basic-suite-3-6-storage:/exports/nfs_he
Console type : vnc
Memory size MB : 3171
MAC address : 00:16:3e:24:d3:63
Boot type : disk
Number of CPUs : 2
Restart engine VM after engine-setup: True
CPU Type : model_SandyBridge
[ INFO ] Stage: Transaction setup
[ INFO ] Stage: Misc configuration
[ INFO ] Stage: Package installation
[ INFO ] Stage: Misc configuration
[ INFO ] Configuring libvirt
[ INFO ] Configuring VDSM
[ INFO ] Starting vdsmd
[ INFO ] Waiting for VDSM hardware info
[ INFO ] Configuring VM
[ INFO ] Updating hosted-engine configuration
[ INFO ] Stage: Transaction commit
[ INFO ] Stage: Closing up
[ INFO ] Acquiring internal CA cert from the engine
[ INFO ] The following CA certificate is going to be used, please immediately interrupt if not correct:
[ INFO ] Issuer: C=US, O=lago.local, CN=hosted-engine.lago.local.54217, Subject: C=US, O=lago.local, CN=hosted-engine.lago.local.54217, Fingerprint (SHA-1): E9A33881A51A9C8CB5EA30C23728CAC2F2479C2A
[ INFO ] Connecting to the Engine
[ INFO ] Waiting for the host to become operational in the engine. This may take several minutes...
Build timed out (after 360 minutes). Marking the build as failed.
Build was aborted
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_he-system-tests] $ /bin/bash -xe /tmp/hudson9180790764453283670.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/785/artifact/expor...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/785/artifact/expor...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> ]]
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts
Build failed in Jenkins: ovirt_4.0_he-system-tests #603
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/603/changes>
Changes:
[Yaniv Kaul] Fixes and changes to storage tests
[Sandro Bonazzola] ovirt-hosted-engine-ha: added 4.1 branch
[Sandro Bonazzola] ovirt-release: move to 4.1 branch
[Sandro Bonazzola] cockpit-ovirt: exclude fc23 from 4.1
[Gil Shinar] Fix bugs in sdk yamls
------------------------------------------
[...truncated 1118 lines...]
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/sys
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/sys: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/sys
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/sys.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/sys.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/shm.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/dev/pts.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/cache/yum.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests.'>
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests.>
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/run/libvirt.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago: not mounted
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/var/lib/lago.
+ return 1
+ failed=true
+ for mount in '"${mounts[@]}"'
+ safe_umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
+ local mount=/var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
+ local attempt
+ (( attempt=0 ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems: mountpoint not found
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems: mountpoint not found
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
+ (( attempt++ ))
+ (( attempt < 3 ))
+ [[ attempt > 0 ]]
+ sleep 1s
+ sudo umount --lazy /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
umount: /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems: mountpoint not found
+ findmnt --kernel --first /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems
+ (( attempt++ ))
+ (( attempt < 3 ))
+ echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems.'
ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783/root/proc/filesystems.
+ return 1
+ failed=true
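[Editor's note: the repeated blocks above are all invocations of the same `safe_umount` retry loop. A minimal reconstruction is sketched below; it is hypothetical, the actual CI script may differ. One detail visible in the trace: the script sleeps even before attempt 0 (see the `[[ attempt > 0 ]]` line followed immediately by `sleep 1s` right after `attempt=0`), which suggests the guard compares the literal strings `attempt` and `0` and is always true. The arithmetic form below behaves as presumably intended.]

```shell
# Hypothetical reconstruction of the safe_umount retry loop from the trace:
# lazily unmount, then use findmnt to check whether anything is still
# mounted there; retry up to 3 times with a 1s pause between attempts.
safe_umount() {
    local mount="$1"
    local attempt
    for (( attempt = 0; attempt < 3; attempt++ )); do
        # The trace shows [[ attempt > 0 ]] (a string compare, always true),
        # so the original sleeps even before the first attempt; this fixes it.
        if (( attempt > 0 )); then
            sleep 1s
        fi
        # Tolerate "not mounted" errors, as the trace does.
        sudo umount --lazy "$mount" || true
        # findmnt exits non-zero once nothing is mounted at $mount.
        findmnt --kernel --first "$mount" > /dev/null || return 0
    done
    echo "ERROR: Failed to umount ${mount}."
    return 1
}
```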
+ for mock_root in '/var/lib/mock/*'
+ this_chroot_failed=false
+ mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r))
++ cut '-d ' -f2 /proc/mounts
++ grep /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783
++ sort -r
+ [[ -n '' ]]
+ false
+ sudo rm -rf /var/lib/mock/epel-7-x86_64-ed452d1ea0f6dadf50c66d3b3f6b3a6e-40783
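[Editor's note: the `mounts=($(cut ... | grep ... | sort -r))` assignment traced above collects every mountpoint still under the chroot, deepest first. A runnable sketch; the `mounts_file` parameter is added here purely so the pipeline can be exercised against a fake `/proc/mounts`.]

```shell
# Sketch of the mount-collection pipeline from the trace: field 2 of
# /proc/mounts is the mountpoint; grep keeps entries under the chroot, and
# sort -r orders children before their parents (a parent path is a strict
# prefix of its children, so reverse lexicographic order lists leaves first,
# letting the caller unmount nested mounts before the mounts containing them).
list_mounts_deepest_first() {
    local mock_root="$1"
    local mounts_file="${2:-/proc/mounts}"   # parameter added for testability
    cut -d' ' -f2 "$mounts_file" | grep -- "$mock_root" | sort -r
}
```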
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
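[Editor's note: the `find | xargs` pair above prunes mock cache directories untouched for more than two days: `-print0`/`-0` keeps unusual filenames safe, `-r` skips the `rm` entirely when nothing matches, and the `-t` seen in the trace merely echoes the command before running it. A sketch with the `sudo` dropped so it can run unprivileged:]

```shell
# Prune top-level cache directories whose mtime is more than 2 days old,
# mirroring the find/xargs step in the trace (sudo omitted so this sketch
# runs unprivileged; cache_dir is a hypothetical parameter standing in for
# /var/cache/mock/).
prune_stale_caches() {
    local cache_dir="$1"
    find "$cache_dir" -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 |
        xargs -0 -r rm -rf
}
```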
++ virsh list --all --uuid
+ true
+ echo 'Cleanup script failed, propagating failure to job'
Cleanup script failed, propagating failure to job
+ exit 1
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts