<div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Jan 10, 2017 at 3:14 PM, Evgheni Dereveanchin <span dir="ltr"><<a href="mailto:ederevea@redhat.com" target="_blank">ederevea@redhat.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Not sure what the initial problem was, but on my laptop (Haswell-MB)<br>
I always use the lowest possible CPU family to ensure it's using<br>
as few features as possible in nested VMs:<br></blockquote><div><br></div><div>;-)</div><div><br></div><div>I'm doing the exact opposite, for two reasons:</div><div>1. I want the best possible performance. Specifically, I'd like the tests to run as fast as possible.</div><div>2. I'd like to expose as many of the latest features up to the hosts (and VMs).</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<cpu mode='custom' match='exact'><br>
<model fallback='allow'>core2duo</<wbr>model><br>
<feature policy='require' name='vmx'/><br>
</cpu><br>
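A quick sanity check to go with the stanza above (a hedged sketch, not part of the original setup; assumes a Linux guest with /proc/cpuinfo) — run inside the nested VM to confirm the vmx flag required for another level of KVM is actually exposed:<br>

```shell
# Check, from inside the nested guest, whether the CPU model exposes
# the vmx flag that the <feature policy='require' name='vmx'/> line
# is supposed to guarantee. Assumes a Linux guest with /proc/cpuinfo.
flags=$(grep -m1 '^flags' /proc/cpuinfo | cut -d: -f2-)
nflags=$(echo "$flags" | wc -w)
echo "guest CPU exposes $nflags feature flags"
if echo "$flags" | grep -qw vmx; then
    echo "vmx present - nested KVM should work"
else
    echo "vmx missing - the guest cannot run KVM guests of its own"
fi
```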
<br>
Correspondingly, I use model_Conroe on the oVirt side and haven't had<br>
problems with it. Do we really need to use newer CPU families<br>
in our tests?<br></blockquote><div><br></div><div>We probably don't - we used to have Conroe hard-coded in the tests (until I changed it to use something different).</div><div><br></div><div>It does mean it'll be a bit challenging to run on AMD if we decide to go back to hard-code Conroe.</div><div>Y.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
Regards,<br>
Evgheni Dereveanchin<br>
<div class="HOEnZb"><div class="h5"><br>
----- Original Message -----<br>
From: "Milan Zamazal" <<a href="mailto:mzamazal@redhat.com">mzamazal@redhat.com</a>><br>
To: "Yaniv Kaul" <<a href="mailto:ykaul@redhat.com">ykaul@redhat.com</a>><br>
Cc: "Lev Veyde" <<a href="mailto:lveyde@redhat.com">lveyde@redhat.com</a>>, "Eyal Edri" <<a href="mailto:eedri@redhat.com">eedri@redhat.com</a>>, "Sandro Bonazzola" <<a href="mailto:sbonazzo@redhat.com">sbonazzo@redhat.com</a>>, "infra" <<a href="mailto:infra@ovirt.org">infra@ovirt.org</a>>, "Gal Ben Haim" <<a href="mailto:gbenhaim@redhat.com">gbenhaim@redhat.com</a>>, "Martin Polednik" <<a href="mailto:mpoledni@redhat.com">mpoledni@redhat.com</a>>, "Evgheni Dereveanchin" <<a href="mailto:ederevea@redhat.com">ederevea@redhat.com</a>><br>
Sent: Tuesday, 10 January, 2017 1:16:09 PM<br>
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627<br>
<br>
Yaniv Kaul <<a href="mailto:ykaul@redhat.com">ykaul@redhat.com</a>> writes:<br>
<br>
> On Tue, Jan 10, 2017 at 12:45 PM, Milan Zamazal <<a href="mailto:mzamazal@redhat.com">mzamazal@redhat.com</a>> wrote:<br>
><br>
>> Yaniv Kaul <<a href="mailto:ykaul@redhat.com">ykaul@redhat.com</a>> writes:<br>
>><br>
>> > On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <<a href="mailto:lveyde@redhat.com">lveyde@redhat.com</a>> wrote:<br>
>> ><br>
>> >> This patch is one that caused it probably:<br>
>> >> <a href="https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0" rel="noreferrer" target="_blank">https://github.com/lago-<wbr>project/lago/commit/<wbr>05ccf7240976f91b0c14d6a1f88016<wbr>376d5e87f0</a><br>
>> ><br>
>> ><br>
>> > +Milan.<br>
>><br>
>> +Martin<br>
>><br>
>> > I must confess that I did not like the patch to begin with...<br>
>> > I did not understand what real problem it solved, but Michal assured me<br>
>> > there was a real issue.<br>
>><br>
>> Yes, there was a real issue with nested virtualization. Some CPU flags<br>
>> are missing with Haswell and Lago doesn't run properly.<br>
>><br>
><br>
> Is this a libvirt bug btw?<br>
<br>
I'm not sure. When the sets of CPU flags on the host and in the VM with<br>
a copied host CPU are different, it's not clear what's the right thing<br>
to do.<br>
<br>
> Perhaps we need a switch to turn this feature on and off?<br>
<br>
I think it would be useful to be able to specify a particular<br>
CPU type in the Lago configuration.<br>
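As a sketch of where such an option could take its values from (assuming the libvirt client tools are installed; `virsh cpu-models` is standard libvirt, the rest is illustrative):<br>

```shell
# List the named x86_64 CPU models libvirt knows about, from which a
# model_<name> value could be chosen for a Lago/libvirt domain.
# Prints a note instead of failing where virsh is unavailable.
if command -v virsh >/dev/null 2>&1; then
    models=$(virsh cpu-models x86_64 2>/dev/null || echo "(query failed)")
else
    models="(virsh not installed)"
fi
echo "$models"
```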
<br>
>> > I now have Engine with Java at 100% CPU - I hope it's unrelated<br>
>> > to this as well.<br>
>> ><br>
>> > I suggest we do a survey to see who doesn't have SandyBridge and above,<br>
>> > and perhaps move higher than Westmere.<br>
>><br>
>> We've got Westmere servers in the Brno lab.<br>
>><br>
><br>
> Do we know the scope of the problem? Does it happen only on Westmere, for<br>
> example?<br>
<br>
The problem was with Haswell-noTSX (on my Lenovo, but I think Martin has<br>
observed the same problem too). We don't know the scope of the problem,<br>
but if we want to be able to run Lago on Brno servers then we must be<br>
Westmere compatible.<br>
<br>
> Y.<br>
><br>
><br>
>> > What do we have in CI?<br>
>> > Y.<br>
>> ><br>
>> ><br>
>> >><br>
>> >> Thanks in advance,<br>
>> >> Lev Veyde.<br>
>> >><br>
>> >> ----- Original Message -----<br>
>> >> From: "Lev Veyde" <<a href="mailto:lveyde@redhat.com">lveyde@redhat.com</a>><br>
>> >> To: "Eyal Edri" <<a href="mailto:eedri@redhat.com">eedri@redhat.com</a>>, <a href="mailto:sbonazzo@redhat.com">sbonazzo@redhat.com</a><br>
>> >> Cc: <a href="mailto:infra@ovirt.org">infra@ovirt.org</a>, "Gal Ben Haim" <<a href="mailto:gbenhaim@redhat.com">gbenhaim@redhat.com</a>><br>
>> >> Sent: Tuesday, January 10, 2017 11:50:05 AM<br>
>> >> Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627<br>
>> >><br>
>> >> Hi,<br>
>> >><br>
>> >> Checked the logs and see the following:<br>
>> >><br>
>> >> 02:42:05 [WARNING] OVF does not contain a valid image description, using<br>
>> >> default.<br>
>> >> 02:42:05 The following CPU types are supported by this host:<br>
>> >> 02:42:05 - model_Westmere: Intel Westmere Family<br>
>> >> 02:42:05 - model_Nehalem: Intel Nehalem Family<br>
>> >> 02:42:05 - model_Penryn: Intel Penryn Family<br>
>> >> 02:42:05 - model_Conroe: Intel Conroe Family<br>
>> >> 02:42:05 [ ERROR ] Failed to execute stage 'Environment customization':<br>
>> >> Invalid CPU type specified: model_SandyBridge<br>
>> >><br>
>> >> Barak thinks that it may be related to the recent update in the Lago code.<br>
>> >><br>
>> >> Gal, any idea ?<br>
>> >><br>
>> >> Thanks in advance,<br>
>> >> Lev Veyde.<br>
>> >><br>
>> >> ----- Original Message -----<br>
>> >> From: <a href="mailto:jenkins@jenkins.phx.ovirt.org">jenkins@jenkins.phx.ovirt.org</a><br>
>> >> To: <a href="mailto:sbonazzo@redhat.com">sbonazzo@redhat.com</a>, <a href="mailto:infra@ovirt.org">infra@ovirt.org</a>, <a href="mailto:lveyde@redhat.com">lveyde@redhat.com</a><br>
>> >> Sent: Tuesday, January 10, 2017 4:42:14 AM<br>
>> >> Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627<br>
>> >><br>
>> >> See <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/627/<wbr>changes</a><br>
>> ><br>
>> >><br>
>> >> Changes:<br>
>> >><br>
>> >> [Lev Veyde] Mask NetworkManager service<br>
>> >><br>
>> >> [Eyal Edri] fix imgbased job names in jjb<br>
>> >><br>
>> >> [Daniel Belenky] fixing jjb version for cockpit-ovirt<br>
>> >><br>
>> >> [Gil Shinar] Add some more 4.1 to experimental<br>
>> >><br>
>> >> [Juan Hernandez] Don't build RPMs for the JBoss modules Maven plugin<br>
>> >><br>
>> >> [pkliczewski] jsonrpc 4.1 branch<br>
>> >><br>
>> >> ------------------------------<wbr>------------<br>
>> >> [...truncated 749 lines...]<br>
>> >> Finish: shell<br>
>> >> @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@<wbr>@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@<wbr>@@<br>
>> >> @@ Tue Jan 10 02:42:07 UTC 2017 automation/<a href="http://he_basic_suite_4.0.sh" rel="noreferrer" target="_blank">he_basic_suite_4.0.<wbr>sh</a> chroot<br>
>> >> finished<br>
>> >> @@ took 360 seconds<br>
>> >> @@ rc = 1<br>
>> >> @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@<wbr>@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@<wbr>@@<br>
>> >> ========== Scrubbing chroot<br>
>> >> mock \<br>
>> >> --configdir="<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-</a><br>
>> >> system-tests/ws/ovirt-system-<wbr>tests"> \<br>
>> >> --root="mocker-epel-7-x86_64.<wbr>el7" \<br>
>> >> --resultdir="./mock_logs.<wbr>xGGwEk6V/mocker-epel-7-x86_64.<br>
>> el7.scrub"<br>
>> >> \<br>
>> >> --scrub=chroot<br>
>> >> WARNING: Could not find required logging config file: <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/logging.<wbr>ini.> Using default...<br>
>> >> INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...<br>
>> >> Start: init plugins<br>
>> >> INFO: selinux enabled<br>
>> >> Finish: init plugins<br>
>> >> Start: run<br>
>> >> Start: scrub ['chroot']<br>
>> >> INFO: scrubbing chroot for mocker-epel-7-x86_64.el7<br>
>> >> Finish: scrub ['chroot']<br>
>> >> Finish: run<br>
>> >> Scrub chroot took 6 seconds<br>
>> >> ============================<br>
>> >> ##############################<wbr>############################<br>
>> >> ## Tue Jan 10 02:42:13 UTC 2017 Finished env: el7:epel-7-x86_64<br>
>> >> ## took 366 seconds<br>
>> >> ## rc = 1<br>
>> >> ##############################<wbr>############################<br>
>> >> find: ‘logs’: No such file or directory<br>
>> >> No log files found, check command output<br>
>> >> ##!###########################<wbr>#############################<br>
>> >> Collecting mock logs<br>
>> >> ‘./mock_logs.xGGwEk6V/mocker-<wbr>epel-7-x86_64.el7.clean_rpmdb’ -><br>
>> >> ‘exported-artifacts/mock_logs/<wbr>mocker-epel-7-x86_64.el7.<wbr>clean_rpmdb’<br>
>> >> ‘./mock_logs.xGGwEk6V/<a href="http://mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh" rel="noreferrer" target="_blank">mocker-<wbr>epel-7-x86_64.el7.he_basic_<wbr>suite_4.0.sh</a>’<br>
>> -><br>
>> >> ‘exported-artifacts/mock_logs/<wbr>mocker-epel-7-x86_64.el7.he_<br>
>> >> <a href="http://basic_suite_4.0.sh" rel="noreferrer" target="_blank">basic_suite_4.0.sh</a>’<br>
>> >> ‘./mock_logs.xGGwEk6V/mocker-<wbr>epel-7-x86_64.el7.init’ -><br>
>> >> ‘exported-artifacts/mock_logs/<wbr>mocker-epel-7-x86_64.el7.init’<br>
>> >> ##############################<wbr>############################<br>
>> >> Build step 'Execute shell' marked build as failure<br>
>> >> Performing Post build task...<br>
>> >> Match found for :.* : True<br>
>> >> Logical operation result is TRUE<br>
>> >> Running script : #!/bin/bash -xe<br>
>> >> echo 'shell_scripts/<a href="http://system_tests.collect_logs.sh" rel="noreferrer" target="_blank">system_tests.<wbr>collect_logs.sh</a>'<br>
>> >><br>
>> >> #<br>
>> >> # Required jjb vars:<br>
>> >> # version<br>
>> >> #<br>
>> >> VERSION=4.0<br>
>> >> SUITE_TYPE=<br>
>> >><br>
>> >> WORKSPACE="$PWD"<br>
>> >> OVIRT_SUITE="$SUITE_TYPE_<wbr>suite_$VERSION"<br>
>> >> TESTS_LOGS="$WORKSPACE/ovirt-<wbr>system-tests/exported-<wbr>artifacts"<br>
>> >><br>
>> >> rm -rf "$WORKSPACE/exported-<wbr>artifacts"<br>
>> >> mkdir -p "$WORKSPACE/exported-<wbr>artifacts"<br>
>> >><br>
>> >> if [[ -d "$TESTS_LOGS" ]]; then<br>
>> >> mv "$TESTS_LOGS/"* "$WORKSPACE/exported-<wbr>artifacts/"<br>
>> >> fi<br>
>> >><br>
>> >> [ovirt_4.0_he-system-tests] $ /bin/bash -xe<br>
>> /tmp/hudson302101162661598371.<br>
>> >> sh<br>
>> >> + echo shell_scripts/<a href="http://system_tests.collect_logs.sh" rel="noreferrer" target="_blank">system_tests.<wbr>collect_logs.sh</a><br>
>> >> shell_scripts/<a href="http://system_tests.collect_logs.sh" rel="noreferrer" target="_blank">system_tests.<wbr>collect_logs.sh</a><br>
>> >> + VERSION=4.0<br>
>> >> + SUITE_TYPE=<br>
>> >> + WORKSPACE=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-<wbr>system-tests/ws/</a><br>
>> ><br>
>> >> + OVIRT_SUITE=4.0<br>
>> >> + TESTS_LOGS=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-</a><br>
>> >> system-tests/ws/ovirt-system-<wbr>tests/exported-artifacts><br>
>> >> + rm -rf <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/627/</a><br>
>> >> artifact/exported-artifacts><br>
>> >> + mkdir -p <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/627/</a><br>
>> >> artifact/exported-artifacts><br>
>> >> + [[ -d <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/exported-<wbr>artifacts> ]]<br>
>> >> + mv <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/exported-<wbr>artifacts/failure_msg.txt> <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/exported-<wbr>artifacts/lago_logs> <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/exported-<wbr>artifacts/mock_logs> <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/627/</a><br>
>> >> artifact/exported-artifacts/><br>
>> >> POST BUILD TASK : SUCCESS<br>
>> >> END OF POST BUILD TASK : 0<br>
>> >> Match found for :.* : True<br>
>> >> Logical operation result is TRUE<br>
>> >> Running script : #!/bin/bash -x<br>
>> >> echo "shell-scripts/mock_cleanup.<wbr>sh"<br>
>> >> # Make clear this is the cleanup, helps reading the jenkins logs<br>
>> >> cat <<EOC<br>
>> >> ______________________________<wbr>______________________________<wbr>___________<br>
>> >> ##############################<wbr>##############################<wbr>###########<br>
>> >> # #<br>
>> >> # CLEANUP #<br>
>> >> # #<br>
>> >> ##############################<wbr>##############################<wbr>###########<br>
>> >> EOC<br>
>> >><br>
>> >> shopt -s nullglob<br>
>> >><br>
>> >> WORKSPACE="${WORKSPACE:-$PWD}"<br>
>> >> UMOUNT_RETRIES="${UMOUNT_<wbr>RETRIES:-3}"<br>
>> >> UMOUNT_RETRY_DELAY="${UMOUNT_<wbr>RETRY_DELAY:-1s}"<br>
>> >><br>
>> >> safe_umount() {<br>
>> >> local mount="${1:?}"<br>
>> >> local attempt<br>
>> >> for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do<br>
>> >> # If this is not the 1st time through the loop, sleep a while<br>
>> >> # to let the problem "solve itself"<br>
>> >> (( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"<br>
>> >> # Try to umount<br>
>> >> sudo umount --lazy "$mount" && return 0<br>
>> >> # See if the mount is already not there despite failing<br>
>> >> findmnt --kernel --first "$mount" > /dev/null || return 0<br>
>> >> done<br>
>> >> echo "ERROR: Failed to umount $mount."<br>
>> >> return 1<br>
>> >> }<br>
>> >><br>
>> >> # restore the permissions in the working dir, as sometimes it leaves<br>
>> files<br>
>> >> # owned by root and then the 'cleanup workspace' from jenkins job fails<br>
>> to<br>
>> >> # clean and breaks the jobs<br>
>> >> sudo chown -R "$USER" "$WORKSPACE"<br>
>> >><br>
>> >> # stop any processes running inside the chroot<br>
>> >> failed=false<br>
>> >> mock_confs=("$WORKSPACE"/*/<wbr>mocker*)<br>
>> >> # Clean current jobs mockroot if any<br>
>> >> for mock_conf_file in "${mock_confs[@]}"; do<br>
>> >> [[ "$mock_conf_file" ]] || continue<br>
>> >> echo "Cleaning up mock $mock_conf_file"<br>
>> >> mock_root="${mock_conf_file##*<wbr>/}"<br>
>> >> mock_root="${mock_root%.*}"<br>
>> >> my_mock="/usr/bin/mock"<br>
>> >> my_mock+=" --configdir=${mock_conf_file%/<wbr>*}"<br>
>> >> my_mock+=" --root=${mock_root}"<br>
>> >> my_mock+=" --resultdir=$WORKSPACE"<br>
>> >><br>
>> >> # TODO: investigate why mock --clean fails to umount certain dirs<br>
>> >> # sometimes, so we can use it instead of manually doing all this.<br>
>> >> echo "Killing all mock orphan processes, if any."<br>
>> >> $my_mock \<br>
>> >> --orphanskill \<br>
>> >> || {<br>
>> >> echo "ERROR: Failed to kill orphans on $mock_root."<br>
>> >> failed=true<br>
>> >> }<br>
>> >><br>
>> >> mock_root="$(\<br>
>> >> grep \<br>
>> >> -Po "(?<=config_opts\['root'\] = ')[^']*" \<br>
>> >> "$mock_conf_file" \<br>
>> >> )" || :<br>
>> >> [[ "$mock_root" ]] || continue<br>
>> >> mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :<br>
>> >> if [[ "$mounts" ]]; then<br>
>> >> echo "Found mounted dirs inside the chroot $mock_root." \<br>
>> >> "Trying to umount."<br>
>> >> fi<br>
>> >> for mount in "${mounts[@]}"; do<br>
>> >> safe_umount "$mount" || failed=true<br>
>> >> done<br>
>> >> done<br>
>> >><br>
>> >> # Clean any leftover chroot from other jobs<br>
>> >> for mock_root in /var/lib/mock/*; do<br>
>> >> this_chroot_failed=false<br>
>> >> mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r))<br>
>> ||<br>
>> >> :<br>
>> >> if [[ "$mounts" ]]; then<br>
>> >> echo "Found mounted dirs inside the chroot $mock_root." \<br>
>> >> "Trying to umount."<br>
>> >> fi<br>
>> >> for mount in "${mounts[@]}"; do<br>
>> >> safe_umount "$mount" && continue<br>
>> >> # If we got here, we failed $UMOUNT_RETRIES attempts, so we<br>
>> >> # should make noise<br>
>> >> failed=true<br>
>> >> this_chroot_failed=true<br>
>> >> done<br>
>> >> if ! $this_chroot_failed; then<br>
>> >> sudo rm -rf "$mock_root"<br>
>> >> fi<br>
>> >> done<br>
>> >><br>
>> >> # remove mock caches that are older than 2 days:<br>
>> >> find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \<br>
>> >> xargs -0 -tr sudo rm -rf<br>
>> >> # We make no effort to leave around caches that may still be in use<br>
>> >> # because packages installed in them may go out of date, so we may<br>
>> >> # as well recreate them<br>
>> >><br>
>> >> # Drop all left over libvirt domains<br>
>> >> for UUID in $(virsh list --all --uuid); do<br>
>> >> virsh destroy $UUID || :<br>
>> >> sleep 2<br>
>> >> virsh undefine --remove-all-storage --storage vda \<br>
>> >> --snapshots-metadata $UUID || :<br>
>> >> done<br>
>> >><br>
>> >> if $failed; then<br>
>> >> echo "Cleanup script failed, propagating failure to job"<br>
>> >> exit 1<br>
>> >> fi<br>
>> >><br>
>> >> [ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/<br>
>> >> hudson1888216492513466503.sh<br>
>> >> + echo shell-scripts/mock_cleanup.sh<br>
>> >> shell-scripts/mock_cleanup.sh<br>
>> >> + cat<br>
>> >> ______________________________<wbr>______________________________<wbr>___________<br>
>> >> ##############################<wbr>##############################<wbr>###########<br>
>> >> # #<br>
>> >> # CLEANUP #<br>
>> >> # #<br>
>> >> ##############################<wbr>##############################<wbr>###########<br>
>> >> + shopt -s nullglob<br>
>> >> + WORKSPACE=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-<wbr>system-tests/ws/</a><br>
>> ><br>
>> >> + UMOUNT_RETRIES=3<br>
>> >> + UMOUNT_RETRY_DELAY=1s<br>
>> >> + sudo chown -R jenkins <<a href="http://jenkins.ovirt.org/job/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/</a><br>
>> >> ovirt_4.0_he-system-tests/ws/><br>
>> >> + failed=false<br>
>> >> + mock_confs=("$WORKSPACE"/*/<wbr>mocker*)<br>
>> >> + for mock_conf_file in '"${mock_confs[@]}"'<br>
>> >> + [[ -n <<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/mocker-<wbr>epel-7-x86_64.el7.cfg> ]]<br>
>> >> + echo 'Cleaning up mock '<br>
>> >> Cleaning up mock<br>
>> >> + mock_root=mocker-epel-7-x86_<wbr>64.el7.cfg<br>
>> >> + mock_root=mocker-epel-7-x86_<wbr>64.el7<br>
>> >> + my_mock=/usr/bin/mock<br>
>> >> + my_mock+=' --configdir=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-</a><br>
>> >> system-tests/ws/ovirt-system-<wbr>tests'><br>
>> >> + my_mock+=' --root=mocker-epel-7-x86_64.<wbr>el7'<br>
>> >> + my_mock+=' --resultdir=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-</a><br>
>> >> system-tests/ws/'><br>
>> >> + echo 'Killing all mock orphan processes, if any.'<br>
>> >> Killing all mock orphan processes, if any.<br>
>> >> + /usr/bin/mock --configdir=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-</a><br>
>> >> system-tests/ws/ovirt-system-<wbr>tests> --root=mocker-epel-7-x86_64.<wbr>el7<br>
>> >> --resultdir=<<a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.<wbr>ovirt.org/job/ovirt_4.0_he-<wbr>system-tests/ws/</a><br>
>> ><br>
>> >> --orphanskill<br>
>> >> WARNING: Could not find required logging config file: <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/logging.<wbr>ini.> Using default...<br>
>> >> INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...<br>
>> >> Start: init plugins<br>
>> >> INFO: selinux enabled<br>
>> >> Finish: init plugins<br>
>> >> Start: run<br>
>> >> Finish: run<br>
>> >> ++ grep -Po '(?<=config_opts\['\''root'\''<wbr>\] = '\'')[^'\'']*' <<br>
>> >> <a href="http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt_4.0_he-system-tests/ws/</a><br>
>> >> ovirt-system-tests/mocker-<wbr>epel-7-x86_64.el7.cfg><br>
>> >> + mock_root=epel-7-x86_64-<wbr>6f628e6dc1a827c86d5e1bd9d3b3d3<wbr>8b<br>
>> >> + [[ -n epel-7-x86_64-<wbr>6f628e6dc1a827c86d5e1bd9d3b3d3<wbr>8b ]]<br>
>> >> + mounts=($(mount | awk '{print $3}' | grep "$mock_root"))<br>
>> >> ++ mount<br>
>> >> ++ awk '{print $3}'<br>
>> >> ++ grep epel-7-x86_64-<wbr>6f628e6dc1a827c86d5e1bd9d3b3d3<wbr>8b<br>
>> >> + :<br>
>> >> + [[ -n '' ]]<br>
>> >> + find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2<br>
>> -print0<br>
>> >> + xargs -0 -tr sudo rm -rf<br>
>> >> ++ virsh list --all --uuid<br>
>> >> + false<br>
>> >> POST BUILD TASK : SUCCESS<br>
>> >> END OF POST BUILD TASK : 1<br>
>> >> Recording test results<br>
>> >> ERROR: Step ‘Publish JUnit test result report’ failed: No test report<br>
>> >> files were found. Configuration error?<br>
>> >> Archiving artifacts<br>
>> >><br>
>><br>
</div></div></blockquote></div><br></div></div>