Build failed in Jenkins: ovirt_4.0_he-system-tests #627

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes>

Changes:

[Lev Veyde] Mask NetworkManager service
[Eyal Edri] fix imgbased job names in jjb
[Daniel Belenky] fixing jjb version for cockpit-ovirt
[Gil Shinar] Add some more 4.1 to experimental
[Juan Hernandez] Don't build RPMs for the JBoss modules Maven plugin
[pkliczewski] jsonrpc 4.1 branch

------------------------------------------
[...truncated 749 lines...]
Finish: shell
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@ Tue Jan 10 02:42:07 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished
@@      took 360 seconds
@@      rc = 1
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
==========
Scrubbing chroot
mock \
    --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests"> \
    --root="mocker-epel-7-x86_64.el7" \
    --resultdir="./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.scrub" \
    --scrub=chroot
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Start: scrub ['chroot']
INFO: scrubbing chroot for mocker-epel-7-x86_64.el7
Finish: scrub ['chroot']
Finish: run
Scrub chroot took 6 seconds
============================
##########################################################
## Tue Jan 10 02:42:13 UTC 2017 Finished env: el7:epel-7-x86_64
##      took 366 seconds
##      rc = 1
##########################################################
find: ‘logs’: No such file or directory
No log files found, check command output
##!########################################################
Collecting mock logs
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'

#
# Required jjb vars:
#    version
#
VERSION=4.0
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson302101162661598371.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
#                                                                     #
#                             CLEANUP                                 #
#                                                                     #
#######################################################################
EOC
shopt -s nullglob

WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, Sleep a while to let
        # the problem "solve itself"
        [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already not there despite failing
        findmnt --kernel --first "$mount" > /dev/null && return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}

# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\  -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should make
        # noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

# remove mock caches that are older then 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them

# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy $UUID || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done

if $failed; then
    echo "Cleanup script failed, propegating failure to job"
    exit 1
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson1888216492513466503.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                             CLEANUP                                 #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n
<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts
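[Editor's note] One nit in the cleanup script quoted above (the transcript is kept verbatim, since it records what actually ran): the retry-delay guard `[[ attempt > 0 ]]` inside `safe_umount` is a lexical string comparison, not arithmetic, and the bare word `attempt` is never expanded to its numeric value, so the guard is always true and the script sleeps even before the first attempt. A short demonstration of the pitfall and the arithmetic form that was presumably intended:

```shell
#!/bin/bash
# Inside [[ ]], '>' compares strings lexicographically, so "10" sorts
# before "9" (because '1' < '9'):
[[ 10 > 9 ]] && echo "lexical: 10 > 9" || echo "lexical: 10 < 9"   # prints "lexical: 10 < 9"

# (( )) evaluates numerically, and also expands bare variable names:
(( 10 > 9 )) && echo "arithmetic: 10 > 9"                          # prints "arithmetic: 10 > 9"

# Corrected shape of the retry-delay guard: skip the delay only on attempt 0.
for ((attempt = 0; attempt < 3; attempt++)); do
    (( attempt > 0 )) && echo "would sleep before retry #$attempt"
done
```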

Hi,

Checked the logs and see the following:

02:42:05 [WARNING] OVF does not contain a valid image description, using default.
02:42:05 The following CPU types are supported by this host:
02:42:05    - model_Westmere: Intel Westmere Family
02:42:05    - model_Nehalem: Intel Nehalem Family
02:42:05    - model_Penryn: Intel Penryn Family
02:42:05    - model_Conroe: Intel Conroe Family
02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge

Barak thinks that it may be related to the recent update in the Lago code.

Gal, any idea?

Thanks in advance,
Lev Veyde.

----- Original Message -----
From: jenkins@jenkins.phx.ovirt.org
To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com
Sent: Tuesday, January 10, 2017 4:42:14 AM
Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
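[Editor's note] The failure mode in that excerpt is mechanical: hosted-engine setup compares the requested CPU type against the list of models the host reports, and the host here only goes up to Westmere. A minimal sketch of that check (illustrative only, not the actual ovirt-hosted-engine-setup code; the variable and function names are invented):

```shell
#!/bin/bash
# Hypothetical re-creation of the failing validation: the requested model
# must appear, exactly, in the host's space-separated supported list.
supported="model_Westmere model_Nehalem model_Penryn model_Conroe"
requested="model_SandyBridge"

cpu_model_supported() {
    # exact, space-delimited match against the supported list
    [[ " $supported " == *" $1 "* ]]
}

if cpu_model_supported "$requested"; then
    echo "CPU type $requested accepted"
else
    # this is the branch hit in the log above
    echo "Invalid CPU type specified: $requested"
fi
```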

This patch is probably the one that caused it:
https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f8801637...

Thanks in advance,
Lev Veyde.

----- Original Message -----
From: "Lev Veyde" <lveyde@redhat.com>
To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com
Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com>
Sent: Tuesday, January 10, 2017 11:50:05 AM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627

On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is probably the one that caused it:
https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0
+Milan.

I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue. I now have an Engine with a Java process at 100% CPU - I hope it's unrelated to this as well. I suggest we run a survey to see who doesn't have SandyBridge or above, and perhaps move higher than Westmere. What do we have in CI?
Y.
Thanks in advance, Lev Veyde.
----- Original Message -----
From: "Lev Veyde" <lveyde@redhat.com>
To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com
Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com>
Sent: Tuesday, January 10, 2017 11:50:05 AM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Hi,
Checked the logs and see the following:
02:42:05 [WARNING] OVF does not contain a valid image description, using default.
02:42:05 The following CPU types are supported by this host:
02:42:05    - model_Westmere: Intel Westmere Family
02:42:05    - model_Nehalem: Intel Nehalem Family
02:42:05    - model_Penryn: Intel Penryn Family
02:42:05    - model_Conroe: Intel Conroe Family
02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge
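(For anyone triaging the same failure: the snippet below is only an illustrative sketch, not part of the job. It shows how the model that Lago requested can be checked against the list that hosted-engine-setup printed; the supported list is copied from the log above, and model_SandyBridge is the model from the error.)

```shell
#!/bin/bash
# Illustrative check: is the requested CPU model in the host's supported list?
supported='- model_Westmere: Intel Westmere Family
- model_Nehalem: Intel Nehalem Family
- model_Penryn: Intel Penryn Family
- model_Conroe: Intel Conroe Family'

requested='model_SandyBridge'
# Each supported model appears as a "- model_X:" bullet in the setup output.
if grep -q "^- ${requested}:" <<< "$supported"; then
    echo "${requested} is supported on this host"
else
    echo "${requested} is NOT supported on this host"
fi
```

On this host's list the check prints "model_SandyBridge is NOT supported on this host", which is exactly the condition the deploy stage trips over.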
Barak thinks that it may be related to the recent update in the Lago code.
Gal, any idea?
Thanks in advance, Lev Veyde.
----- Original Message -----
From: jenkins@jenkins.phx.ovirt.org
To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com
Sent: Tuesday, January 10, 2017 4:42:14 AM
Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Match found for :.* : True
Logical operation result is TRUE
Running script  :
#!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'

#
# Required jjb vars:
#    version
#
VERSION=4.0
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson302101162661598371.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script  :
#!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
EOC
shopt -s nullglob

WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, sleep a while to let
        # the problem "solve itself"
        (( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already not there despite the umount failing
        findmnt --kernel --first-only "$mount" > /dev/null || return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}

# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' step of the jenkins job
# fails to clean it and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean the current job's mock root, if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf_file"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs
    # sometimes, so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $mock_root."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\  -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
            "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should
        # make noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

# remove mock caches that are older than 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so we may as well recreate
# them

# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy "$UUID" || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata "$UUID" || :
done

if $failed; then
    echo "Cleanup script failed, propagating failure to job"
    exit 1
fi

Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is probably the one that caused it:
https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0
+Milan.
+Martin
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue.
Yes, there was a real issue with nested virtualization. Some CPU flags are missing with Haswell and Lago doesn't run properly.
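(As a side note for debugging this class of problem: whether a guest, nested or not, actually got the virt extensions can be read from its CPU flags. The snippet below is just a sketch; the sample flags line is made up for illustration - on a real host you would take it from `grep ^flags /proc/cpuinfo`.)

```shell
#!/bin/bash
# Sample flags line (illustrative, deliberately missing the virt bits);
# vmx = Intel VT-x, svm = AMD-V.
flags="fpu vme de pse tsc msr pae cx16 ssse3 sse4_2 popcnt aes"

for want in vmx svm; do
    if grep -qw "$want" <<< "$flags"; then
        echo "$want: present"
    else
        echo "$want: missing - nested VMs will not start"
    fi
done
```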
I now have an Engine with a Java process at 100% CPU - I hope it's unrelated to this as well.
I suggest we run a survey to see who doesn't have SandyBridge or above, and perhaps move higher than Westmere.
We've got Westmere servers in the Brno lab.
What do we have in CI? Y.

On Tue, Jan 10, 2017 at 12:45 PM, Milan Zamazal <mzamazal@redhat.com> wrote:
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is probably the one that caused it:
https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0
+Milan.
+Martin
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue.
Yes, there was a real issue with nested virtualization. Some CPU flags are missing with Haswell and Lago doesn't run properly.
Is this a libvirt bug btw? Perhaps we need a switch to turn this feature on and off?
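(If such a switch were added, even an environment-variable override would do for CI. The sketch below is purely hypothetical - LAGO_CPU_MODEL is an invented name, not an existing Lago option.)

```shell
#!/bin/bash
# Hypothetical knob: let CI pin a known-safe CPU model while keeping the
# new behaviour from the patch as the default everywhere else.
# LAGO_CPU_MODEL and the "host-model" default are assumptions here.
cpu_model="${LAGO_CPU_MODEL:-host-model}"
echo "cpu model for nested VMs: $cpu_model"
```

A CI job could then export LAGO_CPU_MODEL=Westmere on the affected slaves without touching the code path for everyone else.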
I now have an Engine with a Java process at 100% CPU - I hope it's unrelated to this as well.
I suggest we run a survey to see who doesn't have SandyBridge or above, and perhaps move higher than Westmere.
We've got Westmere servers in the Brno lab.
Do we know the scope of the problem? Does it happen only on Westmere, for example? Y.
What do we have in CI? Y.
Thanks in advance, Lev Veyde.
----- Original Message ----- From: "Lev Veyde" <lveyde@redhat.com> To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com> Sent: Tuesday, January 10, 2017 11:50:05 AM Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Hi,
Checked the logs and see the following:
02:42:05 [WARNING] OVF does not contain a valid image description, using default. 02:42:05 The following CPU types are supported by this host: 02:42:05 - model_Westmere: Intel Westmere Family 02:42:05 - model_Nehalem: Intel Nehalem Family 02:42:05 - model_Penryn: Intel Penryn Family 02:42:05 - model_Conroe: Intel Conroe Family 02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge
Barak thinks that it may be related to the recent update in the Lago
code.
Gal, any idea ?
Thanks in advance, Lev Veyde.
----- Original Message ----- From: jenkins@jenkins.phx.ovirt.org To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com Sent: Tuesday, January 10, 2017 4:42:14 AM Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes
Changes:
[Lev Veyde] Mask NetworkManager service
[Eyal Edri] fix imgbased job names in jjb
[Daniel Belenky] fixing jjb version for cockpit-ovirt
[Gil Shinar] Add some more 4.1 to experimental
[Juan Hernandez] Don't build RPMs for the JBoss modules Maven plugin
[pkliczewski] jsonrpc 4.1 branch
------------------------------------------ [...truncated 749 lines...] Finish: shell @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@ Tue Jan 10 02:42:07 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished @@ took 360 seconds @@ rc = 1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ========== Scrubbing chroot mock \ --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he- system-tests/ws/ovirt-system-tests"> \ --root="mocker-epel-7-x86_64.el7" \ --resultdir="./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.
el7.scrub"
\ --scrub=chroot WARNING: Could not find required logging config file: < http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Start: scrub ['chroot'] INFO: scrubbing chroot for mocker-epel-7-x86_64.el7 Finish: scrub ['chroot'] Finish: run Scrub chroot took 6 seconds ============================ ########################################################## ## Tue Jan 10 02:42:13 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 366 seconds ## rc = 1 ########################################################## find: ‘logs’: No such file or directory No log files found, check command output ##!######################################################## Collecting mock logs ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_ basic_suite_4.0.sh’ ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh'
# # Required jjb vars: # version # VERSION=4.0 SUITE_TYPE=
WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson302101162661598371. sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/
+ OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he- system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/ artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/ artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ ovirt-system-tests/exported-artifacts/failure_msg.txt> < http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ ovirt-system-tests/exported-artifacts/lago_logs> < http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ ovirt-system-tests/exported-artifacts/mock_logs> < http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/ artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -x echo "shell-scripts/mock_cleanup.sh" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### EOC
shopt -s nullglob
WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, sleep a while to let
        # the problem "solve itself"
        (( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already gone despite the umount failure
        findmnt --kernel --first-only "$mount" > /dev/null || return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}

# Restore the permissions in the working dir, as sometimes it is left with
# files owned by root, and then the 'cleanup workspace' step of the jenkins
# job fails to clean it and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

# Stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean the current job's mock root, if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf_file"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    # TODO: investigate why mock --clean sometimes fails to umount certain
    # dirs, so we can use it instead of doing all this manually.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $mock_root."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\  -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should
        # make noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

# Remove mock caches that are older than 2 days. We make no effort to keep
# caches that may still be in use, because packages installed in them may go
# out of date, so we may as well recreate them.
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf

# Drop all leftover libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy "$UUID" || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata "$UUID" || :
done

if $failed; then
    echo "Cleanup script failed, propagating failure to job"
    exit 1
fi
[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson1888216492513466503.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts

Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:45 PM, Milan Zamazal <mzamazal@redhat.com> wrote:
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This is probably the patch that caused it: https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0
+Milan.
+Martin
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue.
Yes, there was a real issue with nested virtualization. Some CPU flags are missing with Haswell and Lago doesn't run properly.
Is this a libvirt bug btw?
I'm not sure. When the sets of CPU flags on the host and in the VM with a copied host CPU are different, it's not clear what's the right thing to do.
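The flag mismatch Milan describes can be seen directly by diffing the flag lists from `/proc/cpuinfo` on the host and inside the nested guest. A minimal sketch (the two flag lists below are illustrative samples, not real output; in practice each would come from `grep -m1 '^flags' /proc/cpuinfo` on the respective machine):

```shell
# Flag sets from the host and from the nested guest (illustrative samples;
# real lists come from: grep -m1 '^flags' /proc/cpuinfo)
host_flags="fpu vme sse sse2 ssse3 sse4_1 sse4_2 aes avx avx2"
guest_flags="fpu vme sse sse2 ssse3 sse4_1 sse4_2 aes"

# Collect flags the host has but the guest's copied CPU lost
missing=""
for f in $host_flags; do
    case " $guest_flags " in
        *" $f "*) ;;                    # flag survived into the guest
        *) missing="$missing $f" ;;     # flag dropped by the copy
    esac
done
echo "flags missing in guest:$missing"
```

Any non-empty result here is exactly the "copied host CPU with different flags" situation being discussed.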
Perhaps we need a switch to turn this feature on and off?
I think it would be useful to have a possibility to specify a particular CPU type in the Lago configuration.
I now have Engine with Java at 100% CPU - I hope it's unrelated to this as well.
I suggest we do a survey to see who doesn't have SandyBridge or above, and perhaps move higher than Westmere.
We've got Westmere servers in the Brno lab.
Do we know the scope of the problem? Does it happen only on Westmere, for example?
The problem was with Haswell-noTSX (on my Lenovo, but I think Martin has observed the same problem too). We don't know the scope of the problem, but if we want to be able to run Lago on Brno servers then we must be Westmere compatible.
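Whether a given host can present a Westmere-compatible CPU can be approximated by checking its flags for the features Westmere introduced. A sketch under stated assumptions: the flag subset below is an illustrative approximation, not the authoritative libvirt model definition (`virsh cpu-models x86_64` or `virsh capabilities` on the host gives the real answer):

```shell
# Illustrative subset of flags a Westmere-level guest CPU needs; Westmere
# added aes and pclmulqdq on top of Nehalem (assumption: rough subset only,
# the full libvirt model carries more flags).
required="sse4_2 popcnt aes pclmulqdq"

# In practice: flags=$(grep -m1 '^flags' /proc/cpuinfo)
flags="fpu vme sse sse2 sse4_1 sse4_2 popcnt aes pclmulqdq vmx"

compatible=yes
for f in $required; do
    if ! printf '%s\n' $flags | grep -qx "$f"; then
        compatible=no
        echo "missing: $f"
    fi
done
echo "westmere-compatible: $compatible"
```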
Y.
What do we have in CI? Y.
Thanks in advance, Lev Veyde.
----- Original Message -----
From: "Lev Veyde" <lveyde@redhat.com>
To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com
Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com>
Sent: Tuesday, January 10, 2017 11:50:05 AM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Hi,
Checked the logs and see the following:
02:42:05 [WARNING] OVF does not contain a valid image description, using default.
02:42:05 The following CPU types are supported by this host:
02:42:05     - model_Westmere: Intel Westmere Family
02:42:05     - model_Nehalem: Intel Nehalem Family
02:42:05     - model_Penryn: Intel Penryn Family
02:42:05     - model_Conroe: Intel Conroe Family
02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge
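The failure above is a straight mismatch between the model requested (model_SandyBridge) and the models the host reported. A sketch of the pre-flight check the setup could do, with the log's supported-types list inlined so the example is self-contained (this is illustrative, not actual Lago or hosted-engine-setup code):

```shell
# Supported-types block from the hosted-engine log above, inlined
log=' - model_Westmere: Intel Westmere Family
 - model_Nehalem: Intel Nehalem Family
 - model_Penryn: Intel Penryn Family
 - model_Conroe: Intel Conroe Family'

requested="model_SandyBridge"

# Pull out just the model names
supported=$(printf '%s\n' "$log" | sed -n 's/^ *- \(model_[A-Za-z]*\):.*/\1/p')

if printf '%s\n' "$supported" | grep -qx "$requested"; then
    echo "OK: $requested is supported here"
else
    # Fall back to the newest model the host offers (first in the list)
    fallback=$(printf '%s\n' "$supported" | head -n 1)
    echo "ERROR: $requested not supported; newest supported model is $fallback"
fi
```

With this host, the fallback would be model_Westmere, which matches the "must be Westmere-compatible" conclusion later in the thread.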
Barak thinks that it may be related to the recent update in the Lago code.
Gal, any idea ?
Thanks in advance, Lev Veyde.
----- Original Message -----
From: jenkins@jenkins.phx.ovirt.org
To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com
Sent: Tuesday, January 10, 2017 4:42:14 AM
Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes>
Not sure what the initial problem was, but on my laptop (Haswell-MB) I always use the lowest possible CPU family, to ensure nested VMs use as few features as possible:

<cpu mode='custom' match='exact'>
  <model fallback='allow'>core2duo</model>
  <feature policy='require' name='vmx'/>
</cpu>

Correspondingly, I use model_Conroe on the oVirt side and haven't had problems with it. Do we really need to use newer CPU families in our tests?

Regards,
Evgheni Dereveanchin

----- Original Message -----
From: "Milan Zamazal" <mzamazal@redhat.com>
To: "Yaniv Kaul" <ykaul@redhat.com>
Cc: "Lev Veyde" <lveyde@redhat.com>, "Eyal Edri" <eedri@redhat.com>, "Sandro Bonazzola" <sbonazzo@redhat.com>, "infra" <infra@ovirt.org>, "Gal Ben Haim" <gbenhaim@redhat.com>, "Martin Polednik" <mpoledni@redhat.com>, "Evgheni Dereveanchin" <ederevea@redhat.com>
Sent: Tuesday, 10 January, 2017 1:16:09 PM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
On Tue, Jan 10, 2017 at 3:14 PM, Evgheni Dereveanchin <ederevea@redhat.com> wrote:
Not sure what the initial problem was, but on my laptop (Haswell-MB) I always use the lowest possible CPU family to ensure it's using as few features as possible in nested VMs:
;-) I'm doing the exact opposite, for two reasons:
1. I want the best possible performance - specifically, I'd like the tests to run as fast as possible.
2. I'd like to expose as many of the latest features as possible to the hosts (and VMs).
<cpu mode='custom' match='exact'>
  <model fallback='allow'>core2duo</model>
  <feature policy='require' name='vmx'/>
</cpu>
Respectively, I use model_Conroe on oVirt side and didn't have problems with it. Do we really need to use newer CPU families in our tests?
We probably don't - we used to have Conroe hard-coded in the tests (until I changed it to use something different). It does mean it will be a bit challenging to run on AMD if we decide to go back to hard-coding Conroe.
Y.
Regards, Evgheni Dereveanchin
----- Original Message ----- From: "Milan Zamazal" <mzamazal@redhat.com> To: "Yaniv Kaul" <ykaul@redhat.com> Cc: "Lev Veyde" <lveyde@redhat.com>, "Eyal Edri" <eedri@redhat.com>, "Sandro Bonazzola" <sbonazzo@redhat.com>, "infra" <infra@ovirt.org>, "Gal Ben Haim" <gbenhaim@redhat.com>, "Martin Polednik" <mpoledni@redhat.com>, "Evgheni Dereveanchin" <ederevea@redhat.com> Sent: Tuesday, 10 January, 2017 1:16:09 PM Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:45 PM, Milan Zamazal <mzamazal@redhat.com> wrote:
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is probably the one that caused it: https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f88016376d5e87f0
+Milan.
+Martin
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue.
Yes, there was a real issue with nested virtualization. Some CPU flags are missing with Haswell and Lago doesn't run properly.
Is this a libvirt bug btw?
I'm not sure. When the sets of CPU flags on the host and in the VM with a copied host CPU are different, it's not clear what's the right thing to do.
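The mismatch is easy to see by diffing the two flag sets. A minimal sketch (the flag lists below are illustrative, not taken from the actual machines; on real hosts they would come from `grep -m1 flags /proc/cpuinfo` on each side):

```shell
# Illustrative host vs. nested-VM CPU flag lists (not real data).
host_flags='avx2 sse4_2 tsx vmx'
guest_flags='avx2 sse4_2 vmx'
# Print flags present on the host but missing in the guest:
comm -23 <(tr ' ' '\n' <<< "$host_flags" | sort) \
         <(tr ' ' '\n' <<< "$guest_flags" | sort)
```

With these sample lists, the diff reports `tsx` as the flag the guest lost.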
Perhaps we need a switch to turn this feature on and off?
I think it would be useful to have a possibility to specify a particular CPU type in the Lago configuration.
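One possible shape for that, sketched against a LagoInitFile; note that the `cpu_model` key name here is an assumption for illustration, not a documented Lago option at the time of this thread:

```yaml
# Hypothetical LagoInitFile fragment: pin the guest CPU model
# instead of copying the bare-metal one ("cpu_model" is assumed).
domains:
  lago-he-basic-suite-host0:
    memory: 4096
    cpu_model: Westmere
```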
I now have Engine with Java at 100% CPU - I hope it's unrelated to this as well.
I suggest we do a survey to see who doesn't have SandyBridge or above, and perhaps move higher than Westmere.
We've got Westmere servers in the Brno lab.
Do we know the scope of the problem? Does it happen only on Westmere, for example?
The problem was with Haswell-noTSX (on my Lenovo, but I think Martin has observed the same problem too). We don't know the scope of the problem, but if we want to be able to run Lago on the Brno servers then we must stay Westmere-compatible.
Y.
What do we have in CI? Y.
Thanks in advance, Lev Veyde.
----- Original Message ----- From: "Lev Veyde" <lveyde@redhat.com> To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com> Sent: Tuesday, January 10, 2017 11:50:05 AM Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Hi,
Checked the logs and see the following:
02:42:05 [WARNING] OVF does not contain a valid image description, using default.
02:42:05 The following CPU types are supported by this host:
02:42:05 - model_Westmere: Intel Westmere Family
02:42:05 - model_Nehalem: Intel Nehalem Family
02:42:05 - model_Penryn: Intel Penryn Family
02:42:05 - model_Conroe: Intel Conroe Family
02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge
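A check like the one that failed above can be scripted against the setup output; a sketch that parses the log excerpt (embedded verbatim here for illustration):

```shell
# Extract the CPU models a host reported as supported and check
# whether the requested one is among them. The log text below is
# copied from the failing run; in practice it would come from the
# hosted-engine setup output.
log='The following CPU types are supported by this host:
 - model_Westmere: Intel Westmere Family
 - model_Nehalem: Intel Nehalem Family
 - model_Penryn: Intel Penryn Family
 - model_Conroe: Intel Conroe Family'
requested=model_SandyBridge
# With FS='[ :]+' the model name is the third field of each "- model_" line.
supported=$(awk -F'[ :]+' '/- model_/ {print $3}' <<< "$log")
echo "$supported"
if grep -qx "$requested" <<< "$supported"; then
    echo "$requested is supported"
else
    echo "$requested is NOT supported by this host"
fi
```

On this input the script prints the four Westmere-and-older models and reports that model_SandyBridge is not supported, which matches the setup failure.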
Barak thinks that it may be related to the recent update in the Lago code.
Gal, any idea ?
Thanks in advance, Lev Veyde.
----- Original Message ----- From: jenkins@jenkins.phx.ovirt.org To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com Sent: Tuesday, January 10, 2017 4:42:14 AM Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes>
Changes:
[Lev Veyde] Mask NetworkManager service
[Eyal Edri] fix imgbased job names in jjb
[Daniel Belenky] fixing jjb version for cockpit-ovirt
[Gil Shinar] Add some more 4.1 to experimental
[Juan Hernandez] Don't build RPMs for the JBoss modules Maven plugin
[pkliczewski] jsonrpc 4.1 branch
------------------------------------------
[...truncated 749 lines...]
Finish: shell
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@ Tue Jan 10 02:42:07 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished
@@ took 360 seconds
@@ rc = 1
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
========== Scrubbing chroot
mock \
    --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>" \
    --root="mocker-epel-7-x86_64.el7" \
    --resultdir="./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.scrub" \
    --scrub=chroot
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini>. Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Start: scrub ['chroot']
INFO: scrubbing chroot for mocker-epel-7-x86_64.el7
Finish: scrub ['chroot']
Finish: run
Scrub chroot took 6 seconds
============================
##########################################################
## Tue Jan 10 02:42:13 UTC 2017 Finished env: el7:epel-7-x86_64
##      took 366 seconds
##      rc = 1
##########################################################
find: ‘logs’: No such file or directory
No log files found, check command output
##!########################################################
Collecting mock logs
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’
‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
#    version
#
VERSION=4.0
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson302101162661598371.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
____________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
EOC
shopt -s nullglob
WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, sleep a while to let
        # the problem "solve itself"
        [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already not there despite failing
        findmnt --kernel --first "$mount" > /dev/null && return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}

# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\  -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should make
        # noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

# remove mock caches that are older than 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them

# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy $UUID || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done

if $failed; then
    echo "Cleanup script failed, propagating failure to job"
    exit 1
fi
[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson1888216492513466503.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
____________________________________________________________

#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests>'
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>'
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
--orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini>. Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts

Hi Yaniv,

Sent a pull request to fix the issue, at least for our tests, by returning the SandyBridge CPU family:
https://github.com/lago-project/lago/pull/424/

Thanks in advance,
Lev Veyde.

----- Original Message -----
From: "Yaniv Kaul" <ykaul@redhat.com>
To: "Evgheni Dereveanchin" <ederevea@redhat.com>
Cc: "Milan Zamazal" <mzamazal@redhat.com>, "Lev Veyde" <lveyde@redhat.com>, "Eyal Edri" <eedri@redhat.com>, "Sandro Bonazzola" <sbonazzo@redhat.com>, "infra" <infra@ovirt.org>, "Gal Ben Haim" <gbenhaim@redhat.com>, "Martin Polednik" <mpoledni@redhat.com>
Sent: Tuesday, January 10, 2017 3:22:07 PM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627

On Tue, Jan 10, 2017 at 3:14 PM, Evgheni Dereveanchin <ederevea@redhat.com> wrote:
On Jan 10, 2017 6:18 PM, "Lev Veyde" <lveyde@redhat.com> wrote:

Hi Yaniv,
Sent a pull request to fix the issue, at least for our tests, by returning the SandyBridge CPU family:
https://github.com/lago-project/lago/pull/424/

Yep, saw it.
Milan - can you verify it's still OK with your servers?
Y.

Thanks in advance,
Lev Veyde.

----- Original Message -----
From: "Yaniv Kaul" <ykaul@redhat.com>
To: "Evgheni Dereveanchin" <ederevea@redhat.com>
Cc: "Milan Zamazal" <mzamazal@redhat.com>, "Lev Veyde" <lveyde@redhat.com>, "Eyal Edri" <eedri@redhat.com>, "Sandro Bonazzola" <sbonazzo@redhat.com>, "infra" <infra@ovirt.org>, "Gal Ben Haim" <gbenhaim@redhat.com>, "Martin Polednik" <mpoledni@redhat.com>
Sent: Tuesday, January 10, 2017 3:22:07 PM
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627

On Tue, Jan 10, 2017 at 3:14 PM, Evgheni Dereveanchin <ederevea@redhat.com> wrote:
Not sure what the initial problem was, but on my laptop (Haswell-MB) I always use the lowest possible CPU family to ensure it's using as few features as possible in nested VMs:
;-) I'm doing the exact opposite, for two reasons: 1. I want the best possible performance. Specifically, I'd like the tests to run as fast as possible. 2. I'd like to expose as many of the latest features up to the hosts (and VMs).
<cpu mode='custom' match='exact'> <model fallback='allow'>core2duo</model> <feature policy='require' name='vmx'/> </cpu>
Respectively, I use model_Conroe on oVirt side and didn't have problems with it. Do we really need to use newer CPU families in our tests?
We probably don't - we used to have Conroe hard-coded in the tests (until I changed it to use something different). It does mean it'll be a bit challenging to run on AMD if we decide to go back to hard-code Conroe. Y.
Regards, Evgheni Dereveanchin
----- Original Message ----- From: "Milan Zamazal" <mzamazal@redhat.com> To: "Yaniv Kaul" <ykaul@redhat.com> Cc: "Lev Veyde" <lveyde@redhat.com>, "Eyal Edri" <eedri@redhat.com>, "Sandro Bonazzola" <sbonazzo@redhat.com>, "infra" <infra@ovirt.org>, "Gal Ben Haim" <gbenhaim@redhat.com>, "Martin Polednik" <mpoledni@redhat.com>, "Evgheni Dereveanchin" <ederevea@redhat.com> Sent: Tuesday, 10 January, 2017 1:16:09 PM Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:45 PM, Milan Zamazal <mzamazal@redhat.com> wrote:
Yaniv Kaul <ykaul@redhat.com> writes:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is one that caused it probably: https://github.com/lago-project/lago/commit/ 05ccf7240976f91b0c14d6a1f88016 376d5e87f0
+Milan.
+Martin
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue.
Yes, there was a real issue with nested virtualization. Some CPU flags are missing with Haswell and Lago doesn't run properly.
Is this a libvirt bug btw?
I'm not sure. When the sets of CPU flags on the host and in the VM with a copied host CPU are different, it's not clear what's the right thing to do.
Perhaps we need a switch to turn this feature on and off?
I think it would be useful to have a possibility to specify a particular CPU type in the Lago configuration.
I know have Engine with a Java@ 100% CPU - I hope it's unrelated to this as well.
I suggest we do survey to see who doesn't have SandyBridge and above and perhaps move higher than Westmere.
We've got Westmere servers in the Brno lab.
Do we know the scope of the problem? Does it happen only on Westmere,
for
example?
The problem was with Haswell-noTSX (on my Lenovo, but I think Martin has observed the same problem too). We don't know the scope of the problem, but if we want to be able to run Lago on Brno servers then we must be Westmere compatible.
Y.
What do we have in CI? Y.
Thanks in advance, Lev Veyde.
----- Original Message ----- From: "Lev Veyde" <lveyde@redhat.com> To: "Eyal Edri" <eedri@redhat.com>, sbonazzo@redhat.com Cc: infra@ovirt.org, "Gal Ben Haim" <gbenhaim@redhat.com> Sent: Tuesday, January 10, 2017 11:50:05 AM Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
Hi,
Checked the logs and see the following:
02:42:05 [WARNING] OVF does not contain a valid image description, using default.
02:42:05 The following CPU types are supported by this host:
02:42:05 - model_Westmere: Intel Westmere Family
02:42:05 - model_Nehalem: Intel Nehalem Family
02:42:05 - model_Penryn: Intel Penryn Family
02:42:05 - model_Conroe: Intel Conroe Family
02:42:05 [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge
Barak thinks that it may be related to the recent update in the Lago code.
Gal, any idea ?
Thanks in advance, Lev Veyde.
----- Original Message ----- From: jenkins@jenkins.phx.ovirt.org To: sbonazzo@redhat.com, infra@ovirt.org, lveyde@redhat.com Sent: Tuesday, January 10, 2017 4:42:14 AM Subject: Build failed in Jenkins: ovirt_4.0_he-system-tests #627
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/changes>
Changes:
[Lev Veyde] Mask NetworkManager service
[Eyal Edri] fix imgbased job names in jjb
[Daniel Belenky] fixing jjb version for cockpit-ovirt
[Gil Shinar] Add some more 4.1 to experimental
[Juan Hernandez] Don't build RPMs for the JBoss modules Maven plugin
[pkliczewski] jsonrpc 4.1 branch
------------------------------------------ [...truncated 749 lines...] Finish: shell @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@ Tue Jan 10 02:42:07 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished @@ took 360 seconds @@ rc = 1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ========== Scrubbing chroot mock \ --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests"> \ --root="mocker-epel-7-x86_64.el7" \ --resultdir="./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.scrub" \ --scrub=chroot WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Start: scrub ['chroot'] INFO: scrubbing chroot for mocker-epel-7-x86_64.el7 Finish: scrub ['chroot'] Finish: run Scrub chroot took 6 seconds ============================ ########################################################## ## Tue Jan 10 02:42:13 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 366 seconds ## rc = 1 ########################################################## find: ‘logs’: No such file or directory No log files found, check command output ##!######################################################## Collecting mock logs ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.xGGwEk6V/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
#    version
#
VERSION=4.0
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
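Incidentally, the `OVIRT_SUITE` assignment in the collect-logs script above has an expansion pitfall: without braces, bash parses `$SUITE_TYPE_suite_` as a single (unset) variable name, so the suite name collapses to just the version — the execution trace indeed shows `OVIRT_SUITE=4.0`. A minimal sketch (the `he_basic` value is assumed here purely for illustration):

```shell
VERSION=4.0
SUITE_TYPE=he_basic

# Unbraced: "$SUITE_TYPE_suite_" is read as one (unset) variable name,
# so only "$VERSION" survives.
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
echo "$OVIRT_SUITE"    # prints: 4.0

# Braced: each variable is delimited explicitly.
OVIRT_SUITE="${SUITE_TYPE}_suite_${VERSION}"
echo "$OVIRT_SUITE"    # prints: he_basic_suite_4.0
```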
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson302101162661598371.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/627/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -x echo "shell-scripts/mock_cleanup.sh" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC ____________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
EOC
shopt -s nullglob
WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"
safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, Sleep a while to let
        # the problem "solve itself"
        [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already not there despite failing
        findmnt --kernel --first "$mount" > /dev/null && return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should make
        # noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done
# remove mock caches that are older then 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them

# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy $UUID || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done

if $failed; then
    echo "Cleanup script failed, propegating failure to job"
    exit 1
fi
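Two spots in the `safe_umount` above look buggy: `[[ attempt > 0 ]]` string-compares the literal word `attempt` against `0` (always true, so it sleeps even on the first try), and `findmnt ... && return 0` declares success while the mount is still *present* rather than gone. A corrected sketch follows; the `UMOUNT_CMD`/`CHECK_CMD` hooks are illustration-only additions (not part of the original script) so the retry logic can be exercised without root:

```shell
safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt = 0; attempt < ${UMOUNT_RETRIES:-3}; attempt++)); do
        # Arithmetic context, so the counter is actually evaluated
        if (( attempt > 0 )); then sleep "${UMOUNT_RETRY_DELAY:-1}"; fi
        # Try to umount (injectable for testing; defaults to the real thing)
        ${UMOUNT_CMD:-sudo umount --lazy} "$mount" && return 0
        # Success only if the mount is *gone*, hence the negation
        ${CHECK_CMD:-findmnt --kernel --first} "$mount" > /dev/null || return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}
```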
[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson1888216492513466503.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + cat ____________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + UMOUNT_RETRIES=3 + UMOUNT_RETRY_DELAY=1s + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-epel-7-x86_64.el7.cfg + mock_root=mocker-epel-7-x86_64.el7 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-epel-7-x86_64.el7' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> + mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + : + [[ -n '' ]] + find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 + xargs -0 -tr sudo rm -rf ++ virsh list --all --uuid + false POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
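A small design note on the cleanup script traced above: the leftover-chroot pass collects mount points with `sort -r` so that nested mounts (proc, sys, bind mounts under the chroot root) are unmounted innermost-first, since reverse lexicographic order puts deeper paths before their parents. A sketch with illustrative paths only:

```shell
# Reverse-sorting puts child mounts ahead of their parent:
printf '%s\n' \
    /var/lib/mock/epel-7/root \
    /var/lib/mock/epel-7/root/proc \
    /var/lib/mock/epel-7/root/sys \
    | sort -r
# -> /var/lib/mock/epel-7/root/sys
#    /var/lib/mock/epel-7/root/proc
#    /var/lib/mock/epel-7/root
```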

On 10 January 2017 at 12:23, Yaniv Kaul <ykaul@redhat.com> wrote:
On Tue, Jan 10, 2017 at 12:08 PM, Lev Veyde <lveyde@redhat.com> wrote:
This patch is one that caused it probably:
https://github.com/lago-project/lago/commit/05ccf7240976f91b0c14d6a1f8801637...
+Milan.
I must confess that I did not like the patch to begin with... I did not understand what real problem it solved, but Michal assured me there was a real issue. I now have Engine with Java at 100% CPU - I hope it's unrelated to this as well.
I suggest we do a survey to see who doesn't have SandyBridge and above, and perhaps move higher than Westmere. What do we have in CI?
+Evgheny -- Barak Korren bkorren@redhat.com RHCE, RHCi, RHV-DevOps Team https://ifireball.wordpress.com/

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/628/changes> Changes: [ngoldin] Enable debug mode in 'setup-ds.pl' [Gil Shinar] Changed ovirt version to 4.0 from 0.16 [Daniel Belenky] added 4.1 branch to the yaml conf ------------------------------------------ [...truncated 748 lines...] Finish: shell @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@ Wed Jan 11 02:40:54 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished @@ took 345 seconds @@ rc = 1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ========== Scrubbing chroot mock \ --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests"> \ --root="mocker-epel-7-x86_64.el7" \ --resultdir="./mock_logs.WYQNadO2/mocker-epel-7-x86_64.el7.scrub" \ --scrub=chroot WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Start: scrub ['chroot'] INFO: scrubbing chroot for mocker-epel-7-x86_64.el7 Finish: scrub ['chroot'] Finish: run Scrub chroot took 6 seconds ============================ ########################################################## ## Wed Jan 11 02:41:00 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 351 seconds ## rc = 1 ########################################################## find: ‘logs’: No such file or directory No log files found, check command output ##!######################################################## Collecting mock logs ‘./mock_logs.WYQNadO2/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.WYQNadO2/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.WYQNadO2/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/629/changes> Changes: [Roy Golan] accept defaults of aaa-ldap plugin installation [Sandro Bonazzola] 4.1 nightly: publish 4.1 dashboard in 4.1 repo [Eyal Edri] fixing appliance to take from 4.1 [Daniel Belenky] Added 4.1 branch for ovirt-scheduler-proxy [Daniel Belenky] Added build for fc24+ovirt4.0 [Daniel Belenky] Added build for fc24+ovirt4.0 [Daniel Belenky] install python-requests as a dependency [Pavel Zhukov] Moved publishers section to job definition instead of projects [Barak Korren] Revert "install python-requests as a dependency" ------------------------------------------ [...truncated 112.77 KB...] Finish: shell @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@ Thu Jan 12 02:41:12 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished @@ took 363 seconds @@ rc = 1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ========== Scrubbing chroot mock \ --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests"> \ --root="mocker-epel-7-x86_64.el7" \ --resultdir="./mock_logs.qy8Ieq7t/mocker-epel-7-x86_64.el7.scrub" \ --scrub=chroot WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Start: scrub ['chroot'] INFO: scrubbing chroot for mocker-epel-7-x86_64.el7 Finish: scrub ['chroot'] Finish: run Scrub chroot took 5 seconds ============================ ########################################################## ## Thu Jan 12 02:41:17 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 368 seconds ## rc = 1 ########################################################## find: ‘logs’: No such file or directory No log files found, check command output ##!######################################################## Collecting mock logs ‘./mock_logs.qy8Ieq7t/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.qy8Ieq7t/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.qy8Ieq7t/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/630/changes> Changes: [Your Name] network utils: rename attach/detach vlans [Gil Shinar] Make all files in workspace writable by user [Yedidyah Bar David] otopi: Move to 4.1 branch otopi-1.6 [Sandro Bonazzola] ovirt-node-ng: add 4.1-snapshot job [Sandro Bonazzola] imgbased: drop unnecessary jobs [Barak Korren] Added mirrors client script [Barak Korren] Adapt mock_runner.sh to use mirrors [Barak Korren] Normalize repo names in centos-6 mock configs ------------------------------------------ [...truncated 114.85 KB...] ## Fri Jan 13 02:42:53 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 463 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.gDK4EDCl/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:01) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:01) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 349 seconds =================================== ##! ##! 
ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs ‘./mock_logs.gDK4EDCl/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.gDK4EDCl/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.gDK4EDCl/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=4.0 SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson8740931555021581095.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/630/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/630/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> 
<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/630/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -x echo "shell-scripts/mock_cleanup.sh" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### EOC shopt -s nullglob WORKSPACE="${WORKSPACE:-$PWD}" UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}" UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}" safe_umount() { local mount="${1:?}" local attempt for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do # If this is not the 1st time through the loop, Sleep a while to let # the problem "solve itself" [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY" # Try to umount sudo umount --lazy "$mount" && return 0 # See if the mount is already not there despite failing findmnt --kernel --first "$mount" > /dev/null && return 0 done echo "ERROR: Failed to umount $mount." 
return 1 } # restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE" sudo chmod -R u+w "$WORKSPACE" # stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE" #TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." failed=true } mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" || failed=true done done # Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r)) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" && continue # If we got here, we failed $UMOUNT_RETRIES attempts so we should make # noise failed=true this_chroot_failed=true done if ! 
$this_chroot_failed; then sudo rm -rf "$mock_root" fi done # remove mock caches that are older than 2 days: find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \ xargs -0 -tr sudo rm -rf # We make no effort to leave around caches that may still be in use because # packages installed in them may go out of date, so may as well recreate them # Drop all left over libvirt domains for UUID in $(virsh list --all --uuid); do virsh destroy $UUID || : sleep 2 virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || : done if $failed; then echo "Cleanup script failed, propagating failure to job" exit 1 fi [ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson6658254415610815715.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + cat _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + UMOUNT_RETRIES=3 + UMOUNT_RETRY_DELAY=1s + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-epel-7-x86_64.el7.cfg + mock_root=mocker-epel-7-x86_64.el7 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-epel-7-x86_64.el7' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' 
Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> + mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + : + [[ -n '' ]] + find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 + xargs -0 -tr sudo rm -rf ++ virsh list --all --uuid + false POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
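One subtlety in the safe_umount helper printed above: inside `[[ ]]`, `attempt > 0` is a lexicographic comparison of the literal strings "attempt" and "0", and since "a" sorts after "0" it is always true, so the retry delay is also paid before the very first umount attempt. Arithmetic evaluation with `(( ))` compares the variable's numeric value instead. A small demonstration (the variable names are local to this sketch):

```shell
attempt=0

# String comparison inside [[ ]]: "attempt" > "0" lexicographically,
# which is always true regardless of the variable's value.
[[ attempt > 0 ]] && str_result="true" || str_result="false"

# Arithmetic comparison inside (( )): the value of attempt (0) is not > 0.
(( attempt > 0 )) && num_result="true" || num_result="false"

echo "$str_result $num_result"
```

So the fix inside the retry loop would be `(( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"`. The `findmnt --kernel --first "$mount" > /dev/null && return 0` line also looks inverted relative to its comment: findmnt exits 0 when the mount is still present, so the success return would arguably need `||` rather than `&&`.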

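The cleanup script recovers the real chroot name by grepping the mock config with a PCRE lookbehind, as the `++ grep -Po` trace lines above show. A self-contained sketch of that extraction against an inline sample line rather than a real mocker config (GNU grep with `-P` support assumed; the root value is the one from the trace):

```shell
# Sample config line of the same shape as the mocker .cfg files.
sample="config_opts['root'] = 'epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b'"

# The lookbehind anchors on "config_opts['root'] = '" without consuming it,
# so the match is only the quoted value; [^']* stops at the closing quote.
mock_root="$(printf '%s\n' "$sample" | grep -Po "(?<=config_opts\['root'\] = ')[^']*")"

echo "$mock_root"
```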
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/631/changes> Changes: [Your Name] network utils: rename attach/detach vlans [Sandro Bonazzola] cockpit-ovirt: add missing check-patch jobs [Sandro Bonazzola] publisher: 4.0: drop cockpit-ovirt for fc23 ------------------------------------------ [...truncated 85.33 KB...] ## Sat Jan 14 02:36:32 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 83 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.ZR8D1Ciz/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:00) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:00) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 66 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs ‘./mock_logs.ZR8D1Ciz/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.ZR8D1Ciz/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.ZR8D1Ciz/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 
Match found for :.* : True Logical operation result is TRUE Running script : shell_scripts/system_tests.collect_logs.sh [script and -x trace identical to the build #630 run above, with build #631 artifact paths] POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : shell-scripts/mock_cleanup.sh [script and -x trace identical to the build #630 run above] POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
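The collect_logs step in these jobs builds `OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"`, and its `-x` trace shows the result collapsing to `OVIRT_SUITE=4.0`: bash greedily reads `SUITE_TYPE_suite_` as one (unset) variable name, so the whole prefix silently disappears. Braces delimit the name. A demonstration with a hypothetical non-empty SUITE_TYPE (in the traces above SUITE_TYPE is itself empty, which is a separate jjb templating problem):

```shell
VERSION="4.0"
SUITE_TYPE="he_basic"   # hypothetical; the trace shows it empty in these jobs

# Unbraced: "$SUITE_TYPE_suite_$VERSION" expands the unset variable
# SUITE_TYPE_suite_ to nothing, leaving only the value of $VERSION.
broken="$SUITE_TYPE_suite_$VERSION"

# Braced: the variable name ends at the closing brace.
fixed="${SUITE_TYPE}_suite_${VERSION}"

echo "$broken vs $fixed"
```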

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/632/changes> Changes: [Your Name] network utils: rename attach/detach vlans ------------------------------------------ [...truncated 116.21 KB...] ## Sun Jan 15 02:41:26 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 377 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.kp7y8v9q/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:00) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:00) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 361 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs ‘./mock_logs.kp7y8v9q/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.kp7y8v9q/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.kp7y8v9q/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 
Match found for :.* : True Logical operation result is TRUE Running script : shell_scripts/system_tests.collect_logs.sh [script and -x trace identical to the build #630 run above, with build #632 artifact paths] POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : shell-scripts/mock_cleanup.sh [script and -x trace identical to the build #630 run above] POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
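The `Cleaning up mock ` trace lines in these runs end with no name because the script echoes `$mock_conf`, which is never set (the loop variable is `mock_conf_file`); the orphanskill error message has the same problem with an undefined `$chroot`. The parameter expansions around those messages are correct, though, as this sketch with a hypothetical config path shows:

```shell
# Hypothetical config path, matching the shape seen in the traces.
mock_conf_file="/workspace/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg"

# Fixed message: reference the variable that actually exists.
echo "Cleaning up mock $mock_conf_file"

mock_root="${mock_conf_file##*/}"  # strip leading dirs -> mocker-epel-7-x86_64.el7.cfg
mock_root="${mock_root%.*}"        # strip the .cfg suffix -> mocker-epel-7-x86_64.el7

echo "$mock_root"
```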

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/633/changes> Changes: [Your Name] network utils: rename attach/detach vlans ------------------------------------------ [...truncated 115.27 KB...] ## Sun Jan 15 10:02:59 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 365 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.8X20vNgw/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:01) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:01) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 349 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs ‘./mock_logs.8X20vNgw/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.8X20vNgw/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.8X20vNgw/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'

#
# Required jjb vars:
#    version
#
VERSION=4.0
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson8568949804824784930.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/633/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/633/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/633/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"

# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
EOC

shopt -s nullglob

WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
        # If this is not the 1st time through the loop, Sleep a while to let
        # the problem "solve itself"
        [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # See if the mount is already not there despite failing
        findmnt --kernel --first "$mount" > /dev/null && return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}

# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
sudo chmod -R u+w "$WORKSPACE"

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" || failed=true
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(cut -d\  -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        safe_umount "$mount" && continue
        # If we got here, we failed $UMOUNT_RETRIES attempts so we should make
        # noise
        failed=true
        this_chroot_failed=true
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

# remove mock caches that are older then 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
    xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them

# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
    virsh destroy $UUID || :
    sleep 2
    virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done

if $failed; then
    echo "Cleanup script failed, propegating failure to job"
    exit 1
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson3909438894782204867.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts
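[Editorial note: the safe_umount helper quoted in the cleanup script above has two subtle bugs. `[[ attempt > 0 ]]` lexically compares the literal string "attempt" with "0" (always true, so the delay fires even on the first try), and `findmnt ... && return 0` succeeds while the mount is still present rather than when it is gone. A hedged sketch of a corrected version, keeping the names from the log; it has not been run against the real Jenkins slaves:]

```shell
# Sketch of a corrected safe_umount, based on the version quoted in the
# mock_cleanup.sh listing above.
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"

safe_umount() {
    local mount="${1:?}"
    local attempt
    for ((attempt = 0; attempt < UMOUNT_RETRIES; attempt++)); do
        # Arithmetic context, so the delay really only applies on retries;
        # the original '[[ attempt > 0 ]]' is a string comparison
        (( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"
        # Try to umount
        sudo umount --lazy "$mount" && return 0
        # findmnt exits non-zero when the mount is absent, i.e. already gone;
        # the original '&& return 0' reported success while it was still there
        findmnt --kernel --first-only "$mount" > /dev/null || return 0
    done
    echo "ERROR: Failed to umount $mount."
    return 1
}
```

[With `sudo` and `findmnt` shadowed by shell functions, the retry logic can be exercised without root privileges or real mounts.]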

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/634/changes>

Changes:

[ngoldin] Use common settings in centos7 init file

------------------------------------------
[...truncated 114.84 KB...]
##      took 355 seconds
##      rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log entries: ./mock_logs.vAtsih3y/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log
##!
@ Cleanup prefix:
# Stop prefix:
# Stop prefix:
* Stop vms:
* Stop vms: Success (in 0:00:00)
* Stop nets:
* Stop nets: Success (in 0:00:00)
# Stop prefix: Success (in 0:00:00)
# Tag prefix as uninitialized:
# Tag prefix as uninitialized: Success (in 0:00:00)
@ Cleanup prefix: Success (in 0:00:00)
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
+ exit
Took 339 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
Collecting mock logs
‘./mock_logs.vAtsih3y/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’
‘./mock_logs.vAtsih3y/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’
‘./mock_logs.vAtsih3y/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson275740686906754919.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/634/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/634/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/634/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson6987743808340107103.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
sudo rm -rf /var/cache/mock/epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts
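[Editorial note: the collect_logs traces in these reports all show `+ OVIRT_SUITE=4.0` even though the script assigns `OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"`. Bash parses `$SUITE_TYPE_suite_` as a single (unset) variable name, so only `$VERSION` survives. A minimal sketch of the parse and the braced fix; the sample value `he_basic` is made up for illustration, since the job itself sets `SUITE_TYPE` empty:]

```shell
# Demonstrates why OVIRT_SUITE collapses to "4.0" in the trace above.
VERSION=4.0
SUITE_TYPE=he_basic   # hypothetical sample value; the job leaves this empty

broken="$SUITE_TYPE_suite_$VERSION"     # $SUITE_TYPE_suite_ is one unset var
fixed="${SUITE_TYPE}_suite_${VERSION}"  # braces delimit the variable names

echo "broken=$broken fixed=$fixed"      # broken=4.0 fixed=he_basic_suite_4.0
```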

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/635/changes>

Changes:

[ngoldin] Collect logs after deploy stage

[Juan Hernandez] Take Java SDK version 3 from the 3.6 branch

[Juan Hernandez] Take Java SDK version 4 from the 4.1 branch

[Juan Hernandez] Take Python SDK version 4 from the 4.1 branch

[pkliczewski] Use 4.1 jsonrpc for 4.1 branch

[Daniel Belenky] ovirt-optimizer: added build job for ovirt-4.1

[Eyal Edri] renaming system tests to reflect real repos

------------------------------------------
[...truncated 114.57 KB...]
## Tue Jan 17 02:43:27 UTC 2017 Finished env: el7:epel-7-x86_64
##      took 495 seconds
##      rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log entries: ./mock_logs.GlwwVtX8/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log
##!
@ Cleanup prefix:
# Stop prefix:
# Stop prefix:
* Stop vms:
* Stop vms: Success (in 0:00:00)
* Stop nets:
* Stop nets: Success (in 0:00:00)
# Stop prefix: Success (in 0:00:01)
# Tag prefix as uninitialized:
# Tag prefix as uninitialized: Success (in 0:00:00)
@ Cleanup prefix: Success (in 0:00:01)
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
+ exit
Took 359 seconds
===================================
##!
##!
ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
Collecting mock logs
'./mock_logs.GlwwVtX8/mocker-epel-7-x86_64.el7.clean_rpmdb' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb'
'./mock_logs.GlwwVtX8/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh'
'./mock_logs.GlwwVtX8/mocker-epel-7-x86_64.el7.init' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init'
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson322307095565626474.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/635/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/635/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/635/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
[ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson2272021942568608892.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg>
+ mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts
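[Editorial note: the `++ grep -Po ...` line in the traces shows how the cleanup script recovers the real chroot name from the mock config's `config_opts['root']` line, using a fixed-length PCRE lookbehind. A minimal self-contained demo; the temp file stands in for mocker-epel-7-x86_64.el7.cfg, its second line is invented, and GNU grep with PCRE support (`-P`) is assumed:]

```shell
# Demo of the config_opts['root'] extraction used by mock_cleanup.sh.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
config_opts['root'] = 'epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b'
config_opts['target_arch'] = 'x86_64'
EOF

# The lookbehind anchors on "config_opts['root'] = '" and keeps only the
# quoted value that follows it
mock_root="$(grep -Po "(?<=config_opts\['root'\] = ')[^']*" "$cfg")"
echo "$mock_root"   # epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b

rm -f "$cfg"
```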

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/636/changes>

Changes:

[Gal Ben Haim] Output ldap logs to /var/log

[Yedidyah Bar David] Build ovirt-wgt master on fc24

[Sandro Bonazzola] publisher: 4.1: use otopi from 4.1 branch

[Yedidyah Bar David] dwh: Move to 4.1 branch

[Daniel Belenky] ioprocess: add builds for 4.1 branch

------------------------------------------
[...truncated 114.92 KB...]
## Tue Jan 17 17:22:55 UTC 2017 Finished env: el7:epel-7-x86_64
##      took 359 seconds
##      rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log entries: ./mock_logs.axABoum0/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log
##!
@ Cleanup prefix:
# Stop prefix:
# Stop prefix:
* Stop vms:
* Stop vms: Success (in 0:00:00)
* Stop nets:
* Stop nets: Success (in 0:00:00)
# Stop prefix: Success (in 0:00:01)
# Tag prefix as uninitialized:
# Tag prefix as uninitialized: Success (in 0:00:00)
@ Cleanup prefix: Success (in 0:00:01)
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
+ exit
Took 342 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
Collecting mock logs
'./mock_logs.axABoum0/mocker-epel-7-x86_64.el7.clean_rpmdb' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb'
'./mock_logs.axABoum0/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh'
'./mock_logs.axABoum0/mocker-epel-7-x86_64.el7.init' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init'
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=4.0 SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson6044129963342556999.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/636/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/636/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/636/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -x echo "shell-scripts/mock_cleanup.sh" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ 
####################################################################### # # # CLEANUP # # # ####################################################################### EOC shopt -s nullglob WORKSPACE="${WORKSPACE:-$PWD}" UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}" UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}" safe_umount() { local mount="${1:?}" local attempt for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do # If this is not the 1st time through the loop, Sleep a while to let # the problem "solve itself" [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY" # Try to umount sudo umount --lazy "$mount" && return 0 # See if the mount is already not there despite failing findmnt --kernel --first "$mount" > /dev/null && return 0 done echo "ERROR: Failed to umount $mount." return 1 } # restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE" sudo chmod -R u+w "$WORKSPACE" # stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE" #TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." 
failed=true } mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" || failed=true done done # Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r)) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" && continue # If we got here, we failed $UMOUNT_RETRIES attempts so we should make # noise failed=true this_chroot_failed=true done if ! $this_chroot_failed; then sudo rm -rf "$mock_root" fi done # remove mock caches that are older then 2 days: find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \ xargs -0 -tr sudo rm -rf # We make no effort to leave around caches that may still be in use because # packages installed in them may go out of date, so may as well recreate them # Drop all left over libvirt domains for UUID in $(virsh list --all --uuid); do virsh destroy $UUID || : sleep 2 virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || : done if $failed; then echo "Cleanup script failed, propegating failure to job" exit 1 fi [ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson6820956725205806647.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + cat _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + UMOUNT_RETRIES=3 + 
UMOUNT_RETRY_DELAY=1s + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-epel-7-x86_64.el7.cfg + mock_root=mocker-epel-7-x86_64.el7 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-epel-7-x86_64.el7' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.5.1)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> + mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + : + [[ -n '' ]] + find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 + xargs -0 -tr sudo rm -rf ++ virsh list --all --uuid + false POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/637/changes> Changes: [Gal Ben Haim] Output ldap logs to /var/log ------------------------------------------ [...truncated 114.00 KB...] ## Tue Jan 17 17:34:40 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 342 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.5j07N6zr/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:01) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:01) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 327 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs './mock_logs.5j07N6zr/mocker-epel-7-x86_64.el7.clean_rpmdb' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb' './mock_logs.5j07N6zr/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh' './mock_logs.5j07N6zr/mocker-epel-7-x86_64.el7.init' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init' ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 
Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=4.0 SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson2412073899698622315.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/637/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/637/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/637/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -x echo "shell-scripts/mock_cleanup.sh" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ 
####################################################################### # # # CLEANUP # # # ####################################################################### EOC shopt -s nullglob WORKSPACE="${WORKSPACE:-$PWD}" UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}" UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}" safe_umount() { local mount="${1:?}" local attempt for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do # If this is not the 1st time through the loop, Sleep a while to let # the problem "solve itself" [[ attempt > 0 ]] && sleep "$UMOUNT_RETRY_DELAY" # Try to umount sudo umount --lazy "$mount" && return 0 # See if the mount is already not there despite failing findmnt --kernel --first "$mount" > /dev/null && return 0 done echo "ERROR: Failed to umount $mount." return 1 } # restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE" sudo chmod -R u+w "$WORKSPACE" # stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE" #TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." 
failed=true } mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" || failed=true done done # Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r)) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." fi for mount in "${mounts[@]}"; do safe_umount "$mount" && continue # If we got here, we failed $UMOUNT_RETRIES attempts so we should make # noise failed=true this_chroot_failed=true done if ! $this_chroot_failed; then sudo rm -rf "$mock_root" fi done # remove mock caches that are older then 2 days: find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \ xargs -0 -tr sudo rm -rf # We make no effort to leave around caches that may still be in use because # packages installed in them may go out of date, so may as well recreate them # Drop all left over libvirt domains for UUID in $(virsh list --all --uuid); do virsh destroy $UUID || : sleep 2 virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || : done if $failed; then echo "Cleanup script failed, propegating failure to job" exit 1 fi [ovirt_4.0_he-system-tests] $ /bin/bash -x /tmp/hudson1600853397459855029.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + cat _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + UMOUNT_RETRIES=3 + 
UMOUNT_RETRY_DELAY=1s + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + sudo chmod -R u+w <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-epel-7-x86_64.el7.cfg + mock_root=mocker-epel-7-x86_64.el7 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-epel-7-x86_64.el7' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.5.1)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-epel-7-x86_64.el7.cfg> + mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + [[ -n epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b + : + [[ -n '' ]] + find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 + xargs -0 -tr sudo rm -rf ++ virsh list --all --uuid + false POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
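As an aside, the `grep -Po` lookbehind that the cleanup script uses to pull the chroot name out of the mock config can be exercised in isolation. This is a minimal sketch: the sample line below imitates the `config_opts['root']` entry in `mocker-epel-7-x86_64.el7.cfg`, and GNU grep with PCRE support (`-P`) is assumed.

```shell
# Sample line in the style of a mock .cfg config_opts entry (assumed format)
cfg_line="config_opts['root'] = 'epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b'"

# Same lookbehind pattern the cleanup script runs against the config file;
# -P enables PCRE (GNU grep only), -o prints only the matched text
mock_root="$(printf '%s\n' "$cfg_line" | grep -Po "(?<=config_opts\['root'\] = ')[^']*")"
echo "$mock_root"
```

This is why the xtrace above shows `mock_root=epel-7-x86_64-6f628e6dc1a827c86d5e1bd9d3b3d38b`: the pattern captures everything between the opening quote after `= '` and the next single quote.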

See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/638/changes> Changes: [Gal Ben Haim] Output ldap logs to /var/log ------------------------------------------ [...truncated 99.48 KB...] ++ awk '{split($4,a,"."); print a[1] "." a[2] "." a[3] ".1"}' ++ awk -F/ '{print $1}' + HEGW=192.168.200.1 ++ /sbin/ip -4 -o addr show dev eth0 ++ awk '{split($4,a,"."); print a[1] "." a[2] "." a[3] ".99"}' ++ awk -F/ '{print $1}' + HEADDR=192.168.200.99 + echo '192.168.200.99 lago-he-basic-suite-4-0-engine lago-he-basic-suite-4-0-engine.lago.local' + VMPASS=123456 + ENGINEPASS=123 ++ tail -11 ++ ls /usr/share/ovirt-engine-appliance/ovirt-engine-appliance-4.0-20170116.1.el7.centos.ova + OVAIMAGE=/usr/share/ovirt-engine-appliance/ovirt-engine-appliance-4.0-20170116.1.el7.centos.ova + sed -e s,@GW@,192.168.200.1,g -e s,@ADDR@,192.168.200.99,g -e s,@OVAIMAGE@,/usr/share/ovirt-engine-appliance/ovirt-engine-appliance-4.0-20170116.1.el7.centos.ova,g -e s,@VMPASS@,123456,g -e s,@ENGINEPASS@,123,g -e s,@DOMAIN@,lago.local,g -e s,@MYHOSTNAME@,lago-he-basic-suite-4-0-host0,g -e s,@HOSTEDENGINE@,lago-he-basic-suite-4-0-engine,g + fstrim -va /var/tmp: 60 GiB (64392904704 bytes) trimmed + rm -rf /dev/shm/yum + hosted-engine --deploy --config-append=/root/hosted-engine-deploy-answers-file.conf [ INFO ] Stage: Initializing [ INFO ] Generating a temporary VNC password. [ INFO ] Stage: Environment setup During customization use CTRL-D to abort. [ INFO ] Hardware supports virtualization Configuration files: ['/root/hosted-engine-deploy-answers-file.conf'] Log file: /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20170117124637-923bru.log Version: otopi-1.5.3_master (otopi-1.5.3-0.0.master.20160904080106.git4d1e74e.el7.centos) [ INFO ] Stage: Environment packages setup [ INFO ] Stage: Programs detection [ INFO ] Stage: Environment setup [ INFO ] Generating libvirt-spice certificates [WARNING] Cannot locate gluster packages, Hyper Converged setup support will be disabled. 
[ INFO ] Please abort the setup and install vdsm-gluster, gluster-server >= 3.7.2 and restart vdsmd service in order to gain Hyper Converged setup support. [ INFO ] Stage: Environment customization --== STORAGE CONFIGURATION ==-- [ INFO ] Installing on first host --== SYSTEM CONFIGURATION ==-- --== NETWORK CONFIGURATION ==-- --== VM CONFIGURATION ==-- [ INFO ] Checking OVF archive content (could take a few minutes depending on archive size) [ INFO ] Checking OVF XML content (could take a few minutes depending on archive size) [WARNING] OVF does not contain a valid image description, using default. The following CPU types are supported by this host: - model_Westmere: Intel Westmere Family - model_Nehalem: Intel Nehalem Family - model_Penryn: Intel Penryn Family - model_Conroe: Intel Conroe Family [ ERROR ] Failed to execute stage 'Environment customization': Invalid CPU type specified: model_SandyBridge [ INFO ] Stage: Clean up [ INFO ] Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20170117124727.conf' [ INFO ] Stage: Pre-termination [ INFO ] Stage: Termination [ ERROR ] Hosted Engine deployment failed Log file is located at /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20170117124637-923bru.log hosted-engine deploy on lago-he-basic-suite-4-0-host0 failed with status 1. + RET_CODE=1 + '[' 1 -ne 0 ']' + echo 'hosted-engine deploy on lago-he-basic-suite-4-0-host0 failed with status 1.' + exit 1 + RET_CODE=1 + '[' 1 -ne 0 ']' + echo 'hosted-engine setup on lago-he-basic-suite-4-0-host0 failed with status 1.' hosted-engine setup on lago-he-basic-suite-4-0-host0 failed with status 1. 
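The actual failure of this run is right above: the deployment requests `model_SandyBridge`, but the (nested) host only exposes Westmere and older families, so `hosted-engine --deploy` aborts during environment customization. One possible workaround sketch is to downgrade the requested CPU model in the answers file to one the host actually lists. The key name `OVEHOSTED_VDSM/cpu` below is an assumption for illustration; check the generated answers file (`/var/lib/ovirt-hosted-engine-setup/answers/...`) for the real key.

```shell
# Hypothetical answers file for illustration only; the real key may differ
answers=$(mktemp)
echo 'OVEHOSTED_VDSM/cpu=str:model_SandyBridge' > "$answers"

# Replace the unsupported model with one the host reported as supported
# (model_Westmere was the newest family listed in the log above)
sed -i 's/model_SandyBridge/model_Westmere/' "$answers"
cat "$answers"
```

Whether the fix belongs in the suite's answer-file template or in the appliance defaults depends on where `model_SandyBridge` is injected, which the log does not show.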
+ exit 1 + res=1 + exit 1 + cleanup /dev/shm/ost/deployment-he-basic-suite-4.0 + local run_path=/dev/shm/ost/deployment-he-basic-suite-4.0 + echo 'suite.sh: moving artifacts' suite.sh: moving artifacts + rm -rf exported-artifacts + mkdir -p exported-artifacts + [[ -d /dev/shm/ost/deployment-he-basic-suite-4.0/current/logs ]] + mv /dev/shm/ost/deployment-he-basic-suite-4.0/current/logs exported-artifacts/lago_logs + find /dev/shm/ost/deployment-he-basic-suite-4.0 -iname 'nose*.xml' -exec mv '{}' exported-artifacts/ ';' + [[ -d test_logs ]] + [[ -e failure_msg.txt ]] + mv failure_msg.txt exported-artifacts/ + ./run_suite.sh -o /dev/shm/ost/deployment-he-basic-suite-4.0 --cleanup he-basic-suite-4.0 + CLI=lago + DO_CLEANUP=false + RECOMMENDED_RAM_IN_MB=8196 + EXTRA_SOURCES=() ++ getopt -o ho:e:n:b:cs:r: --long help,output:,engine:,node:,boot-iso:,cleanup --long extra-rpm-source,reposync-config: -n run_suite.sh -- -o /dev/shm/ost/deployment-he-basic-suite-4.0 --cleanup he-basic-suite-4.0 + options=' -o '\''/dev/shm/ost/deployment-he-basic-suite-4.0'\'' --cleanup -- '\''he-basic-suite-4.0'\''' + [[ 0 != \0 ]] + eval set -- ' -o '\''/dev/shm/ost/deployment-he-basic-suite-4.0'\'' --cleanup -- '\''he-basic-suite-4.0'\''' ++ set -- -o /dev/shm/ost/deployment-he-basic-suite-4.0 --cleanup -- he-basic-suite-4.0 + true + case $1 in ++ realpath /dev/shm/ost/deployment-he-basic-suite-4.0 + PREFIX=/dev/shm/ost/deployment-he-basic-suite-4.0 + shift 2 + true + case $1 in + DO_CLEANUP=true + shift + true + case $1 in + shift + break + [[ -z he-basic-suite-4.0 ]] ++ realpath he-basic-suite-4.0 + export SUITE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/he-basic-suite-4.0> + SUITE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/he-basic-suite-4.0> + '[' -z /dev/shm/ost/deployment-he-basic-suite-4.0 ']' + true + env_cleanup + echo '#########################' ######################### + local res=0 + local uuid + echo 
'======== Cleaning up' ======== Cleaning up + [[ -e /dev/shm/ost/deployment-he-basic-suite-4.0 ]] + echo '----------- Cleaning with lago' ----------- Cleaning with lago + lago --workdir /dev/shm/ost/deployment-he-basic-suite-4.0 destroy --yes --all-prefixes current session does not belong to lago group. @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:00) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:00) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 318 seconds =================================== logout Finish: shell @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@ Tue Jan 17 17:47:29 UTC 2017 automation/he_basic_suite_4.0.sh chroot finished @@ took 414 seconds @@ rc = 1 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ========== Scrubbing chroot mock \ --configdir="<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests"> \ --root="mocker-epel-7-x86_64.el7" \ --resultdir="./mock_logs.LEwg3iWs/mocker-epel-7-x86_64.el7.scrub" \ --scrub=chroot WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.21 starting (python version = 3.4.3)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Start: scrub ['chroot'] INFO: scrubbing chroot for mocker-epel-7-x86_64.el7 Finish: scrub ['chroot'] Finish: run Scrub chroot took 6 seconds ============================ ########################################################## ## Tue Jan 17 17:47:35 UTC 2017 Finished env: el7:epel-7-x86_64 ## took 420 seconds ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log entries: ./mock_logs.LEwg3iWs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log ##! @ Cleanup prefix: # Stop prefix: # Stop prefix: * Stop vms: * Stop vms: Success (in 0:00:00) * Stop nets: * Stop nets: Success (in 0:00:00) # Stop prefix: Success (in 0:00:00) # Tag prefix as uninitialized: # Tag prefix as uninitialized: Success (in 0:00:00) @ Cleanup prefix: Success (in 0:00:00) + echo '----------- Cleaning with lago done' ----------- Cleaning with lago done + [[ 0 != \0 ]] + echo '======== Cleanup done' ======== Cleanup done + exit 0 + exit Took 318 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## Collecting mock logs ‘./mock_logs.LEwg3iWs/mocker-epel-7-x86_64.el7.clean_rpmdb’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb’ ‘./mock_logs.LEwg3iWs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_4.0.sh’ ‘./mock_logs.LEwg3iWs/mocker-epel-7-x86_64.el7.init’ -> ‘exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init’ ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... 
Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=4.0 SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson8948629729969729385.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=4.0 + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/> + OVIRT_SUITE=4.0 + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/638/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/638/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/failure_msg.txt> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts/mock_logs> <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/638/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Recording test results ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error? Archiving artifacts
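Note an unrelated bug visible in the collect-logs xtrace above: `SUITE_TYPE` is empty yet `OVIRT_SUITE` comes out as `4.0`, because in `OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"` bash parses `$SUITE_TYPE_suite_` as a single (unset) variable name, leaving only `$VERSION`. A minimal sketch of the pitfall and the braced fix (the `he_basic` value is made up for illustration):

```shell
SUITE_TYPE=he_basic
VERSION=4.0

# Without braces, bash expands the unset variable $SUITE_TYPE_suite_ to ""
bad="$SUITE_TYPE_suite_$VERSION"
# With braces, the intended name is assembled from its parts
good="${SUITE_TYPE}_suite_$VERSION"

echo "$bad"    # only $VERSION survives
echo "$good"
```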

participants (6)
- Barak Korren
- Evgheni Dereveanchin
- jenkins@jenkins.phx.ovirt.org
- Lev Veyde
- Milan Zamazal
- Yaniv Kaul