Build failed in Jenkins: ovirt_master_system-tests #243

See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/changes>

Changes:

[Eyal Edri] add hystrix deps to yum repos include list

[Eyal Edri] refresh fedora versions and release versions for ovirt-engine

[Sandro Bonazzola] ovirt-engine_upgrade-db: drop 3.6.7 jobs

[Shirly Radco] Replacing jpackage repo for 3.6 dwh

------------------------------------------
[...truncated 485 lines...]
##  rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log enties: logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log
##!
  File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start
    prefix.start(vm_names=vm_names)
  File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start
    self.virt_env.start(vm_names=vm_names)
  File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start
    vm.start()
  File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start
    return self.provider.start(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start
    self.libvirt_con.createXML(self._libvirt_xml())
  File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML
    if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self)
libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
#########################
======== Cleaning up
----------- Cleaning with lago
----------- Cleaning with lago done
======== Cleanup done
Took 197 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'

#
# Required jjb vars:
#    version
#
VERSION=master
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi

[ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson703448189995999079.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=master
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/>
+ OVIRT_SUITE=master
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -xe
echo "shell-scripts/mock_cleanup.sh"

shopt -s nullglob

WORKSPACE="$PWD"

# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
EOC

# Archive the logs, we want them anyway
logs=(
    ./*log
    ./*/logs
)
if [[ "$logs" ]]; then
    tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
    rm -rf "${logs[@]}"
fi

# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"

    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }

    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        sudo umount --lazy "$mount" \
        || {
            echo "ERROR: Failed to umount $mount."
            failed=true
        }
    done
done

# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
             "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        sudo umount --lazy "$mount" \
        || {
            echo "ERROR: Failed to umount $mount."
            failed=true
            this_chroot_failed=true
        }
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done

if $failed; then
    echo "Aborting."
    exit 1
fi

# remove mock system cache, we will setup proxies to do the caching and this
# takes lots of space between runs
shopt -u nullglob
sudo rm -Rf /var/cache/mock/*

# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"

[ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson505872539550784673.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n ./ovirt-system-tests/logs ]]
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
./ovirt-system-tests/logs/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log
+ rm -rf ./ovirt-system-tests/logs
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg
+ mock_root=mocker-fedora-23-x86_64.fc23
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.18 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg>
+ mock_root=fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ [[ -n fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ :
+ [[ -n '' ]]
+ false
+ shopt -u nullglob
+ sudo rm -Rf /var/cache/mock/fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts

This looks like a bug in libvirt? Tolik mentioned something about a socket name being too long; has anyone seen this before?

15:37:11 libvirt: XML-RPC error : Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
15:37:11   * Starting VM lago_basic_suite_master_storage: ERROR (in 0:00:00)
15:37:11 # Start vms: ERROR (in 0:00:00)
15:37:11 # Destroy network lago_basic_suite_master_lago:
15:37:11 # Destroy network lago_basic_suite_master_lago: ERROR (in 0:00:00)
15:37:11 @ Start Prefix: ERROR (in 0:00:00)
15:37:11 Error occured, aborting
15:37:11 Traceback (most recent call last):
15:37:11   File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 691, in main
15:37:11     cli_plugins[args.verb].do_run(args)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 180, in do_run
15:37:11     self._do_run(**vars(args))
15:37:11   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 488, in wrapper
15:37:11     return func(*args, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 499, in wrapper
15:37:11     return func(*args, prefix=prefix, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start
15:37:11     prefix.start(vm_names=vm_names)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start
15:37:11     self.virt_env.start(vm_names=vm_names)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start
15:37:11     vm.start()
15:37:11   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start
15:37:11     return self.provider.start(*args, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start
15:37:11     self.libvirt_con.createXML(self._libvirt_xml())
15:37:11   File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML
15:37:11     if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self)
15:37:11 libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
15:37:11 #########################

On Thu, Jul 7, 2016 at 6:37 PM, <jenkins@jenkins.phx.ovirt.org> wrote:
See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/changes>
_______________________________________________
Infra mailing list
Infra@ovirt.org
http://lists.ovirt.org/mailman/listinfo/infra
--
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)
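[Editor's note: for anyone hitting the same traceback, a quick first check is whether the socket libvirt is complaining about exists at all on the hypervisor. A minimal sketch; the helper name and the argument handling are illustrative, not part of any script in this thread, and /var/run/libvirt/virtlogd-sock is assumed to be the default path on Fedora/EL hosts:]

```shell
# Hypothetical helper: report whether the virtlogd socket libvirt needs
# is actually present on this host. Accepts an alternate path for testing;
# defaults to the path from the traceback above.
check_virtlogd_sock() {
    local sock="${1:-/var/run/libvirt/virtlogd-sock}"
    if [[ -S "$sock" ]]; then
        echo "virtlogd socket present: $sock"
    else
        echo "virtlogd socket missing: $sock"
        return 1
    fi
}
```

If it reports the socket as missing, looking at the virtlogd service status on the host is the next step.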

Seems like [1]: ovirt-srv19 has a fresh new FC24 installation, and virtlogd is not enabled by default:

● virtlogd.service - Virtual machine log manager
   Loaded: loaded (/usr/lib/systemd/system/virtlogd.service; indirect; vendor preset: disabled)
   Active: inactive (dead)
     Docs: man:virtlogd(8)
           http://libvirt.org

We can add it to puppet for now.

[1] https://bugzilla.redhat.com/show_bug.cgi?id=1290357

On Thu, Jul 7, 2016 at 6:49 PM, Eyal Edri <eedri@redhat.com> wrote:
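[Editor's note: a sketch of the one-off manual remediation on a systemd host such as FC24, run as root, until the puppet change lands. These exact commands are not quoted from the thread, so treat them as an illustration rather than the change that was actually deployed:]

```shell
# Enable and start virtlogd on the hypervisor. Enabling virtlogd.socket as
# well covers the socket-activation path (the unit above is loaded
# "indirect", i.e. normally started via its socket).
systemctl enable --now virtlogd.socket virtlogd.service

# The socket lago/libvirt failed to reach should now exist:
test -S /var/run/libvirt/virtlogd-sock && echo "virtlogd socket is up"
```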
This looks like a bug in libvirt? Tolik mentioned something in a socket name which is too long, anyone seen it before?
15:37:11 libvirt: XML-RPC error : Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory 15:37:11 * Starting VM lago_basic_suite_master_storage: ERROR (in 0:00:00) 15:37:11 # Start vms: ERROR (in 0:00:00) 15:37:11 # Destroy network lago_basic_suite_master_lago: 15:37:11 # Destroy network lago_basic_suite_master_lago: ERROR (in 0:00:00) 15:37:11 @ Start Prefix: ERROR (in 0:00:00) 15:37:11 Error occured, aborting 15:37:11 Traceback (most recent call last): 15:37:11 File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 691, in main 15:37:11 cli_plugins[args.verb].do_run(args) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 180, in do_run 15:37:11 self._do_run(**vars(args)) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/utils.py", line 488, in wrapper 15:37:11 return func(*args, **kwargs) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/utils.py", line 499, in wrapper 15:37:11 return func(*args, prefix=prefix, **kwargs) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start 15:37:11 prefix.start(vm_names=vm_names) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start 15:37:11 self.virt_env.start(vm_names=vm_names) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start 15:37:11 vm.start() 15:37:11 File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start 15:37:11 return self.provider.start(*args, **kwargs) 15:37:11 File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start 15:37:11 self.libvirt_con.createXML(self._libvirt_xml()) 15:37:11 File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML 15:37:11 if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self) 15:37:11 libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory 15:37:11 #########################
On Thu, Jul 7, 2016 at 6:37 PM, <jenkins@jenkins.phx.ovirt.org> wrote:
See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/changes>
Changes:
[Eyal Edri] add hystrix deps to yum repos include list
[Eyal Edri] refresh fedora versions and release versions for ovirt-engine
[Sandro Bonazzola] ovirt-engine_upgrade-db: drop 3.6.7 jobs
[Shirly Radco] Replacing jpackage repo for 3.6 dwh
------------------------------------------ [...truncated 485 lines...] ## rc = 1 ########################################################## ##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv ##! Last 20 log enties: logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log ##! File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start prefix.start(vm_names=vm_names) File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start self.virt_env.start(vm_names=vm_names) File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start vm.start() File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start return self.provider.start(*args, **kwargs) File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start self.libvirt_con.createXML(self._libvirt_xml()) File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self) libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory ######################### ======== Cleaning up ----------- Cleaning with lago ----------- Cleaning with lago done ======== Cleanup done Took 197 seconds =================================== ##! ##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ##!######################################################## ########################################################## Build step 'Execute shell' marked build as failure Performing Post build task... Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh'
# # Required jjb vars: # version # VERSION=master SUITE_TYPE=
WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi
[ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson703448189995999079.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=master + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> + OVIRT_SUITE=master + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### EOC
# Archive the logs, we want them anyway logs=( ./*log ./*/logs ) if [[ "$logs" ]]; then tar cvzf exported-artifacts/logs.tgz "${logs[@]}" rm -rf "${logs[@]}" fi
# stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." failed=true }
mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true } done done
# Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true this_chroot_failed=true } done if ! $this_chroot_failed; then sudo rm -rf "$mock_root" fi done
if $failed; then echo "Aborting." exit 1 fi
# remove mock system cache, we will setup proxies to do the caching and this # takes lots of space between runs shopt -u nullglob sudo rm -Rf /var/cache/mock/*
# restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE"
[ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson505872539550784673.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
#                                                                     #
#                               CLEANUP                               #
#                                                                     #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n ./ovirt-system-tests/logs ]]
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
./ovirt-system-tests/logs/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log
+ rm -rf ./ovirt-system-tests/logs
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg
+ mock_root=mocker-fedora-23-x86_64.fc23
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.18 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg>
+ mock_root=fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ [[ -n fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ :
+ [[ -n '' ]]
+ false
+ shopt -u nullglob
+ sudo rm -Rf /var/cache/mock/fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts
_______________________________________________
Infra mailing list
Infra@ovirt.org
http://lists.ovirt.org/mailman/listinfo/infra
--
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)

Thanks for the analysis, Nadav!

I rebuilt srv19/20/21/22 to FC24 today and just enabled virtlogd on these boxes manually to avoid further job failures.

Regards,
Evgheni Dereveanchin

----- Original Message -----
From: "Nadav Goldin" <ngoldin@redhat.com>
To: "Eyal Edri" <eedri@redhat.com>
Cc: "infra" <infra@ovirt.org>, "Martin Perina" <mperina@redhat.com>, "Yaniv Kaul" <ykaul@redhat.com>, "Tolik Litovsky" <tlitovsk@redhat.com>
Sent: Thursday, 7 July, 2016 6:00:35 PM
Subject: Re: Build failed in Jenkins: ovirt_master_system-tests #243

Seems like [1]: ovirt-srv19 has a fresh FC24 installation, and virtlogd is not enabled by default:

● virtlogd.service - Virtual machine log manager
   Loaded: loaded (/usr/lib/systemd/system/virtlogd.service; indirect; vendor preset: disabled)
   Active: inactive (dead)
     Docs: man:virtlogd(8)
           http://libvirt.org

We can add it to puppet for now.

[1] https://bugzilla.redhat.com/show_bug.cgi?id=1290357

On Thu, Jul 7, 2016 at 6:49 PM, Eyal Edri <eedri@redhat.com> wrote:
This looks like a bug in libvirt? Tolik mentioned something about a socket name being too long; has anyone seen this before?
15:37:11 libvirt: XML-RPC error : Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
15:37:11   * Starting VM lago_basic_suite_master_storage: ERROR (in 0:00:00)
15:37:11 # Start vms: ERROR (in 0:00:00)
15:37:11 # Destroy network lago_basic_suite_master_lago:
15:37:11 # Destroy network lago_basic_suite_master_lago: ERROR (in 0:00:00)
15:37:11 @ Start Prefix: ERROR (in 0:00:00)
15:37:11 Error occured, aborting
15:37:11 Traceback (most recent call last):
15:37:11   File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 691, in main
15:37:11     cli_plugins[args.verb].do_run(args)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 180, in do_run
15:37:11     self._do_run(**vars(args))
15:37:11   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 488, in wrapper
15:37:11     return func(*args, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 499, in wrapper
15:37:11     return func(*args, prefix=prefix, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start
15:37:11     prefix.start(vm_names=vm_names)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start
15:37:11     self.virt_env.start(vm_names=vm_names)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start
15:37:11     vm.start()
15:37:11   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start
15:37:11     return self.provider.start(*args, **kwargs)
15:37:11   File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start
15:37:11     self.libvirt_con.createXML(self._libvirt_xml())
15:37:11   File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML
15:37:11     if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self)
15:37:11 libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
15:37:11 #########################
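For reference, the manual fix described in this thread (enabling virtlogd on freshly installed FC24 hosts) can be scripted. This is only a sketch: the helper name `ensure_virtlogd` and its flow are illustrative, not taken from any oVirt script.

```shell
#!/bin/bash
# Sketch of the manual fix: make sure virtlogd is running before
# lago/libvirt try to start VMs. On a fresh FC24 host virtlogd.service
# is preset to "indirect" (socket-activated) and may be inactive, which
# makes libvirt fail with:
#   Failed to connect socket to '/var/run/libvirt/virtlogd-sock'
ensure_virtlogd() {
    local state
    state="$(systemctl is-active virtlogd 2>/dev/null)" || :
    if [[ "$state" != "active" ]]; then
        echo "virtlogd is '${state:-unknown}', enabling it"
        # Enable both the socket (for activation) and the service itself
        sudo systemctl enable --now virtlogd.socket virtlogd.service
    fi
}
```

Running `ensure_virtlogd` once on an affected host should make `/var/run/libvirt/virtlogd-sock` available; the longer-term fix is enforcing this via puppet, as suggested below.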
On Thu, Jul 7, 2016 at 6:37 PM, <jenkins@jenkins.phx.ovirt.org> wrote:

See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/243/changes>

On Thu, Jul 07, 2016 at 07:00:35PM +0300, Nadav Goldin wrote:
Seems like [1]: ovirt-srv19 has a fresh FC24 installation, and virtlogd is not enabled by default:

● virtlogd.service - Virtual machine log manager
   Loaded: loaded (/usr/lib/systemd/system/virtlogd.service; indirect; vendor preset: disabled)
   Active: inactive (dead)
     Docs: man:virtlogd(8)
           http://libvirt.org

We can add it to puppet for now.
Francesco, shouldn't vdsm require virtlogd explicitly?

----- Original Message -----
From: "Dan Kenigsberg" <danken@redhat.com>
To: "Nadav Goldin" <ngoldin@redhat.com>, fromani@redhat.com
Cc: "Eyal Edri" <eedri@redhat.com>, "Nir Soffer" <nsoffer@redhat.com>, "Yaniv Kaul" <ykaul@redhat.com>, "Martin Perina" <mperina@redhat.com>, "Tolik Litovsky" <tlitovsk@redhat.com>, "infra" <infra@ovirt.org>
Sent: Monday, July 11, 2016 10:12:48 PM
Subject: Re: Build failed in Jenkins: ovirt_master_system-tests #243
On Thu, Jul 07, 2016 at 07:00:35PM +0300, Nadav Goldin wrote:
Seems like [1]: ovirt-srv19 has a fresh FC24 installation, and virtlogd is not enabled by default:

● virtlogd.service - Virtual machine log manager
   Loaded: loaded (/usr/lib/systemd/system/virtlogd.service; indirect; vendor preset: disabled)
   Active: inactive (dead)
     Docs: man:virtlogd(8)
           http://libvirt.org

We can add it to puppet for now.
Francesco, shouldn't vdsm require virtlogd explicitly?
Disclaimer: I haven't read the rest of the thread yet.

We chose not to, trying to prevent issues: https://gerrit.ovirt.org/#/c/55189/1
Related-To: https://bugzilla.redhat.com/show_bug.cgi?id=1318902

We want to re-enable and require it in the near future: https://bugzilla.redhat.com/show_bug.cgi?id=1321010

HTH,

--
Francesco Romani
RedHat Engineering Virtualization R & D
Phone: 8261328
IRC: fromani
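As a quick way to verify this point on a given host, one can check whether the installed vdsm package declares any RPM dependency on virtlogd. A small illustrative helper (the function is mine, not part of vdsm or its packaging):

```shell
# Illustrative check for the packaging question above: does a package's
# RPM "Requires" listing mention virtlogd at all? Feed it the output of
# `rpm -q --requires <package>`. Per the gerrit change referenced above,
# vdsm currently should NOT match.
requires_virtlogd() {
    grep -q 'virtlogd'
}

# Example usage on a real host (not run here):
#   rpm -q --requires vdsm | requires_virtlogd && echo "vdsm requires virtlogd"
```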

On Tue, Jul 12, 2016 at 10:12 AM, Francesco Romani <fromani@redhat.com> wrote:
----- Original Message -----
From: "Dan Kenigsberg" <danken@redhat.com>
To: "Nadav Goldin" <ngoldin@redhat.com>, fromani@redhat.com
Cc: "Eyal Edri" <eedri@redhat.com>, "Nir Soffer" <nsoffer@redhat.com>, "Yaniv Kaul" <ykaul@redhat.com>, "Martin Perina" <mperina@redhat.com>, "Tolik Litovsky" <tlitovsk@redhat.com>, "infra" <infra@ovirt.org>
Sent: Monday, July 11, 2016 10:12:48 PM
Subject: Re: Build failed in Jenkins: ovirt_master_system-tests #243
Francesco, shouldn't vdsm require virtlogd explicitly?
Disclaimer: I didn't read yet the rest of the thread.
We choose not to, trying to prevent issues: https://gerrit.ovirt.org/#/c/55189/1 Related-To: https://bugzilla.redhat.com/show_bug.cgi?id=1318902
I thought and still do it's quite a mistake - if we think a feature of our friends from libvirt is immature - we should tell them that, hopefully accompanied by bugs. I'd like to believe they don't enable features by default unless they believe the features are mature and ready for general consumption. If that's not the case, we should talk with them. Y.
We want to reenable and require it in the near future: https://bugzilla.redhat.com/show_bug.cgi?id=1321010
HTH,
-- Francesco Romani RedHat Engineering Virtualization R & D Phone: 8261328 IRC: fromani

On 12 Jul 2016, at 09:19, Yaniv Kaul <ykaul@redhat.com> wrote:

> On Tue, Jul 12, 2016 at 10:12 AM, Francesco Romani <fromani@redhat.com> wrote:
> > ----- Original Message -----
> > > From: "Dan Kenigsberg" <danken@redhat.com>
> > > To: "Nadav Goldin" <ngoldin@redhat.com>, fromani@redhat.com
> > > Cc: "Eyal Edri" <eedri@redhat.com>, "Nir Soffer" <nsoffer@redhat.com>, "Yaniv Kaul" <ykaul@redhat.com>, "Martin Perina" <mperina@redhat.com>, "Tolik Litovsky" <tlitovsk@redhat.com>, "infra" <infra@ovirt.org>
> > > Sent: Monday, July 11, 2016 10:12:48 PM
> > > Subject: Re: Build failed in Jenkins: ovirt_master_system-tests #243
> > >
> > > On Thu, Jul 07, 2016 at 07:00:35PM +0300, Nadav Goldin wrote:
> > > > Seems like [1], as ovirt-srv19 has a fresh new FC24 installation, virtlogd is not enabled by default:
> > > > ● virtlogd.service - Virtual machine log manager
> > > >    Loaded: loaded (/usr/lib/systemd/system/virtlogd.service; indirect; vendor preset: disabled)
> > > >    Active: inactive (dead)
> > > >      Docs: man:virtlogd(8)
> > > >            http://libvirt.org
> > > > we can add it to puppet for now.
> > > >
> > > > [1] https://bugzilla.redhat.com/show_bug.cgi?id=1290357
> > >
> > > Francesco, shouldn't vdsm require virtlogd explicitly?
> >
> > Disclaimer: I didn't read the rest of the thread yet.
> >
> > We chose not to, trying to prevent issues:
> > https://gerrit.ovirt.org/#/c/55189/1
> > Related-To: https://bugzilla.redhat.com/show_bug.cgi?id=1318902
>
> I thought and still do it's quite a mistake - if we think a feature of our friends from libvirt is immature, we should tell them that, hopefully accompanied by bugs.

It's not so much the feature itself, rather its integration into OSes. I think there was enough feedback; a random Google search shows a load of issues reported at the time this dependency was introduced, so I hope they learned from it.

> I'd like to believe they don't enable features by default unless they believe the features are mature and ready for general consumption. If that's not the case, we should talk with them.

IMO it was a bit hasty on their part, and we just didn't have time to resolve all the virtlogd daemon management in RHEL and RHEVH in time.

Thanks,
michal

> Y.
>
> > We want to re-enable and require it in the near future:
> > https://bugzilla.redhat.com/show_bug.cgi?id=1321010
> >
> > HTH,
> >
> > --
> > Francesco Romani
> > RedHat Engineering Virtualization R & D
> > Phone: 8261328
> > IRC: fromani

On 11 July 2016 at 23:12, Dan Kenigsberg <danken@redhat.com> wrote:
> Francesco, shouldn't vdsm require virtlogd explicitly?

This is unrelated to this thread. The discussion here is about the configuration of the Lago hosts; they are not running vdsm...

--
Barak Korren
bkorren@redhat.com
RHEV-CI Team

See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/244/changes>

Changes:

[Eyal Edri] add hystrix deps to yum repos include list

------------------------------------------
[...truncated 555 lines...]
## took 2222 seconds
## rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log enties: logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log
##! current session does not belong to lago group.
@ Collect artifacts:
  # [Thread-1] lago_basic_suite_master_storage:
  # [Thread-2] lago_basic_suite_master_host0:
  # [Thread-3] lago_basic_suite_master_engine:
  # [Thread-4] lago_basic_suite_master_host1:
  # [Thread-2] lago_basic_suite_master_host0: Success (in 0:00:15)
  # [Thread-3] lago_basic_suite_master_engine: Success (in 0:00:15)
  # [Thread-1] lago_basic_suite_master_storage: Success (in 0:00:15)
  # [Thread-4] lago_basic_suite_master_host1: Success (in 0:00:15)
@ Collect artifacts: Success (in 0:00:15)
<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests>
@@@@ ERROR: Failed running <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/basic_suite_master/test-scenarios/002_bootstrap.py>
#########################
======== Cleaning up
----------- Cleaning with lago
----------- Cleaning with lago done
======== Cleanup done
Took 2045 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=master SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson7396354016037072661.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=master + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> + OVIRT_SUITE=master + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_master_system-tests/244/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_master_system-tests/244/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/nosetests-001_initialize_engine.py.xml> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/nosetests-002_bootstrap.py.xml> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/test_logs> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/244/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo "shell-scripts/mock_cleanup.sh" shopt -s nullglob 
WORKSPACE="$PWD" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### EOC # Archive the logs, we want them anyway logs=( ./*log ./*/logs ) if [[ "$logs" ]]; then tar cvzf exported-artifacts/logs.tgz "${logs[@]}" rm -rf "${logs[@]}" fi # stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE" #TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." failed=true } mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true } done done # Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." 
fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true this_chroot_failed=true } done if ! $this_chroot_failed; then sudo rm -rf "$mock_root" fi done if $failed; then echo "Aborting." exit 1 fi # remove mock system cache, we will setup proxies to do the caching and this # takes lots of space between runs shopt -u nullglob sudo rm -Rf /var/cache/mock/* # restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE" [ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson285553616057172153.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> + cat _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + logs=(./*log ./*/logs) + [[ -n ./ovirt-system-tests/logs ]] + tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs ./ovirt-system-tests/logs/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log + rm -rf ./ovirt-system-tests/logs + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-fedora-23-x86_64.fc23.cfg + mock_root=mocker-fedora-23-x86_64.fc23 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-fedora-23-x86_64.fc23' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.18 starting (python version = 3.5.1)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> + mock_root=fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + [[ -n fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + : + [[ -n '' ]] + false + shopt -u nullglob + sudo rm -Rf /var/cache/mock/fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results Archiving artifacts
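The #244 failure ("current session does not belong to lago group") is a host-setup issue rather than a test regression. A small sketch (hypothetical, not from the CI scripts) of the membership check lago is effectively making:

```shell
#!/bin/bash
# Sketch: verify the invoking session is in the 'lago' group before
# running the suite, mirroring the error message in the build log.

in_group() {
    # List the session's groups one per line and require an exact match.
    id -nG | tr ' ' '\n' | grep -qx "$1"
}

if ! in_group lago; then
    echo "current session does not belong to lago group" >&2
    # Common fix (as root), then start a fresh login session so the
    # new group membership takes effect; 'jenkins' is the slave user
    # seen in the log above:
    #   usermod -a -G lago jenkins
fi
```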

See <http://jenkins.ovirt.org/job/ovirt_master_system-tests/245/changes>

Changes:

[Eyal Edri] add hystrix deps to yum repos include list

------------------------------------------
[...truncated 42810 lines...]
## rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log enties: logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log
##!
  File "/usr/lib/python2.7/site-packages/lago/cmd.py", line 255, in do_start
    prefix.start(vm_names=vm_names)
  File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 958, in start
    self.virt_env.start(vm_names=vm_names)
  File "/usr/lib/python2.7/site-packages/lago/virt.py", line 182, in start
    vm.start()
  File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py", line 247, in start
    return self.provider.start(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/lago/vm.py", line 93, in start
    self.libvirt_con.createXML(self._libvirt_xml())
  File "/usr/lib64/python2.7/site-packages/libvirt.py", line 3611, in createXML
    if ret is None:raise libvirtError('virDomainCreateXML() failed', conn=self)
libvirtError: Failed to connect socket to '/var/run/libvirt/virtlogd-sock': No such file or directory
#########################
======== Cleaning up
----------- Cleaning with lago
----------- Cleaning with lago done
======== Cleanup done
Took 174 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo 'shell_scripts/system_tests.collect_logs.sh' # # Required jjb vars: # version # VERSION=master SUITE_TYPE= WORKSPACE="$PWD" OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION" TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" rm -rf "$WORKSPACE/exported-artifacts" mkdir -p "$WORKSPACE/exported-artifacts" if [[ -d "$TESTS_LOGS" ]]; then mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" fi [ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson2352263519573252990.sh + echo shell_scripts/system_tests.collect_logs.sh shell_scripts/system_tests.collect_logs.sh + VERSION=master + SUITE_TYPE= + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> + OVIRT_SUITE=master + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> + rm -rf <http://jenkins.ovirt.org/job/ovirt_master_system-tests/245/artifact/exported-artifacts> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_master_system-tests/245/artifact/exported-artifacts> + [[ -d <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts> ]] + mv <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/exported-artifacts/lago_logs> <http://jenkins.ovirt.org/job/ovirt_master_system-tests/245/artifact/exported-artifacts/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 0 Match found for :.* : True Logical operation result is TRUE Running script : #!/bin/bash -xe echo "shell-scripts/mock_cleanup.sh" shopt -s nullglob WORKSPACE="$PWD" # Make clear this is the cleanup, helps reading the jenkins logs cat <<EOC _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### EOC # Archive the logs, we want them anyway logs=( ./*log 
./*/logs ) if [[ "$logs" ]]; then tar cvzf exported-artifacts/logs.tgz "${logs[@]}" rm -rf "${logs[@]}" fi # stop any processes running inside the chroot failed=false mock_confs=("$WORKSPACE"/*/mocker*) # Clean current jobs mockroot if any for mock_conf_file in "${mock_confs[@]}"; do [[ "$mock_conf_file" ]] || continue echo "Cleaning up mock $mock_conf" mock_root="${mock_conf_file##*/}" mock_root="${mock_root%.*}" my_mock="/usr/bin/mock" my_mock+=" --configdir=${mock_conf_file%/*}" my_mock+=" --root=${mock_root}" my_mock+=" --resultdir=$WORKSPACE" #TODO: investigate why mock --clean fails to umount certain dirs sometimes, #so we can use it instead of manually doing all this. echo "Killing all mock orphan processes, if any." $my_mock \ --orphanskill \ || { echo "ERROR: Failed to kill orphans on $chroot." failed=true } mock_root="$(\ grep \ -Po "(?<=config_opts\['root'\] = ')[^']*" \ "$mock_conf_file" \ )" || : [[ "$mock_root" ]] || continue mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $chroot. Trying to umount." fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true } done done # Clean any leftover chroot from other jobs for mock_root in /var/lib/mock/*; do this_chroot_failed=false mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || : if [[ "$mounts" ]]; then echo "Found mounted dirs inside the chroot $mock_root." \ "Trying to umount." fi for mount in "${mounts[@]}"; do sudo umount --lazy "$mount" \ || { echo "ERROR: Failed to umount $mount." failed=true this_chroot_failed=true } done if ! $this_chroot_failed; then sudo rm -rf "$mock_root" fi done if $failed; then echo "Aborting." 
exit 1 fi # remove mock system cache, we will setup proxies to do the caching and this # takes lots of space between runs shopt -u nullglob sudo rm -Rf /var/cache/mock/* # restore the permissions in the working dir, as sometimes it leaves files # owned by root and then the 'cleanup workspace' from jenkins job fails to # clean and breaks the jobs sudo chown -R "$USER" "$WORKSPACE" [ovirt_master_system-tests] $ /bin/bash -xe /tmp/hudson7329670288782931639.sh + echo shell-scripts/mock_cleanup.sh shell-scripts/mock_cleanup.sh + shopt -s nullglob + WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> + cat _______________________________________________________________________ ####################################################################### # # # CLEANUP # # # ####################################################################### + logs=(./*log ./*/logs) + [[ -n ./ovirt-system-tests/logs ]] + tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs ./ovirt-system-tests/logs/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/ ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_master.sh/basic_suite_master.sh.log + rm -rf ./ovirt-system-tests/logs + failed=false + mock_confs=("$WORKSPACE"/*/mocker*) + for mock_conf_file in '"${mock_confs[@]}"' + [[ -n <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]] + echo 'Cleaning up mock ' Cleaning up mock + mock_root=mocker-fedora-23-x86_64.fc23.cfg + mock_root=mocker-fedora-23-x86_64.fc23 + my_mock=/usr/bin/mock + my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests'> + my_mock+=' --root=mocker-fedora-23-x86_64.fc23' + my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/'> + echo 'Killing all mock orphan processes, if any.' Killing all mock orphan processes, if any. + /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> --orphanskill WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/logging.ini.> Using default... INFO: mock.py version 1.2.18 starting (python version = 3.5.1)... 
Start: init plugins INFO: selinux enabled Finish: init plugins Start: run Finish: run ++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> + mock_root=fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + [[ -n fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a ]] + mounts=($(mount | awk '{print $3}' | grep "$mock_root")) ++ mount ++ awk '{print $3}' ++ grep fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + : + [[ -n '' ]] + false + shopt -u nullglob + sudo rm -Rf /var/cache/mock/fedora-23-x86_64-4ddbe48c8f8b7d8c2c3635b52313f04a + sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_master_system-tests/ws/> POST BUILD TASK : SUCCESS END OF POST BUILD TASK : 1 Recording test results ERROR: Step ?Publish JUnit test result report? failed: No test report files were found. Configuration error? Archiving artifacts

participants (9)

- Barak Korren
- Dan Kenigsberg
- Evgheni Dereveanchin
- Eyal Edri
- Francesco Romani
- jenkins@jenkins.phx.ovirt.org
- Michal Skrivanek
- Nadav Goldin
- Yaniv Kaul