oVirt infra daily report - unstable production jobs - 100
by jenkins@jenkins.phx.ovirt.org
Good morning!
Attached is the HTML page with the Jenkins status report. You can also see it here:
- http://jenkins.ovirt.org/job/system_jenkins-report/100//artifact/exported...
Cheers,
Jenkins
[Attachment: upstream_report.html]

RHEVM CI Jenkins Daily Report - 07/10/2016
9 Unstable Jobs (Production) - http://jenkins.ovirt.org/

- ovirt-appliance_master_build-artifacts-el7-x86_64
  http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
- ovirt-engine_master_upgrade-from-master_el7_merged
  http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_...
- ovirt-node-ng_master_build-artifacts-el7-x86_64
  http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86...
- ovirt-node-ng_ovirt-4.0-snapshot_build-artifacts-el7-x86_64
  http://jenkins.ovirt.org/job/ovirt-node-ng_ovirt-4.0-snapshot_build-artif...
- ovirt-node-ng_ovirt-4.0_build-artifacts-el7-x86_64
  http://jenkins.ovirt.org/job/ovirt-node-ng_ovirt-4.0_build-artifacts-el7-...
- ovirt-release_master_build-artifacts-all-x86_64
  http://jenkins.ovirt.org/job/ovirt-release_master_build-artifacts-all-x86...
- ovirt_master_system-tests
  http://jenkins.ovirt.org/job/ovirt_master_system-tests/
- repos_master_check-closure_fc24_merged
  http://jenkins.ovirt.org/job/repos_master_check-closure_fc24_merged/
- vdsm_4.0_check-merged-fc23-x86_64
  http://jenkins.ovirt.org/job/vdsm_4.0_check-merged-fc23-x86_64/

All of these jobs are managed by Jenkins Job Builder: any manual change will be
lost on the next update. To make permanent changes, check out the jenkins repo
(http://gerrit.ovirt.org/gitweb?p=jenkins.git;a=tree;h=refs/heads/master;h...).
[JIRA] (OVIRT-755) primary DNS resolver in PHX not working for some domains
by Evgheni Dereveanchin (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-755?page=com.atlassian.jira... ]
Evgheni Dereveanchin reassigned OVIRT-755:
------------------------------------------
Assignee: Evgheni Dereveanchin (was: infra)
> primary DNS resolver in PHX not working for some domains
> --------------------------------------------------------
>
> Key: OVIRT-755
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-755
> Project: oVirt - virtualization made easy
> Issue Type: Bug
> Reporter: Evgheni Dereveanchin
> Assignee: Evgheni Dereveanchin
>
> As part of the network reorganization I published a zone for the new workers; after a day it is still not resolvable from Jenkins, so I can't make use of the new slaves.
> Here are the two DNS servers sent by DHCP in the PHX datacenter:
> nameserver 208.67.222.222
> nameserver 8.8.8.8
> The first is OpenDNS and the second is Google Public DNS. For some reason we do not use the BIND instance we have on the Foreman proxy, and the OpenDNS resolver fails for the new hostnames. We need to fix this.
--
This message was sent by Atlassian JIRA
(v1000.383.2#100014)
[JIRA] (OVIRT-755) primary DNS resolver in PHX not working for some domains
by Evgheni Dereveanchin (oVirt JIRA)
Evgheni Dereveanchin created OVIRT-755:
------------------------------------------
Summary: primary DNS resolver in PHX not working for some domains
Key: OVIRT-755
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-755
Project: oVirt - virtualization made easy
Issue Type: Bug
Reporter: Evgheni Dereveanchin
Assignee: infra
As part of the network reorganization I published a zone for the new workers; after a day it is still not resolvable from Jenkins, so I can't make use of the new slaves.
Here are the two DNS servers sent by DHCP in the PHX datacenter:
nameserver 208.67.222.222
nameserver 8.8.8.8
The first is OpenDNS and the second is Google Public DNS. For some reason we do not use the BIND instance we have on the Foreman proxy, and the OpenDNS resolver fails for the new hostnames. We need to fix this.
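A quick way to confirm which resolver is failing (a sketch, assuming dig from
bind-utils is installed; "worker01.phx.ovirt.org" is a placeholder, substitute
one of the new worker hostnames):

#!/bin/bash
# Query each DHCP-provided resolver directly and compare the answers.
host=worker01.phx.ovirt.org   # hypothetical name, for illustration only
for ns in 208.67.222.222 8.8.8.8; do
    echo "== $ns =="
    # +short prints only the answer section; empty output means the
    # resolver has no A record for the name
    dig +short "$host" @"$ns"
done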
--
This message was sent by Atlassian JIRA
(v1000.383.2#100014)
[JIRA] (OVIRT-754) Guest name 'TestNode-node' is already in use.
by sbonazzo (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-754?page=com.atlassian.jira... ]
sbonazzo commented on OVIRT-754:
--------------------------------
Rebased, +1; please verify

On Fri, Oct 7, 2016 at 9:30 AM, Fabian Deutsch <fdeutsch(a)redhat.com> wrote:
> Feel free to review https://gerrit.ovirt.org/#/c/64511/
>
> - fabian
>
> On Fri, Oct 7, 2016 at 9:26 AM, Sandro Bonazzola <sbonazzo(a)redhat.com>
> wrote:
>
>>
>> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86_64/138/
>>
>> ======================================================================
>> ERROR: test suite for <class 'testSanity.TestNode'>
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
>> self.setUp()
>> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in
>> setUp
>> self.setupContext(ancestor)
>> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in
>> setupContext
>> try_run(context, names)
>> File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in
>> try_run
>> return func()
>> File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifact
>> s-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 150, in setUpClass
>> 77)
>> File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifact
>> s-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 88, in _start_vm
>> dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
>> File "/home/jenkins/workspace/ovirt-node-ng_master_build-artifact
>> s-el7-x86_64/ovirt-node-ng/tests/virt.py", line 217, in create
>> dom = sh.virt_install(*args, **kwargs)
>> File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
>> return RunningCommand(cmd, call_args, stdin, stdout, stderr)
>> File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
>> self.wait()
>> File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
>> self.handle_command_exit_code(exit_code)
>> File "/usr/lib/python2.7/site-packages/sh.py", line 516, in
>> handle_command_exit_code
>> raise exc(self.ran, self.process.stdout, self.process.stderr)
>> ErrorReturnCode_1:
>>
>> RAN: '/bin/virt-install --import --print-xml
>> --network=user,model=virtio --noautoconsole --memory=2048 --rng=/dev/random
>> --memballoon=virtio --cpu=host --vcpus=4 --graphics=vnc
>> --watchdog=default,action=poweroff --serial=pty
>> --disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe --check=all=off
>> --channel=unix,target_type=virtio,name=local.test.0 --name=TestNode-node'
>>
>> STDOUT:
>>
>>
>> STDERR:
>> ERROR Guest name 'TestNode-node' is already in use.
>>
>> Seems to be a run in an unclean environment. Not sure what caused this.
>>
>>
>> --
>> Sandro Bonazzola
>> Better technology. Faster innovation. Powered by community collaboration.
>> See how it works at redhat.com
>> <https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
>>
>
>
>
> --
> Fabian Deutsch <fdeutsch(a)redhat.com>
> RHEV Hypervisor
> Red Hat
>
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
<https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
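For reference, a minimal slave-side cleanup sketch (an assumption on my part,
not necessarily what https://gerrit.ovirt.org/#/c/64511/ does): remove any
leftover test guest and its disk so virt-install can reuse the name:

#!/bin/bash
# Tear down a stale libvirt guest left behind by a previous run.
name=TestNode-node
if virsh dominfo "$name" >/dev/null 2>&1; then
    # destroy only works on running domains, so ignore its failure
    virsh destroy "$name" 2>/dev/null || true
    # drop the persistent definition if one remains
    virsh undefine "$name" 2>/dev/null || true
fi
# remove the disk image the test writes to /var/tmp
rm -f "/var/tmp/$name.qcow2"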
> Guest name 'TestNode-node' is already in use.
> ---------------------------------------------
>
> Key: OVIRT-754
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-754
> Project: oVirt - virtualization made easy
> Issue Type: By-EMAIL
> Reporter: sbonazzo
> Assignee: infra
>
> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86...
> ======================================================================
> ERROR: test suite for <class 'testSanity.TestNode'>
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
> self.setUp()
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in setUp
> self.setupContext(ancestor)
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in
> setupContext
> try_run(context, names)
> File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in try_run
> return func()
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
> line 150, in setUpClass
> 77)
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
> line 88, in _start_vm
> dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/virt.py",
> line 217, in create
> dom = sh.virt_install(*args, **kwargs)
> File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
> return RunningCommand(cmd, call_args, stdin, stdout, stderr)
> File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
> self.wait()
> File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
> self.handle_command_exit_code(exit_code)
> File "/usr/lib/python2.7/site-packages/sh.py", line 516, in
> handle_command_exit_code
> raise exc(self.ran, self.process.stdout, self.process.stderr)
> ErrorReturnCode_1:
> RAN: '/bin/virt-install --import --print-xml --network=user,model=virtio
> --noautoconsole --memory=2048 --rng=/dev/random --memballoon=virtio
> --cpu=host --vcpus=4 --graphics=vnc --watchdog=default,action=poweroff
> --serial=pty
> --disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe
> --check=all=off --channel=unix,target_type=virtio,name=local.test.0
> --name=TestNode-node'
> STDOUT:
> STDERR:
> ERROR Guest name 'TestNode-node' is already in use.
> Seems to be a run in an unclean environment. Not sure what caused this.
> --
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com
> <https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
--
This message was sent by Atlassian JIRA
(v1000.383.2#100014)
Build failed in Jenkins: ovirt_4.0_he-system-tests #358
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/358/>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on ovirt-srv08.phx.ovirt.org (phx physical integ-tests fc23) in workspace <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
Cloning the remote Git repository
Cloning repository git://gerrit.ovirt.org/ovirt-system-tests.git
> git init <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests> # timeout=10
Fetching upstream changes from git://gerrit.ovirt.org/ovirt-system-tests.git
> git --version # timeout=10
> git -c core.askpass=true fetch --tags --progress git://gerrit.ovirt.org/ovirt-system-tests.git +refs/heads/*:refs/remotes/origin/*
ERROR: Error cloning remote repo 'origin'
hudson.plugins.git.GitException: Command "git -c core.askpass=true fetch --tags --progress git://gerrit.ovirt.org/ovirt-system-tests.git +refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout:
stderr: fatal: Unable to look up gerrit.ovirt.org (port 9418) (No address associated with hostname)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1640)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:1388)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.access$300(CliGitAPIImpl.java:62)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:313)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$2.execute(CliGitAPIImpl.java:505)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:152)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:145)
at hudson.remoting.UserRequest.perform(UserRequest.java:152)
at hudson.remoting.UserRequest.perform(UserRequest.java:50)
at hudson.remoting.Request$2.run(Request.java:332)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at ......remote call to ovirt-srv08.phx.ovirt.org(Native Method)
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1416)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:252)
at hudson.remoting.Channel.call(Channel.java:781)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:145)
at sun.reflect.GeneratedMethodAccessor512.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:131)
at com.sun.proxy.$Proxy59.execute(Unknown Source)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1013)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1053)
at org.jenkinsci.plugins.multiplescms.MultiSCM.checkout(MultiSCM.java:129)
at hudson.scm.SCM.checkout(SCM.java:485)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1269)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:607)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
at hudson.model.Run.execute(Run.java:1738)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:410)
ERROR: null
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=4.0
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson2349874729798992178.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/358/artifact/expor...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/358/artifact/expor...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...> ]]
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
# Archive the logs, we want them anyway
logs=(
./*log
./*/logs
)
if [[ "$logs" ]]; then
tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
rm -rf "${logs[@]}"
fi
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
[[ "$mock_conf_file" ]] || continue
echo "Cleaning up mock $mock_conf"
mock_root="${mock_conf_file##*/}"
mock_root="${mock_root%.*}"
my_mock="/usr/bin/mock"
my_mock+=" --configdir=${mock_conf_file%/*}"
my_mock+=" --root=${mock_root}"
my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs sometimes,
#so we can use it instead of manually doing all this.
echo "Killing all mock orphan processes, if any."
$my_mock \
--orphanskill \
|| {
echo "ERROR: Failed to kill orphans on $chroot."
failed=true
}
mock_root="$(\
grep \
-Po "(?<=config_opts\['root'\] = ')[^']*" \
"$mock_conf_file" \
)" || :
[[ "$mock_root" ]] || continue
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
}
done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
this_chroot_failed=false
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $mock_root." \
"Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
this_chroot_failed=true
}
done
if ! $this_chroot_failed; then
sudo rm -rf "$mock_root"
fi
done
if $failed; then
echo "Aborting."
exit 1
fi
# remove mock system cache, we will setup proxies to do the caching and this
# takes lots of space between runs
shopt -u nullglob
sudo rm -Rf /var/cache/mock/*
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson1163283835629940871.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n '' ]]
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ false
+ shopt -u nullglob
+ sudo rm -Rf '/var/cache/mock/*'
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Archiving artifacts
[JIRA] (OVIRT-754) Guest name 'TestNode-node' is already in use.
by Fabian Deutsch (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-754?page=com.atlassian.jira... ]
Fabian Deutsch commented on OVIRT-754:
--------------------------------------
Feel free to review https://gerrit.ovirt.org/#/c/64511/
- fabian
On Fri, Oct 7, 2016 at 9:26 AM, Sandro Bonazzola <sbonazzo(a)redhat.com>
wrote:
>
> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86_64/138/
>
> ======================================================================
> ERROR: test suite for <class 'testSanity.TestNode'>
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
> self.setUp()
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in
> setUp
> self.setupContext(ancestor)
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in
> setupContext
> try_run(context, names)
> File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in
> try_run
> return func()
> File "/home/jenkins/workspace/ovirt-node-ng_master_build-
> artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 150, in
> setUpClass
> 77)
> File "/home/jenkins/workspace/ovirt-node-ng_master_build-
> artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py", line 88, in
> _start_vm
> dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
> File "/home/jenkins/workspace/ovirt-node-ng_master_build-
> artifacts-el7-x86_64/ovirt-node-ng/tests/virt.py", line 217, in create
> dom = sh.virt_install(*args, **kwargs)
> File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
> return RunningCommand(cmd, call_args, stdin, stdout, stderr)
> File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
> self.wait()
> File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
> self.handle_command_exit_code(exit_code)
> File "/usr/lib/python2.7/site-packages/sh.py", line 516, in
> handle_command_exit_code
> raise exc(self.ran, self.process.stdout, self.process.stderr)
> ErrorReturnCode_1:
>
> RAN: '/bin/virt-install --import --print-xml --network=user,model=virtio
> --noautoconsole --memory=2048 --rng=/dev/random --memballoon=virtio
> --cpu=host --vcpus=4 --graphics=vnc --watchdog=default,action=poweroff
> --serial=pty --disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe --check=all=off
> --channel=unix,target_type=virtio,name=local.test.0 --name=TestNode-node'
>
> STDOUT:
>
>
> STDERR:
> ERROR Guest name 'TestNode-node' is already in use.
>
> Seems to be a run in an unclean environment. Not sure what caused this.
>
>
> --
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com
> <https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
>
--
Fabian Deutsch <fdeutsch(a)redhat.com>
RHEV Hypervisor
Red Hat
> Guest name 'TestNode-node' is already in use.
> ---------------------------------------------
>
> Key: OVIRT-754
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-754
> Project: oVirt - virtualization made easy
> Issue Type: By-EMAIL
> Reporter: sbonazzo
> Assignee: infra
>
> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86...
> ======================================================================
> ERROR: test suite for <class 'testSanity.TestNode'>
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
> self.setUp()
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in setUp
> self.setupContext(ancestor)
> File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in
> setupContext
> try_run(context, names)
> File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in try_run
> return func()
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
> line 150, in setUpClass
> 77)
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
> line 88, in _start_vm
> dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
> File
> "/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/virt.py",
> line 217, in create
> dom = sh.virt_install(*args, **kwargs)
> File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
> return RunningCommand(cmd, call_args, stdin, stdout, stderr)
> File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
> self.wait()
> File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
> self.handle_command_exit_code(exit_code)
> File "/usr/lib/python2.7/site-packages/sh.py", line 516, in
> handle_command_exit_code
> raise exc(self.ran, self.process.stdout, self.process.stderr)
> ErrorReturnCode_1:
> RAN: '/bin/virt-install --import --print-xml --network=user,model=virtio
> --noautoconsole --memory=2048 --rng=/dev/random --memballoon=virtio
> --cpu=host --vcpus=4 --graphics=vnc --watchdog=default,action=poweroff
> --serial=pty
> --disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe
> --check=all=off --channel=unix,target_type=virtio,name=local.test.0
> --name=TestNode-node'
> STDOUT:
> STDERR:
> ERROR Guest name 'TestNode-node' is already in use.
> Seems to be a run in an unclean environment. Not sure what caused this.
> --
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com
> <https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
--
This message was sent by Atlassian JIRA
(v1000.383.2#100014)
[JIRA] (OVIRT-754) Guest name 'TestNode-node' is already in use.
by sbonazzo (oVirt JIRA)
sbonazzo created OVIRT-754:
------------------------------
Summary: Guest name 'TestNode-node' is already in use.
Key: OVIRT-754
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-754
Project: oVirt - virtualization made easy
Issue Type: By-EMAIL
Reporter: sbonazzo
Assignee: infra
http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-el7-x86...
======================================================================
ERROR: test suite for <class 'testSanity.TestNode'>
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/nose/suite.py", line 208, in run
self.setUp()
File "/usr/lib/python2.7/site-packages/nose/suite.py", line 291, in setUp
self.setupContext(ancestor)
File "/usr/lib/python2.7/site-packages/nose/suite.py", line 314, in
setupContext
try_run(context, names)
File "/usr/lib/python2.7/site-packages/nose/util.py", line 469, in try_run
return func()
File
"/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
line 150, in setUpClass
77)
File
"/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/testVirt.py",
line 88, in _start_vm
dom = VM.create(name, img, ssh_port=ssh_port, memory_gb=memory_gb)
File
"/home/jenkins/workspace/ovirt-node-ng_master_build-artifacts-el7-x86_64/ovirt-node-ng/tests/virt.py",
line 217, in create
dom = sh.virt_install(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/sh.py", line 1021, in __call__
return RunningCommand(cmd, call_args, stdin, stdout, stderr)
File "/usr/lib/python2.7/site-packages/sh.py", line 486, in __init__
self.wait()
File "/usr/lib/python2.7/site-packages/sh.py", line 500, in wait
self.handle_command_exit_code(exit_code)
File "/usr/lib/python2.7/site-packages/sh.py", line 516, in
handle_command_exit_code
raise exc(self.ran, self.process.stdout, self.process.stderr)
ErrorReturnCode_1:
RAN: '/bin/virt-install --import --print-xml --network=user,model=virtio
--noautoconsole --memory=2048 --rng=/dev/random --memballoon=virtio
--cpu=host --vcpus=4 --graphics=vnc --watchdog=default,action=poweroff
--serial=pty
--disk=path=/var/tmp/TestNode-node.qcow2,bus=virtio,format=qcow2,driver_type=qcow2,discard=unmap,cache=unsafe
--check=all=off --channel=unix,target_type=virtio,name=local.test.0
--name=TestNode-node'
STDOUT:
STDERR:
ERROR Guest name 'TestNode-node' is already in use.
Seems to be a run in an unclean environment. Not sure what caused this.
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
<https://www.redhat.com/it/about/events/red-hat-open-source-day-2016>
--
This message was sent by Atlassian JIRA
(v1000.383.2#100014)