[JIRA] (OVIRT-938) Fix Jenkins slave connection dying on vdsm check_merged jobs
by Barak Korren (oVirt JIRA)
Barak Korren created OVIRT-938:
----------------------------------
Summary: Fix Jenkins slave connection dying on vdsm check_merged jobs
Key: OVIRT-938
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-938
Project: oVirt - virtualization made easy
Issue Type: Bug
Components: Jenkins
Reporter: Barak Korren
Assignee: infra
Something in the vdsm build_artifacts job makes the Jenkins slave disconnect while it is running. This, in turn, prevents the cleanup scripts from running on the slave, leaving it dirty enough to make the next job on that slave fail.
An example can be seen here:
http://jenkins.ovirt.org/job/vdsm_master_check-merged-el7-x86_64/692/console
Relevant log lines:
{code}
21:49:00 Ran 44 tests in 1231.988s
21:49:00
21:49:00 OK
21:49:00 + return 0
21:49:00 sh: [13086: 1 (255)] tcsetattr: Inappropriate ioctl for device
21:49:00 Took 2464 seconds
21:49:00 ===================================
21:49:00 logout
21:49:01 Slave went offline during the build
21:49:01 ERROR: Connection was broken: java.io.IOException: Unexpected termination of the channel
21:49:01 at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:50)
21:49:01 Caused by: java.io.EOFException
21:49:01 at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2353)
21:49:01 at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2822)
21:49:01 at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:804)
21:49:01 at java.io.ObjectInputStream.<init>(ObjectInputStream.java:301)
21:49:01 at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:48)
21:49:01 at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
21:49:01 at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:48)
21:49:01
21:49:01 Build step 'Execute shell' marked build as failure
21:49:01 Performing Post build task...
21:49:01 Match found for :.* : True
21:49:01 Logical operation result is TRUE
21:49:01 Running script : #!/bin/bash -x
21:49:01 echo "shell-scripts/mock_cleanup.sh"
... SNIP ...
21:49:01 Exception when executing the batch command : no workspace from node hudson.slaves.DumbSlave[fc24-vm06.phx.ovirt.org] which is computer hudson.slaves.SlaveComputer@30863c81 and has channel null
21:49:01 Build step 'Post build task' marked build as failure
21:49:02 ERROR: Step 'Archive the artifacts' failed: no workspace for vdsm_master_check-merged-el7-x86_64 #692
21:49:02 ERROR: Failed to evaluate groovy script.
21:49:02 java.lang.NullPointerException: Cannot invoke method child() on null object
21:49:02 at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:77)
21:49:02 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:45)
21:49:02 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
21:49:02 at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:32)
21:49:02 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
21:49:02 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
21:49:02 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
21:49:02 at Script1.run(Script1.groovy:2)
21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:580)
21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:618)
21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:589)
21:49:02 at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SecureGroovyScript.evaluate(SecureGroovyScript.java:166)
21:49:02 at org.jvnet.hudson.plugins.groovypostbuild.GroovyPostbuildRecorder.perform(GroovyPostbuildRecorder.java:361)
21:49:02 at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:782)
21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
21:49:02 at hudson.model.Build$BuildExecution.post2(Build.java:185)
21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:668)
21:49:02 at hudson.model.Run.execute(Run.java:1763)
21:49:02 at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
21:49:02 at hudson.model.ResourceController.execute(ResourceController.java:98)
21:49:02 at hudson.model.Executor.run(Executor.java:410)
21:49:02 Build step 'Groovy Postbuild' marked build as failure
21:49:02 Started calculate disk usage of build
21:49:02 Finished Calculation of disk usage of build in 0 seconds
21:49:02 Finished: FAILURE
{code}
--
This message was sent by Atlassian JIRA
(v1000.621.2#100023)
7 years, 11 months
[JIRA] (OVIRT-938) Fix Jenkins slave connection dying on vdsm check_merged jobs
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-938?page=com.atlassian.jira... ]
Barak Korren updated OVIRT-938:
-------------------------------
Epic Link: OVIRT-400
> Fix Jenkins slave connection dying on vdsm check_merged jobs
> ------------------------------------------------------------
>
> Key: OVIRT-938
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-938
> Project: oVirt - virtualization made easy
> Issue Type: Bug
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
>
> Something in the vdsm build_artifacts job makes the Jenkins slave disconnect while it is running. This, in turn, prevents the cleanup scripts from running on the slave, leaving it dirty enough to make the next job on that slave fail.
> An example can be seen here:
> http://jenkins.ovirt.org/job/vdsm_master_check-merged-el7-x86_64/692/console
> Relevant log lines:
> {code}
> 21:49:00 Ran 44 tests in 1231.988s
> 21:49:00
> 21:49:00 OK
> 21:49:00 + return 0
> 21:49:00 sh: [13086: 1 (255)] tcsetattr: Inappropriate ioctl for device
> 21:49:00 Took 2464 seconds
> 21:49:00 ===================================
> 21:49:00 logout
> 21:49:01 Slave went offline during the build
> 21:49:01 ERROR: Connection was broken: java.io.IOException: Unexpected termination of the channel
> 21:49:01 at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:50)
> 21:49:01 Caused by: java.io.EOFException
> 21:49:01 at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2353)
> 21:49:01 at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2822)
> 21:49:01 at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:804)
> 21:49:01 at java.io.ObjectInputStream.<init>(ObjectInputStream.java:301)
> 21:49:01 at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:48)
> 21:49:01 at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
> 21:49:01 at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:48)
> 21:49:01
> 21:49:01 Build step 'Execute shell' marked build as failure
> 21:49:01 Performing Post build task...
> 21:49:01 Match found for :.* : True
> 21:49:01 Logical operation result is TRUE
> 21:49:01 Running script : #!/bin/bash -x
> 21:49:01 echo "shell-scripts/mock_cleanup.sh"
> ... SNIP ...
> 21:49:01 Exception when executing the batch command : no workspace from node hudson.slaves.DumbSlave[fc24-vm06.phx.ovirt.org] which is computer hudson.slaves.SlaveComputer@30863c81 and has channel null
> 21:49:01 Build step 'Post build task' marked build as failure
> 21:49:02 ERROR: Step 'Archive the artifacts' failed: no workspace for vdsm_master_check-merged-el7-x86_64 #692
> 21:49:02 ERROR: Failed to evaluate groovy script.
> 21:49:02 java.lang.NullPointerException: Cannot invoke method child() on null object
> 21:49:02 at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:77)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:45)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:32)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
> 21:49:02 at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
> 21:49:02 at Script1.run(Script1.groovy:2)
> 21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:580)
> 21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:618)
> 21:49:02 at groovy.lang.GroovyShell.evaluate(GroovyShell.java:589)
> 21:49:02 at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SecureGroovyScript.evaluate(SecureGroovyScript.java:166)
> 21:49:02 at org.jvnet.hudson.plugins.groovypostbuild.GroovyPostbuildRecorder.perform(GroovyPostbuildRecorder.java:361)
> 21:49:02 at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
> 21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:782)
> 21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
> 21:49:02 at hudson.model.Build$BuildExecution.post2(Build.java:185)
> 21:49:02 at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:668)
> 21:49:02 at hudson.model.Run.execute(Run.java:1763)
> 21:49:02 at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
> 21:49:02 at hudson.model.ResourceController.execute(ResourceController.java:98)
> 21:49:02 at hudson.model.Executor.run(Executor.java:410)
> 21:49:02 Build step 'Groovy Postbuild' marked build as failure
> 21:49:02 Started calculate disk usage of build
> 21:49:02 Finished Calculation of disk usage of build in 0 seconds
> 21:49:02 Finished: FAILURE
> {code}
--
This message was sent by Atlassian JIRA
(v1000.621.2#100023)
7 years, 11 months
[JIRA] (OVIRT-937) Fix mock_cleanup.sh umount failures
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-937?page=com.atlassian.jira... ]
Barak Korren reassigned OVIRT-937:
----------------------------------
Assignee: Barak Korren (was: infra)
> Fix mock_cleanup.sh umount failures
> -----------------------------------
>
> Key: OVIRT-937
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-937
> Project: oVirt - virtualization made easy
> Issue Type: Bug
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: Barak Korren
>
> The cleanup script can log spurious failures when trying to umount leftover mock file systems.
> This happened here:
> http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc24-x86_64/6291/con...
> Relevant log lines:
> {code}
> 18:36:55 + echo 'Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661.' 'Trying to umount.'
> 18:36:55 Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661. Trying to umount.
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/sys
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/shm
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/pts
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/cache/yum
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/home/jenkins/workspace/vdsm_master_check-merged-el7-x86_64/vdsm
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/run/libvirt
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/lib/lago
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems
> 18:36:55 umount: /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems: mountpoint not found
> 18:36:55 + echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.'
> 18:36:55 ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.
> 18:36:55 + failed=true
> 18:36:55 + this_chroot_failed=true
> {code}
> We should make the umount loop not fail if the filesystem it's trying to unmount is not mounted anymore.
--
This message was sent by Atlassian JIRA
(v1000.621.2#100023)
7 years, 11 months
[JIRA] (OVIRT-937) Fix mock_cleanup.sh umount failures
by Barak Korren (oVirt JIRA)
Barak Korren created OVIRT-937:
----------------------------------
Summary: Fix mock_cleanup.sh umount failures
Key: OVIRT-937
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-937
Project: oVirt - virtualization made easy
Issue Type: Bug
Components: Jenkins
Reporter: Barak Korren
Assignee: infra
The cleanup script can log spurious failures when trying to umount leftover mock file systems.
This happened here:
http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc24-x86_64/6291/con...
Relevant log lines:
{code}
18:36:55 + echo 'Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661.' 'Trying to umount.'
18:36:55 Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661. Trying to umount.
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/sys
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/shm
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/pts
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/cache/yum
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/home/jenkins/workspace/vdsm_master_check-merged-el7-x86_64/vdsm
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/run/libvirt
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/lib/lago
18:36:55 + for mount in '"${mounts[@]}"'
18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems
18:36:55 umount: /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems: mountpoint not found
18:36:55 + echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.'
18:36:55 ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.
18:36:55 + failed=true
18:36:55 + this_chroot_failed=true
{code}
We should make the umount loop not fail if the filesystem it's trying to unmount is not mounted anymore.
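A rough sketch of such a guard (illustrative only, not a tested patch; variable names follow the log excerpt above) would tolerate paths that are no longer mounted and only flag real umount failures:
{code}
for mount in "${mounts[@]}"; do
    sudo umount --lazy "$mount" \
    || {
        # Only treat this as a failure if the path is still listed as
        # mounted; a mountpoint that vanished in the meantime (e.g. after
        # an earlier lazy umount) is not an error.
        if cut -d' ' -f2 /proc/mounts | grep -qxF "$mount"; then
            echo "ERROR: Failed to umount $mount."
            failed=true
            this_chroot_failed=true
        else
            echo "Ignoring $mount: not mounted anymore."
        fi
    }
done
{code}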
--
This message was sent by Atlassian JIRA
(v1000.621.2#100023)
7 years, 11 months
[JIRA] (OVIRT-937) Fix mock_cleanup.sh umount failures
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-937?page=com.atlassian.jira... ]
Barak Korren updated OVIRT-937:
-------------------------------
Epic Link: OVIRT-400
> Fix mock_cleanup.sh umount failures
> -----------------------------------
>
> Key: OVIRT-937
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-937
> Project: oVirt - virtualization made easy
> Issue Type: Bug
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
>
> The cleanup script can log spurious failures when trying to umount leftover mock file systems.
> This happened here:
> http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc24-x86_64/6291/con...
> Relevant log lines:
> {code}
> 18:36:55 + echo 'Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661.' 'Trying to umount.'
> 18:36:55 Found mounted dirs inside the chroot /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661. Trying to umount.
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/sys
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/shm
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/dev/pts
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/cache/yum
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/home/jenkins/workspace/vdsm_master_check-merged-el7-x86_64/vdsm
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/run/libvirt
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/var/lib/lago
> 18:36:55 + for mount in '"${mounts[@]}"'
> 18:36:55 + sudo umount --lazy /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems
> 18:36:55 umount: /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems: mountpoint not found
> 18:36:55 + echo 'ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.'
> 18:36:55 ERROR: Failed to umount /var/lib/mock/epel-7-x86_64-95d9ead9d725499a15a9021ba2fe9831-54661/root/proc/filesystems.
> 18:36:55 + failed=true
> 18:36:55 + this_chroot_failed=true
> {code}
> We should make the umount loop not fail if the filesystem it's trying to unmount is not mounted anymore.
--
This message was sent by Atlassian JIRA
(v1000.621.2#100023)
7 years, 11 months
oVirt infra daily report - unstable production jobs - 167
by jenkins@jenkins.phx.ovirt.org
Good morning!
Attached is the HTML page with the jenkins status report. You can see it also here:
- http://jenkins.ovirt.org/job/system_jenkins-report/167//artifact/exported...
Cheers,
Jenkins
[Attachment: upstream_report.html]
RHEVM CI Jenkins Daily Report - 13/12/2016
00 Unstable Jobs (Production) - http://jenkins.ovirt.org/
Unless noted otherwise, each job listed below is automatically updated by
Jenkins Job Builder; any manual change will be lost in the next update. To
make permanent changes, check out the jenkins repo:
http://gerrit.ovirt.org/gitweb?p=jenkins.git;a=tree;h=refs/heads/master;h...
- cockpit-ovirt_ovirt-4.1_build-artifacts-fc23-x86_64
- fabric-ovirt_master_check-merged-el7-x86_64
- ovirt-appliance_master_build-artifacts-el7-x86_64
- ovirt-appliance_ovirt-4.0-pre_build-artifacts-el7-x86_64
- ovirt-appliance_ovirt-4.0-snapshot_build-artifacts-el7-x86_64
- ovirt-engine-api-explorer_master_build-artifacts-fc23-x86_64
- ovirt-engine-extension-aaa-ldap_4.0_create-rpms-el7-x86_64_merged
- ovirt-engine-extension-aaa-ldap_master_create-rpms-el7-x86_64_merged
- ovirt-engine-extension-aaa-ldap_master_create-rpms-fc24-x86_64_merged
- ovirt-engine_master_find-bugs_merged
- ovirt-guest-agent_master_build-artifacts-el6-x86_64
- ovirt-hosted-engine-ha_3.6_create-rpms-el7-x86_64_merged
- ovirt-hosted-engine-ha_master_build-artifacts-fc24-x86_64
- ovirt-live_4.1-create-iso (generates a nightly iso of ovirt-live 4.1; needs to move to std-ci or be yamlized)
- ovirt-live_master_create-iso-el7-x86_64 (generates a nightly iso of ovirt-live)
- ovirt-live_master_experimental_create-iso-el7-x86_64 (generates a nightly iso of ovirt-live using experimental repos)
- ovirt-node-ng_master_build-artifacts-el7-x86_64
- ovirt-node-ng_ovirt-3.6_build-artifacts-el7-x86_64
- ovirt-node-ng_ovirt-4.0-snapshot_build-artifacts-el7-x86_64
- ovirt-node-ng_ovirt-4.0_build-artifacts-el7-x86_64
- ovirt-node-ng_ovirt-4.1-pre_build-artifacts-el7-x86_64
- ovirt-node-ng_ovirt-master-experimental_build-artifacts-el7-x86_64
- ovirt-optimizer_4.0_build-artifacts-el6-x86_64
- ovirt_3.6_he-system-tests
- qemu_3.6_create-rpms-el7-ppc64le_merged
- qemu_3.6_create-rpms-el7-x86_64_merged
- qemu_4.0_create-rpms-el7-ppc64le_merged
- qemu_4.0_create-rpms-el7-x86_64_merged
- qemu_master_create-rpms-el7-ppc64le_merged
- qemu_master_create-rpms-el7-x86_64_merged
- repos_master_check-closure_fc24_merged
- test-repo_ovirt_experimental_3.6
- test-repo_ovirt_experimental_master
- vdsm_master_build-artifacts-fc24-x86_64
- vdsm_master_check-merged-el7-x86_64
7 years, 11 months
Build failed in Jenkins: ovirt_3.6_system-tests #820
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/>
------------------------------------------
[...truncated 8 lines...]
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from git://gerrit.ovirt.org/ovirt-system-tests.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:766)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1022)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1053)
at org.jenkinsci.plugins.multiplescms.MultiSCM.checkout(MultiSCM.java:129)
at hudson.scm.SCM.checkout(SCM.java:485)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1269)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:607)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
at hudson.model.Run.execute(Run.java:1738)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:410)
Caused by: hudson.plugins.git.GitException: Command "git clean -fdx" returned status code 1:
stdout: Removing .bash_history
Removing .pki/
Removing basic-suite-3.6/LagoInitFile
Removing deployment-basic-suite-3.6/
Removing exported-artifacts/lago_logs/lago.log
Removing mocker-epel-7-x86_64.el7.cfg
stderr: warning: failed to remove exported-artifacts/lago_logs
warning: failed to remove exported-artifacts/failure_msg.txt
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1640)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1616)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1612)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1254)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1266)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.clean(CliGitAPIImpl.java:621)
at hudson.plugins.git.GitAPI.clean(GitAPI.java:311)
at sun.reflect.GeneratedMethodAccessor51.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:884)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:859)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:818)
at hudson.remoting.UserRequest.perform(UserRequest.java:152)
at hudson.remoting.UserRequest.perform(UserRequest.java:50)
at hudson.remoting.Request$2.run(Request.java:332)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at ......remote call to ovirt-srv18.phx.ovirt.org(Native Method)
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1416)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:252)
at hudson.remoting.Channel.call(Channel.java:781)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:249)
at com.sun.proxy.$Proxy55.clean(Unknown Source)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl.clean(RemoteGitImpl.java:444)
at hudson.plugins.git.extensions.impl.CleanBeforeCheckout.decorateFetchCommand(CleanBeforeCheckout.java:32)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:762)
... 12 more
ERROR: null
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_system-tests] $ /bin/bash -xe /tmp/hudson2601473392204303823.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/artifact/exported...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/artifact/exported...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/artifact/exported...>
mv: cannot move '<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...'> to '<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/artifact/exported...'>: Permission denied
mv: cannot move '<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...'> to '<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/820/artifact/exported...'>: Permission denied
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
ESCALATE FAILED POST BUILD TASK TO JOB STATUS
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
# Archive the logs, we want them anyway
logs=(
./*log
./*/logs
)
if [[ "$logs" ]]; then
for log in "${logs[@]}"; do
[[ "$log" = ./exported-artifacts/* ]] && continue
echo "Copying ${log} to exported-artifacts"
mv $log exported-artifacts/
done
fi
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
[[ "$mock_conf_file" ]] || continue
echo "Cleaning up mock $mock_conf"
mock_root="${mock_conf_file##*/}"
mock_root="${mock_root%.*}"
my_mock="/usr/bin/mock"
my_mock+=" --configdir=${mock_conf_file%/*}"
my_mock+=" --root=${mock_root}"
my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs sometimes,
#so we can use it instead of manually doing all this.
echo "Killing all mock orphan processes, if any."
$my_mock \
--orphanskill \
|| {
echo "ERROR: Failed to kill orphans on $chroot."
failed=true
}
mock_root="$(\
grep \
-Po "(?<=config_opts\['root'\] = ')[^']*" \
"$mock_conf_file" \
)" || :
[[ "$mock_root" ]] || continue
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
}
done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
this_chroot_failed=false
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $mock_root." \
"Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
this_chroot_failed=true
}
done
if ! $this_chroot_failed; then
sudo rm -rf "$mock_root"
fi
done
# remove mock caches that are older then 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them
# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
virsh destroy $UUID || :
sleep 2
virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done
if $failed; then
echo "Cleanup script failed, propegating failure to job"
exit 1
fi
[ovirt_3.6_system-tests] $ /bin/bash -x /tmp/hudson7317803229108863008.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
+ logs=(./*log ./*/logs)
+ [[ -n '' ]]
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts
7 years, 11 months
lago failure
by Martin Polednik
Hello,
Playing with install_lago.sh, I've found a few issues that are blocking
my further progress. The first problem is that install_lago.sh on el7
doesn't install EPEL.
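A minimal sketch of the missing step, assuming a yum-based el7 host (the
actual install_lago.sh may need to handle this differently):
    rpm -q epel-release || sudo yum install -y epel-release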
The second issue is actually solved by [1].
The third issue occurs when I apply the patch [1]:
-- output --
# sh install_lago.sh root basic-suite-master
Virtualization extension is enabled
Nested virtualization is enabled
Configuring repos
Loaded plugins: langpacks, product-id, search-disabled-repos, subscription-manager
ci-tools | 2.9 kB 00:00:00
lago | 2.9 kB 00:00:00
Package epel-release-7-8.noarch already installed and latest version
Nothing to do
Installing lago
Loaded plugins: langpacks, product-id, search-disabled-repos, subscription-manager
Package python-lago-0.28.0-1.el7.centos.noarch already installed and latest version
Package python-lago-ovirt-0.28.0-1.el7.centos.noarch already installed and latest version
Nothing to do
Configuring permissions
Starting libvirt
Running ovirt-system-tests
From https://gerrit.ovirt.org/ovirt-system-tests
* branch HEAD -> FETCH_HEAD
Already up-to-date.
+ CLI=lago
+ DO_CLEANUP=false
+ RECOMMENDED_RAM_IN_MB=8196
+ EXTRA_SOURCES=()
++ getopt -o ho:e:n:b:cs:r: --long help,output:,engine:,node:,boot-iso:,cleanup --long extra-rpm-source,reposync-config: -n run_suite.sh -- basic-suite-master
+ options=' -- '\''basic-suite-master'\'''
+ [[ 0 != \0 ]]
+ eval set -- ' -- '\''basic-suite-master'\'''
++ set -- -- basic-suite-master
+ true
+ case $1 in
+ shift
+ break
+ [[ -z basic-suite-master ]]
++ realpath basic-suite-master
+ export SUITE=/root/ovirt-system-tests/basic-suite-master
+ SUITE=/root/ovirt-system-tests/basic-suite-master
+ '[' -z '' ']'
+ export PREFIX=/root/ovirt-system-tests/deployment-basic-suite-master
+ PREFIX=/root/ovirt-system-tests/deployment-basic-suite-master
+ false
+ [[ -d /root/ovirt-system-tests/basic-suite-master ]]
+ echo '################# lago version'
################# lago version
+ lago --version
lago 0.28.0
+ echo '#################'
#################
+ check_ram 8196
+ local recommended=8196
++ free -m
++ grep Mem
++ awk '{print $2}'
+ local cur_ram=64154
+ [[ 64154 -lt 8196 ]]
+ echo 'Running suite found in /root/ovirt-system-tests/basic-suite-master'
Running suite found in /root/ovirt-system-tests/basic-suite-master
+ echo 'Environment will be deployed at /root/ovirt-system-tests/deployment-basic-suite-master'
Environment will be deployed at /root/ovirt-system-tests/deployment-basic-suite-master
+ rm -rf /root/ovirt-system-tests/deployment-basic-suite-master
+ source /root/ovirt-system-tests/basic-suite-master/control.sh
+ prep_suite '' '' ''
+ local suite_name=basic-suite-master
+ suite_name=basic-suite-master
+ sed -r -e s,__ENGINE__,lago-basic-suite-master-engine,g -e 's,__HOST([0-9]+)__,lago-basic-suite-master-host\1,g' -e s,__LAGO_NET__,lago-basic-suite-master-lago,g -e s,__STORAGE__,lago-basic-suite-master-storage,g
+ run_suite
+ env_init '' /root/ovirt-system-tests/basic-suite-master/LagoInitFile
+ echo '#########################'
#########################
+ local template_repo=/root/ovirt-system-tests/basic-suite-master/template-repo.json
+ local initfile=/root/ovirt-system-tests/basic-suite-master/LagoInitFile
+ lago init /root/ovirt-system-tests/deployment-basic-suite-master /root/ovirt-system-tests/basic-suite-master/LagoInitFile --template-repo-path /root/ovirt-system-tests/basic-suite-master/template-repo.json
current session does not belong to lago group.
@ Initialize and populate prefix:
# Initialize prefix:
* Create prefix dirs:
* Create prefix dirs: Success (in 0:00:00)
* Generate prefix uuid:
* Generate prefix uuid: Success (in 0:00:00)
* Create ssh keys:
* Create ssh keys: Success (in 0:00:00)
* Tag prefix as initialized:
* Tag prefix as initialized: Success (in 0:00:00)
# Initialize prefix: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-host0:
* Create disk root:
* Create disk root: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-host0: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-engine:
* Create disk root:
* Create disk root: Success (in 0:00:00)
* Create disk nfs:
* Create disk nfs: Success (in 0:00:00)
* Create disk export:
* Create disk export: Success (in 0:00:00)
* Create disk iscsi:
* Create disk iscsi: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-engine: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-host1:
* Create disk root:
* Create disk root: Success (in 0:00:00)
# Create disks for VM lago-basic-suite-master-host1: Success (in 0:00:00)
# Copying any deploy scripts:
# Copying any deploy scripts: Success (in 0:00:00)
# [Thread-1] Bootstrapping lago-basic-suite-master-host0:
# [Thread-2] Bootstrapping lago-basic-suite-master-engine:
# [Thread-3] Bootstrapping lago-basic-suite-master-host1:
# [Thread-1] Bootstrapping lago-basic-suite-master-host0: Success (in 0:00:12)
# [Thread-3] Bootstrapping lago-basic-suite-master-host1: Success (in 0:00:12)
# [Thread-2] Bootstrapping lago-basic-suite-master-engine: Success (in 0:00:12)
# Save prefix:
* Save nets:
* Save nets: Success (in 0:00:00)
* Save VMs:
* Save VMs: Success (in 0:00:00)
* Save env:
* Save env: Success (in 0:00:00)
# Save prefix: Success (in 0:00:00)
@ Initialize and populate prefix: Success (in 0:00:13)
+ env_repo_setup
+ echo '#########################'
#########################
+ local extrasrc
+ declare -a extrasrcs
+ cd /root/ovirt-system-tests/deployment-basic-suite-master
+ local reposync_conf=/root/ovirt-system-tests/basic-suite-master/reposync-config.repo
+ [[ -e '' ]]
+ echo 'using reposync config file: /root/ovirt-system-tests/basic-suite-master/reposync-config.repo'
using reposync config file: /root/ovirt-system-tests/basic-suite-master/reposync-config.repo
+ lago ovirt reposetup --reposync-yum-config /root/ovirt-system-tests/basic-suite-master/reposync-config.repo
current session does not belong to lago group.
@ Create prefix internal repo:
# Syncing remote repos locally (this might take some time):
* Acquiring lock for /var/lib/lago/reposync/repolock:
* Acquiring lock for /var/lib/lago/reposync/repolock: Success (in 0:00:00)
* Running reposync:
* Running reposync: Success (in 0:00:10)
# Syncing remote repos locally (this might take some time): Success (in 0:00:10)
# Running repoman:
# Running repoman: Success (in 0:00:07)
# Save prefix:
* Save nets:
* Save nets: Success (in 0:00:00)
* Save VMs:
* Save VMs: Success (in 0:00:00)
* Save env:
* Save env: Success (in 0:00:00)
# Save prefix: Success (in 0:00:00)
@ Create prefix internal repo: Success (in 0:00:17)
+ cd -
/root/ovirt-system-tests
+ env_start
+ echo '#########################'
#########################
+ cd /root/ovirt-system-tests/deployment-basic-suite-master
+ lago start
current session does not belong to lago group.
@ Start Prefix:
# Start nets:
* Create network lago-basic-suite-master-lago:
* Create network lago-basic-suite-master-lago: Success (in 0:00:05)
# Start nets: Success (in 0:00:05)
# Start vms:
* Starting VM lago-basic-suite-master-host0:
* Starting VM lago-basic-suite-master-host0: Success (in 0:00:00)
* Starting VM lago-basic-suite-master-engine:
* Starting VM lago-basic-suite-master-engine: Success (in 0:00:00)
* Starting VM lago-basic-suite-master-host1:
* Starting VM lago-basic-suite-master-host1: Success (in 0:00:00)
# Start vms: Success (in 0:00:00)
@ Start Prefix: Success (in 0:00:06)
+ cd -
/root/ovirt-system-tests
+ env_deploy
+ echo '#########################'
#########################
+ cd /root/ovirt-system-tests/deployment-basic-suite-master
+ lago ovirt deploy
current session does not belong to lago group.
@ Deploy oVirt environment:
# Deploy environment:
* [Thread-2] Deploy VM lago-basic-suite-master-host0:
* [Thread-3] Deploy VM lago-basic-suite-master-engine:
* [Thread-4] Deploy VM lago-basic-suite-master-host1:
* [Thread-2] Deploy VM lago-basic-suite-master-host0: Success (in 0:00:38)
- STDERR
+ MAIN_NFS_DEV=disk/by-id/scsi-0QEMU_QEMU_HARDDISK_2
+ EXPORTED_DEV=disk/by-id/scsi-0QEMU_QEMU_HARDDISK_3
+ ISCSI_DEV=disk/by-id/scsi-0QEMU_QEMU_HARDDISK_4
+ NUM_LUNS=5
+ main
+ install_deps
+ systemctl stop kdump.service
+ systemctl disable kdump.service
Removed symlink /etc/systemd/system/multi-user.target.wants/kdump.service.
+ yum install -y --downloaddir=/dev/shm nfs-utils lvm2 targetcli sg3_utils iscsi-initiator-utils
+ setup_services
+ systemctl stop postfix
+ systemctl disable postfix
Removed symlink /etc/systemd/system/multi-user.target.wants/postfix.service.
+ systemctl stop wpa_supplicant
+ systemctl disable wpa_supplicant
+ disable_firewalld
+ rpm -q firewalld
+ systemctl disable firewalld
Removed symlink /etc/systemd/system/dbus-org.fedoraproject.FirewallD1.service.
Removed symlink /etc/systemd/system/basic.target.wants/firewalld.service.
+ systemctl stop firewalld
+ systemctl start rpcbind.service
Failed to start rpcbind.service: Unit rpcbind.service failed to load: No such file or directory.
* [Thread-3] Deploy VM lago-basic-suite-master-engine: ERROR (in 0:00:41)
Error while running thread
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/lago/utils.py", line 55, in _ret_via_queue
queue.put({'return': func()})
File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 1242, in _deploy_host
host.name(),
RuntimeError: /root/ovirt-system-tests/deployment-basic-suite-master/default/scripts/_root_ovirt-system-tests_basic-suite-master_.._common_deploy-scripts_setup_storage_unified_el7.sh failed with status 6 on lago-basic-suite-master-engine
* [Thread-4] Deploy VM lago-basic-suite-master-host1: ERROR (in 0:02:09)
# Deploy environment: ERROR (in 0:02:09)
@ Deploy oVirt environment: ERROR (in 0:02:09)
Error occured, aborting
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 264, in do_run
self.cli_plugins[args.ovirtverb].do_run(args)
File "/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 184, in do_run
self._do_run(**vars(args))
File "/usr/lib/python2.7/site-packages/lago/utils.py", line 489, in wrapper
return func(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/lago/utils.py", line 500, in wrapper
return func(*args, prefix=prefix, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 187, in do_deploy
prefix.deploy()
File "/usr/lib/python2.7/site-packages/lago/log_utils.py", line 621, in wrapper
return func(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/reposetup.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/ovirtlago/__init__.py", line 198, in deploy
return super(OvirtPrefix, self).deploy()
File "/usr/lib/python2.7/site-packages/lago/log_utils.py", line 621, in wrapper
return func(*args, **kwargs)
File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 1249, in deploy
self._deploy_host, self.virt_env.get_vms().values()
File "/usr/lib/python2.7/site-packages/lago/utils.py", line 97, in invoke_in_parallel
vt.join_all()
File "/usr/lib/python2.7/site-packages/lago/utils.py", line 55, in _ret_via_queue
queue.put({'return': func()})
File "/usr/lib/python2.7/site-packages/lago/prefix.py", line 1242, in _deploy_host
host.name(),
RuntimeError: /root/ovirt-system-tests/deployment-basic-suite-master/default/scripts/_root_ovirt-system-tests_basic-suite-master_.._common_deploy-scripts_setup_storage_unifie
-- output --
Could you do something about the problems?
Thanks,
mpolednik
[1] https://gerrit.ovirt.org/#/c/68166/2/common/deploy-scripts/add_local_repo.sh
7 years, 11 months