[ OST Failure Report ] [ oVirt master ] [ 01 Nov 2017 ] [ 098_ovirt_provider_ovn.test_ovn_provider_rest ]
by Dafna Ron
Hi,
098_ovirt_provider_ovn.test_ovn_provider_rest failed on removing the
interface from a running VM.
I have seen this before; do we perhaps have a race in OST where the VM
is still running at times?
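If this is indeed a race, the usual fix on the OST side would be to wait for the VM to actually stop before removing the interface, rather than assuming it has. A minimal sketch of such a wait helper (the `get_status` callable is a hypothetical stand-in for the engine REST API query, not actual OST code):

```python
import time

def wait_for_vm_down(get_status, timeout=120, interval=2):
    """Poll until the VM reports 'down', or raise TimeoutError.

    get_status is any callable returning the VM's current status string;
    in OST it would wrap an engine REST API call (hypothetical here).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_status() == "down":
            return
        time.sleep(interval)
    raise TimeoutError("VM not down after %d seconds" % timeout)
```

Calling this right before the interface removal would make the test robust against slow shutdowns instead of failing intermittently.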
Link to suspected patches: the patch reported is below, but I suspect
it's a race and not related:
https://gerrit.ovirt.org/#/c/83414/

Link to Job:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3558/

Link to all logs:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3558/artifact/
(Relevant) error snippet from the log:
<error>
2017-10-31 10:58:43,516-04 ERROR
[org.ovirt.engine.api.restapi.resource.AbstractBackendResource] (default
task-32) [] Operation Failed: [Cannot remove Interface. The VM Network
Interface is plugged to a running VM.]
</error>
[FOSDEM 2018] CfP Announcement: Virtualization and IaaS Devroom
by Doron Fediuck
On behalf of oVirt and the Xen Project, we are excited to announce that the
call for proposals is now open for the Virtualization & IaaS devroom at the
upcoming FOSDEM 2018, to be hosted on February 3 and 4, 2018.
This year will mark FOSDEM’s 18th anniversary as one of the longest-running
free and open source software developer events, attracting thousands of
developers and users from all over the world. FOSDEM will be held once
again in Brussels, Belgium, on February 3 & 4, 2018.
This devroom is a collaborative effort, and is organized by dedicated folks
from projects such as OpenStack, Xen Project, oVirt, QEMU, and
Foreman. We would like to invite all those who are involved in these fields
to submit your proposals by December 1st, 2017.
About the Devroom
The Virtualization & IaaS devroom will feature session topics such as open
source hypervisors and virtual machine managers such as Xen Project, KVM,
bhyve, and VirtualBox, and Infrastructure-as-a-Service projects such as
Apache CloudStack, OpenStack, oVirt, QEMU, OpenNebula, and Ganeti.
This devroom will host presentations that focus on topics of shared
interest, such as KVM; libvirt; shared storage; virtualized networking;
cloud security; clustering and high availability; interfacing with multiple
hypervisors; hyperconverged deployments; and scaling across hundreds or
thousands of servers.
Presentations in this devroom will be aimed at developers working on these
platforms who are looking to collaborate and improve shared infrastructure
or solve common problems. We seek topics that encourage dialog between
projects and continued work post-FOSDEM.
Important Dates
Submission deadline: 01 December 2017
Acceptance notifications: 14 December 2017
Final schedule announcement: 21 December 2017
Devroom: 03 and 04 February 2018 (two days, different rooms)
Submit Your Proposal
All submissions must be made via the Pentabarf event planning site[1]. If
you have not used Pentabarf before, you will need to create an account. If
you submitted proposals for FOSDEM in previous years, you can use your
existing account.
After creating the account, select Create Event to start the submission
process. Make sure to select Virtualization and IaaS devroom from the Track
list. Please fill out all the required fields, and provide a meaningful
abstract and description of your proposed session.
Submission Guidelines
We expect more proposals than we can possibly accept, so it is vitally
important that you submit your proposal on or before the deadline. Late
submissions are unlikely to be considered.
All presentation slots are 45 minutes, with 35 minutes planned for
presentations, and 10 minutes for Q&A.
All presentations will be recorded and made available under Creative
Commons licenses. In the Submission notes field, please indicate that you
agree that your presentation will be licensed under the CC-By-SA-4.0 or
CC-By-4.0 license and that you agree to have your presentation recorded.
For example:
"If my presentation is accepted for FOSDEM, I hereby agree to license all
recordings, slides, and other associated materials under the Creative
Commons Attribution Share-Alike 4.0 International License. Sincerely,
<NAME>."
In the Submission notes field, please also confirm that if your talk is
accepted, you will be able to attend FOSDEM and deliver your presentation.
We will not consider proposals from prospective speakers who are unsure
whether they will be able to secure funds for travel and lodging to attend
FOSDEM. (Sadly, we are not able to offer travel funding for prospective
speakers.)
Speaker Mentoring Program
As a part of the rising efforts to grow our communities and encourage a
diverse and inclusive conference ecosystem, we're happy to announce that
we'll be offering mentoring for new speakers. Our mentors can help you with
tasks such as reviewing your abstract, reviewing your presentation outline
or slides, or practicing your talk with you.
You may apply to the mentoring program as a newcomer speaker if you:
- have never presented before, or
- have presented only lightning talks, or
- have presented full-length talks at small meetups (<50 people)
Submission Guidelines
Mentored presentations will have 25-minute slots, where 20 minutes will
include the presentation and 5 minutes will be reserved for questions.
The number of newcomer session slots is limited, so we will probably not be
able to accept all applications.
You must submit your talk and abstract to apply for the mentoring program;
our mentors are volunteering their time and will happily provide feedback,
but won't write your presentation for you!
If you are experiencing problems with Pentabarf, the proposal submission
interface, or have other questions, you can email our devroom mailing
list[2] and we will try to help you.
How to Apply
In addition to agreeing to video recording and confirming that you can
attend FOSDEM in case your session is accepted, please write "speaker
mentoring program application" in the "Submission notes" field, and list
any prior speaking experience or other relevant information for your
application.
Call for Mentors
Interested in mentoring newcomer speakers? We'd love to have your help!
Please email iaas-virt-devroom at lists.fosdem.org with a short speaker
biography and any specific fields of expertise (for example, KVM,
OpenStack, storage, etc.) so that we can match you with a newcomer speaker
from a similar field. The estimated time investment can be as low as 5-10
hours in total, usually distributed weekly or bi-weekly.
Never mentored a newcomer speaker but interested in trying? Email Brian
Proffitt[3], the mentoring program coordinator, and he will be happy to
answer your questions!
Code of Conduct
Following the release of the updated code of conduct for FOSDEM, we'd like
to remind all speakers and attendees that all of the presentations and
discussions in our devroom are held under the guidelines set in the CoC and
we expect attendees, speakers, and volunteers to follow the CoC at all
times.
If you submit a proposal and it is accepted, you will be required to
confirm that you accept the FOSDEM CoC. If you have any questions about the
CoC or wish to have one of the devroom organizers review your presentation
slides or any other content for CoC compliance, please email us and we will
do our best to assist you.
Call for Volunteers
We are also looking for volunteers to help run the devroom. We need
assistance watching time for the speakers, and helping with video for the
devroom. Please contact me, Brian Proffitt, for more information.
Questions?
If you have any questions about this devroom, please send your questions to
our devroom mailing list. You can also subscribe to the list to receive
updates about important dates, session announcements, and to connect with
other attendees.
See you all at FOSDEM!
[1] https://penta.fosdem.org/submission/FOSDEM18
[2] iaas-virt-devroom at lists.fosdem.org
[3] bkp at redhat.com
Code owners configuration for GitHub projects
by Martin Sivak
Hi,
I recently noticed GitHub enabled a feature that allows specifying
code owners for different pieces of code:
https://github.com/blog/2392-introducing-code-owners
It supposedly adds the proper reviewers to patches automatically.
We have a similar feature enabled in Gerrit, and it might make sense for
our GitHub-specific projects to do the same. (It might even make sense
to follow the same format in Gerrit.)
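For illustration, the rules live in a CODEOWNERS file at the root of the repository (or under .github/); a minimal sketch, with the paths and GitHub handles invented for illustration:

```
# Hypothetical CODEOWNERS sketch; paths and handles below are invented.
# Fallback owners for anything not matched by a later rule.
*            @ovirt/maintainers

# Subsystem-specific reviewers (the last matching pattern wins).
/packaging/  @example-packaging-dev
*.py         @example-python-dev
```

GitHub then automatically requests reviews from the matching owners when a pull request touches those paths.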
Martin Sivak
Host deploy fails due to missing Ansible playbook (engine dev env)
by Fred Rolland
Hi,
When I try to add a new host in an engine running in a dev environment I
get the following error:
2017-11-05 16:05:28,544+02 WARN
[org.ovirt.engine.core.common.utils.ansible.AnsibleExecutor]
(EE-ManagedThreadFactory-engine-Thread-18)
[350cd700-493b-4259-8d7a-0e63a06f6cd8] Playbook
'/home/frolland/ovirt-engine/share/ovirt-engine/../ovirt-ansible-roles/playbooks/ovirt-host-deploy.yml'
does not exist, please ensure that ovirt-ansible-roles package is properly
installed.
Installing 'ovirt-ansible-roles' does not solve anything, as the host-deploy
process tries to run the playbook from a location relative to where the
engine is running.
I manually copied the playbook as a workaround.
Can we have a more robust solution when running the engine in dev mode?
Thanks,
Freddy
[ANN] oVirt 4.2.0 First Beta Release is now available for testing
by Sandro Bonazzola
The oVirt Project is pleased to announce the availability of the First Beta
Release of oVirt 4.2.0, as of October 31st, 2017
This is pre-release software. It should not be used in production.
Please take a look at our community page[1] to learn how to ask questions
and interact with developers and users.
All issues or bugs should be reported via oVirt Bugzilla[2].
This update is the first beta release of the 4.2.0 version. This release
brings more than 230 enhancements and more than one thousand bug fixes,
including more than 380 high or urgent severity fixes, on top of oVirt 4.1
series.
What's new in oVirt 4.2.0?
- The Administration Portal has been completely redesigned using
Patternfly, a widely adopted standard in web application design. It now
features a cleaner, more intuitive design for an improved user experience.
- There is an all-new VM Portal for non-admin users.
- A new High Performance virtual machine type has been added to the New VM
dialog box in the Administration Portal.
- Open Virtual Network (OVN) adds support for Open vSwitch software-defined
networking (SDN).
- oVirt now supports NVIDIA vGPU.
- The ovirt-ansible-roles package helps users with common administration
tasks.
- Virt-v2v now supports Debian/Ubuntu-based VMs.
For more information about these and other features, check out the oVirt
4.2.0 blog post <https://ovirt.org/blog/2017/09/introducing-ovirt-4.2.0/>.
This release is available now on x86_64 architecture for:
* Red Hat Enterprise Linux 7.4 or later
* CentOS Linux (or similar) 7.4 or later
This release supports Hypervisor Hosts on x86_64 and ppc64le architectures
for:
* Red Hat Enterprise Linux 7.4 or later
* CentOS Linux (or similar) 7.4 or later
* oVirt Node 4.2 (available for x86_64 only)
See the release notes draft [3] for installation / upgrade instructions and
a list of new features and bugs fixed.
Notes:
- oVirt Appliance is already available.
- An async release of oVirt Node will follow soon.
Additional Resources:
* Read more about the oVirt 4.2.0 release highlights:
http://www.ovirt.org/release/4.2.0/
* Get more oVirt project updates on Twitter: https://twitter.com/ovirt
* Check out the latest project news on the oVirt blog:
http://www.ovirt.org/blog/
[1] https://www.ovirt.org/community/
[2] https://bugzilla.redhat.com/enter_bug.cgi?classification=oVirt
[3] http://www.ovirt.org/release/4.2.0/
[4] http://resources.ovirt.org/pub/ovirt-4.2-pre/iso/
--
SANDRO BONAZZOLA
ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D
Red Hat EMEA <https://www.redhat.com/>
Network is already in use by interface virbr2 error
by Dafna Ron
Hi,
We have an ongoing issue with some builds failing due to a failure to
clean up a network from libvirt.
The error is:
libvirtError: internal error: Network is already in use by interface virbr2
We are working to fix the issue and will update once it's resolved.
Thanks,
Dafna
-------- Forwarded Message --------
Subject: [CQ]: 82219, 14 (ovirt-engine) failed "ovirt-master" system
tests, but isn't the failure root cause
Date: Thu, 2 Nov 2017 14:55:41 +0000 (UTC)
From: oVirt Jenkins <jenkins(a)ovirt.org>
To: infra(a)ovirt.org
A system test invoked by the "ovirt-master" change queue including change
82219,14 (ovirt-engine) failed. However, this change seems not to be the root
cause for this failure. Change 83472,4 (ovirt-engine) that this change depends
on or is based on, was detected as the cause of the testing failures.
This change had been removed from the testing queue. Artifacts built from this
change will not be released until either change 83472,4 (ovirt-engine) is fixed
and this change is updated to refer to or rebased on the fixed version, or this
change is modified to no longer depend on it.
For further details about the change see:
https://gerrit.ovirt.org/#/c/82219/14
For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/83472/4
For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3594/
_______________________________________________
Infra mailing list
Infra(a)ovirt.org
http://lists.ovirt.org/mailman/listinfo/infra
[ OST Failure Report ] [ oVirt 4.1 ] [ Nov 1st 2017 ] [002_bootstrap.add_dc, 002_bootstrap.add_master_storage_domain ]
by Dafna Ron
Hi,
We have failures in the 4.1 imagebased runs.
I think it's also related to the same package from 4.2 that is
downloaded in the 4.1 repo.
2017-11-01 13:17:53,817::ssh.py::ssh::96::lago.ssh::DEBUG::Command 10946280 on lago-basic-suite-4-1-host-0 errors:
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm-client >= 4.20.0
Available: vdsm-client-4.19.35-2.gitc1d5a55.el7.centos.noarch (alocalsync)
vdsm-client = 4.19.35-2.gitc1d5a55.el7.centos
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm >= 4.20.0
Available: vdsm-4.19.35-2.gitc1d5a55.el7.centos.x86_64 (alocalsync)
vdsm = 4.19.35-2.gitc1d5a55.el7.centos
From what I can see, the package is coming from the CentOS mirrors:
yum -y install ovirt-host
rm -rf /dev/shm/*.rpm /dev/shm/yum
"
2017-11-01 13:17:53,817::ssh.py::ssh::81::lago.ssh::DEBUG::Command 10946280 on lago-basic-suite-4-1-host-0 returned with 0
2017-11-01 13:17:53,817::ssh.py::ssh::89::lago.ssh::DEBUG::Command 10946280 on lago-basic-suite-4-1-host-0 output:
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* base: mirror.keystealth.org
* extras: mirror.lax.hugeserver.com
* updates: dallas.tx.mirror.xygenhosting.com
Resolving Dependencies
--> Running transaction check
---> Package ovirt-host.noarch 0:4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos will be installed
--> Processing Dependency: vdsm >= 4.20.0 for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: vdsm-client >= 4.20.0 for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: collectd for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: collectd-disk for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: collectd-netlink for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: collectd-virt for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: collectd-write_http for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: fluentd for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: ovirt-vmconsole for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: ovirt-vmconsole-host for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: rubygem-fluent-plugin-collectd-nest for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: rubygem-fluent-plugin-rewrite-tag-filter for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: rubygem-fluent-plugin-secure-forward for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: rubygem-fluent-plugin-viaq_data_model for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch
--> Processing Dependency: socat for package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.no
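Until the stray build is removed from the repository itself, a client-side workaround is to exclude it at the repo level; a sketch, where the repo id and baseurl are assumptions rather than the actual OST configuration:

```ini
# /etc/yum.repos.d/ovirt-4.1.repo -- repo id and baseurl are hypothetical
[ovirt-4.1-snapshot]
name=oVirt 4.1 snapshot
baseurl=http://resources.ovirt.org/pub/ovirt-4.1-snapshot/rpm/el7/
enabled=1
gpgcheck=0
# Keep 4.2 builds of ovirt-host out of 4.1 dependency resolution:
exclude=ovirt-host-4.2*
```

The same glob can also be passed ad hoc, e.g. `yum install ovirt-host --exclude='ovirt-host-4.2*'`.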
Sandro, can you please help to remove this package?
Thanks,
Dafna
Planned restart of production services
by Evgheni Dereveanchin
Hi everyone,
I'll be restarting several production systems today to perform planned
maintenance.
The following services may be unreachable for some period of time:
- resources.ovirt.org - software repositories
- jenkins.ovirt.org - CI master
No new builds will be started during this period.
I will let you know once the maintenance is complete.
--
Regards,
Evgheni Dereveanchin
[oVirt 4.2 Localization Question #6] Misplaced double quotes?
by Yuko Katabami
Hello oVirt developers.
My next question is as follows:
*File: *UIConstants
*Resource IDs: *highPerformancePopupRecommendationMsgForKsmPart1
highPerformancePopupRecommendationMsgForKsmPart2
*Strings: *
KERNEL SAME PAGE MERGING (KSM):
Please disable KSM by disabling it for the Cluster: "
". This can be done by editing the Cluster and disabling the "Enable KSM"
field.
*Question:* There is something wrong with the usage of double quotes in
these two strings. Could I remove them from the end of the first string
and the beginning of the second string (as well as the period)?
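For context, the two resource strings look designed to be concatenated around the cluster name at runtime, which would explain why the closing quote of Part1 and the opening quote of Part2 sit where they do. A minimal Python sketch of the assumed composition (variable names are illustrative, not the actual oVirt code):

```python
# Illustrative sketch only (not the actual oVirt code): the two resource
# strings appear designed to be concatenated around the cluster name,
# so the quote ending Part1 and the quote starting Part2 wrap that name.
part1 = 'Please disable KSM by disabling it for the Cluster: "'
part2 = ('". This can be done by editing the Cluster and disabling the '
         '"Enable KSM" field.')

def build_message(cluster_name):
    # part1's trailing quote opens the quoted name; part2's leading quote closes it
    return part1 + cluster_name + part2

print(build_message("Default"))
# The cluster name comes out quoted: ... the Cluster: "Default". ...
```

If that is indeed how the strings are composed, removing the quotes would change the rendered message, so it is worth confirming with the UI code before editing them.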
Thank you,
Yuko
[ OST Failure Report ] [ oVirt 4.1 ] [ Nov 1st 2017 ] [ 002_bootstrap.add_master_storage_domain]
by Dafna Ron
Hi,
There are three reported failures on the same test:
002_bootstrap.add_master_storage_domain
This seems to be related to packaging issues.
Please note that all 3 suites failed.
The basic suite failed because of ovirt-host-4.2.0-0.0, and I think all
related issues came from there.
The upgrade suites fail for different packaging issues (pasted at the
end of the mail, but please look at the lago logs for more info).
Link to suspected patches: This is the patch that was reported

https://gerrit.ovirt.org/#/c/83403/

Link to Job:

http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232

Link to all logs:

http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/exported-artifacts/upgrade-from-prevrelease-suit-4.1-el7/lago_logs/lago.log
http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/exported-artifacts/basic-suit-4.1-el7/lago_logs/lago.log
http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/exported-artifacts/upgrade-from-release-suit-4.1-el7/lago_logs/lago.log
http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/

(Relevant) error snippet from the log:

<error>

basic suite:
---> Package rubygem-json.x86_64 0:1.7.7-30.el7 will be installed
--> Finished Dependency Resolution
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2017-11-01 12:59:07,818::ssh.py::ssh::96::lago.ssh::DEBUG::Command 71738e12 on lago-basic-suite-4-1-host-0 errors:
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm-client >= 4.20.0
Available: vdsm-client-4.19.35-2.gitc1d5a55.el7.centos.noarch (alocalsync)
vdsm-client = 4.19.35-2.gitc1d5a55.el7.centos
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm >= 4.20.0
Available: vdsm-4.19.35-2.gitc1d5a55.el7.centos.x86_64 (alocalsync)
vdsm = 4.19.35-2.gitc1d5a55.el7.centos
upgrade-from-prevrelease-suit-4.1-el7:
2017-11-01 12:54:36,678::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:4a7a2559-9f73-4c95-a592-a6c6144843eb:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:36,678::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:39,685::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:40,686::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:4a7a2559-9f73-4c95-a592-a6c6144843eb:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:40,686::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 4 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:41,688::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:f0d5d670-e595-4b3c-af10-5ce91d70c431:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:41,688::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:41,689::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:42,690::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:f0d5d670-e595-4b3c-af10-5ce91d70c431:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:42,690::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:43,692::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:d151f339-0f76-4f42-869f-c08a533a3e54:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:43,692::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:43,692::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:44,694::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:d151f339-0f76-4f42-869f-c08a533a3e54:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:44,694::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:45,695::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:8646939f-b59a-42b1-8ebc-1dec54e5d138:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:45,695::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
Installing : ovirt-vmconsole-1.0.4-1.el7.centos.noarch 154/322
Failed to resolve booleanif statement at /etc/selinux/targeted/tmp/modules/400/ovirt_vmconsole/cil:588
semodule: Failed!
Installing : ovirt-vmconsole-proxy-1.0.4-1.el7.centos.noarch 155/322
libsemanage.semanage_read_policydb: Could not open kernel policy /etc/selinux/targeted/active/policy.kern for reading. (No such file or directory).
OSError: No such file or directory
134/141): vdsm-xmlrpc-4.19.35-2.gitc1d5a55.el7.centos.noa | 25 kB 00:00
(135/141): vdsm-yajsonrpc-4.19.35-2.gitc1d5a55.el7.centos. | 28 kB 00:00
ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64: [Errno 256] No more mirrors to try.
ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
unboundid-ldapsdk-3.2.0-1.el7.noarch: [Errno 256] No more mirrors to try.
2017-11-01 12:59:18,385::log_utils.py::__exit__::611::lago.utils::DEBUG::end task:da936503-c329-49b1-acfc-7658379be2fc:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-prevrelease-suite-4.1/reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_i77mEt/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:59:18,385::reposetup.py::sync_rpm_repository::200::ovirtlago.reposetup::INFO:: - repo: ovirt-4.1-el7: failed, re-running.
2017-11-01 12:59:18,409::reposetup.py::_fix_reposync_issues::146::ovirtlago.reposetup::DEBUG::detected package errors in reposync output in repo_path:/var/lib/lago/ovirt-4.1-el7: ovirt-engine-setup-plugin-live,ovirt-provider-ovn-1.0-8.el7.c,unboundid-ldapsdk-3.2.0-1.el7.,ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch,ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch,ovirt-engine-yarn-0.19.1-2.el7,ovirt-provider-ovn-1.0-8.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.,unboundid-ldapsdk-3.2.0-1.el7.noarch,ovirt-provider-ovn-driver-1.0-,ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64
2017-11-01 12:59:18,411::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/unboundid-ldapsdk-3.2.0-1.el7.noarch.rpm
2017-11-01 12:59:18,413::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch.rpm
2017-11-01 12:59:18,415::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/x86_64/ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64.rpm
2017-11-01 12:59:18,416::reposetup.py::_fix_reposync_issues::167::ovirtlago.reposetup::DEBUG::removed 6 conflicting packages, see https://bugzilla.redhat.com//show_bug.cgi?id=1399235 for more details.
2017-11-01 12:59:18,416::log_utils.py::__enter__::600::lago.utils::DEBUG::start task:6e42e37e-deca-4754-9009-17b063f931a2:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-prevrelease-suite-4.1/reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_i77mEt/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:59:20,628::utils.py::_run_command::189::lago.utils::DEBUG::6e42e37e-deca-4754-9009-17b063f931a2: command exit with return code: 0
2017-11-01 12:59:20,628::utils.py::_run_command::192::lago.utils::DEBUG::6e42e37e-deca-4754-9009-17b063f931a2: command stdout: Package ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm is not signed
2017-11-01 13:02:29,826::ssh.py::ssh::96::lago.ssh::DEBUG::Command eaba52c4 on lago-upgrade-from-prevrelease-suite-4-1-engine errors:
cat: /root/multipath.txt: No such file or directory
upgrade-from-release-suit-4.1-el7:
ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64: [Errno 256] No more mirrors to try.
ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
unboundid-ldapsdk-3.2.0-1.el7.noarch: [Errno 256] No more mirrors to try.
2017-11-01 12:53:12,198::utils.py::_run_command::194::lago.utils::DEBUG::07b8a1a2-7149-49f1-b837-cba126d81521: command stderr: BDB2053 Freeing read locks for locker 0x8: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xa: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xb: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xc: 49578/140137252079680
2017-11-01 12:53:12,198::log_utils.py::__exit__::611::lago.utils::DEBUG::end task:07b8a1a2-7149-49f1-b837-cba126d81521:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-release-suite-4.1/pre-reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_v99dud/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:53:12,199::reposetup.py::sync_rpm_repository::200::ovirtlago.reposetup::INFO:: - repo: ovirt-4.1-el7: failed, re-running.
2017-11-01 12:53:12,225::reposetup.py::_fix_reposync_issues::146::ovirtlago.reposetup::DEBUG::detected package errors in reposync output in repo_path:/var/lib/lago/ovirt-4.1-el7: ovirt-engine-setup-plugin-live,ovirt-provider-ovn-1.0-8.el7.c,unboundid-ldapsdk-3.2.0-1.el7.,ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch,ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch,ovirt-engine-yarn-0.19.1-2.el7,ovirt-provider-ovn-1.0-8.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.,unboundid-ldapsdk-3.2.0-1.el7.noarch,ovirt-provider-ovn-driver-1.0-,ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64
2017-11-01 12:53:12,227::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch.rpm
2017-11-01 12:53:12,227::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:53:12,228::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm
2017-11-01 12:53:12,228::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:53:12,229::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/unboundid-ldapsdk-3.2.0-1.el7.noarch.rpm
2017-11-01 12:53:12,231::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/x86_64/ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64.rpm
2017-11-01 12:53:12,232::reposetup.py::_fix_reposync_issues::167::ovirtlago.reposetup::DEBUG::removed 6 conflicting packages, see https://bugzilla.redhat.com//show_bug.cgi?id=1399235 for more details.
2017-11-01 12:53:12,232::log_utils.py::__enter__::600::lago.utils::DEBUG::start task:4c36637d-e4d7-488d-a0e5-590e4b1e28e6:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-release-suite-4.1/pre-reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_v99dud/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:53:14,457::utils.py::_run_command::189::lago.utils::DEBUG::4c36637d-e4d7-488d-a0e5-590e4b1e28e6: command exit with return code: 0
2017-11-01 12:53:14,457::utils.py::_run_command::192::lago.utils::DEBUG::4c36637d-e4d7-488d-a0e5-590e4b1e28e6: command stdout: Package ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm is not signed
2017-11-01 12:54:52,287::log_utils.py::__enter__::600::lago.prefix::INFO::[0m[0m
2017-11-01 12:54:52,288::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:1cfeda92-2104-4c35-8868-37f1b3e175ed:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:52,288::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:55,294::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:54:56,295::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:1cfeda92-2104-4c35-8868-37f1b3e175ed:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:56,296::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 4 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:57,297::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:09223161-d627-4306-abb0-218678b6eb1f:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:57,297::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:57,298::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:54:58,299::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:09223161-d627-4306-abb0-218678b6eb1f:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:58,299::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:59,300::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:0981c19b-77bf-43fd-8e2f-f5519f74dbf0:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:59,301::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:59,301::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:55:00,302::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:0981c19b-77bf-43fd-8e2f-f5519f74dbf0:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:00,303::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,304::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:80c0243d-c589-496d-a4c0-75e62e37e17e:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:01,304::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,471::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:80c0243d-c589-496d-a4c0-75e62e37e17e:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:01,588::ssh.py::ssh::58::lago.ssh::DEBUG::Running dfa2bbf2 on lago-upgrade-from-release-suite-4-1-engine: true
2017-11-01 12:55:01,611::ssh.py::ssh::81::lago.ssh::DEBUG::Command dfa2bbf2 on lago-upgrade-from-release-suite-4-1-engine returned with 0
2017-11-01 12:55:01,611::ssh.py::wait_for_ssh::153::lago.ssh::DEBUG::Wait succeeded for ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,611::log_utils.py::__exit__::611::lago.prefix::INFO::[32mSuccess[0m (in 0:00:09)
2017-11-01 12:55:01,612::log_utils.py::__enter__::600::lago.prefix::INFO::[0m[0m
2017-11-01 12:55:
</error>
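As an aside, the reposync recovery visible in the log (spot the "No more mirrors to try" errors, delete the conflicting RPMs, then re-run reposync) can be sketched as below. This is a minimal illustration of the pattern only, not the actual ovirtlago `_fix_reposync_issues` code; the helper name and regex are assumptions based on the log lines above:

```python
import re

def find_broken_packages(reposync_output):
    # Packages reposync could not download are reported one per line as
    # "<nevra>: [Errno 256] No more mirrors to try."
    return re.findall(r"^(\S+): \[Errno 256\] No more mirrors to try",
                      reposync_output, re.MULTILINE)

# After extracting the names, the fix-up step deletes the matching .rpm
# files under the repo path and runs reposync once more, matching the
# "removing conflicting RPM" and "failed, re-running" lines in the log.
```

If this is the race, the second reposync run succeeding (return code 0 above) would explain why the suites still continue to the package-install failures later.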
--------------4543E9AA1C72328340227CEF
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: 8bit
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<p>Hi, <br>
</p>
<p>There are three reported failures on the same test:
002_bootstrap.add_master_storage_domain</p>
<p>This seems to be related to packages issues. <br>
</p>
<p>Please note that all 3 suits failed. </p>
<p>Basic suit failed because of ovirt-host-4.2.0-0.0 and i think all
related issues came from there. <br>
</p>
<p>upgrade suits fail for different packaging issues (will paste
issues at the end of the mail but please look at lago logs for
more info)</p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f">
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Link to suspected patches: This is the patch that was reported</span></p>
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
</span></p>
<br>
</b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="https://gerrit.ovirt.org/#/c/83403/">https://gerrit.ovirt.org/#/c/83403/</a></b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><br>
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Link to Job:</span></p>
<br>
</b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232">http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232</a></b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><br>
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Link to all logs:</span></p>
<br>
</b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...">http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...</a></b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...">http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...</a></b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...">http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/...</a><br>
</b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><b><a class="moz-txt-link-freetext" href="http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/">http://jenkins.ovirt.org/job/ovirt-4.1_change-queue-tester/1232/artifact/</a></b></b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><br>
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">(Relevant) error snippet from the log: </span></p>
<p dir="ltr"
style="line-height:1.38;margin-top:0pt;margin-bottom:0pt;"><span style="font-size:11pt;font-family:Arial;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"><error></span></p>
<br>
basic suit: <br>
</b></p>
<p><b style="font-weight:normal;"
id="docs-internal-guid-5859b7a1-7827-5a99-bb29-1c8f72c98a2f"><br>
</b></p>
<p><br>
</p>
<pre style="color: rgb(0, 0, 0); font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration-style: initial; text-decoration-color: initial;">---> Package rubygem-json.x86_64 0:1.7.7-30.el7 will be installed
--> Finished Dependency Resolution
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2017-11-01 12:59:07,818::ssh.py::<a class="moz-txt-link-freetext" href="ssh::96::lago.ssh::DEBUG::Command">ssh::96::lago.ssh::DEBUG::Command</a> 71738e12 on lago-basic-suite-4-1-host-0 errors:
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm-client >= 4.20.0
Available: vdsm-client-4.19.35-2.gitc1d5a55.el7.centos.noarch (alocalsync)
vdsm-client = 4.19.35-2.gitc1d5a55.el7.centos
Error: Package: ovirt-host-4.2.0-0.0.master.20171019135233.git1921fc6.el7.centos.noarch (alocalsync)
Requires: vdsm >= 4.20.0
Available: vdsm-4.19.35-2.gitc1d5a55.el7.centos.x86_64 (alocalsync)
vdsm = 4.19.35-2.gitc1d5a55.el7.centos
</pre>
<p style="color: rgb(0, 0, 0); font-style: normal;
font-variant-ligatures: normal; font-variant-caps: normal;
font-weight: normal; letter-spacing: normal; orphans: 2;
text-align: start; text-indent: 0px; text-transform: none; widows:
2; word-spacing: 0px; -webkit-text-stroke-width: 0px;
text-decoration-style: initial; text-decoration-color: initial;">upgrade-from-prevrelease-suit-4.1-el7upgrade-from-prevrelease-suit-4.1-el7<br>
</p>
<br>
<pre style="color: rgb(0, 0, 0); font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration-style: initial; text-decoration-color: initial;">2017-11-01 12:54:36,678::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:4a7a2559-9f73-4c95-a592-a6c6144843eb:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:36,678::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:39,685::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:40,686::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:4a7a2559-9f73-4c95-a592-a6c6144843eb:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:40,686::ssh.py::wait_for_<a class="moz-txt-link-freetext" href="ssh::129::lago.ssh::DEBUG::Got">ssh::129::lago.ssh::DEBUG::Got</a> exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 4 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:41,688::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:f0d5d670-e595-4b3c-af10-5ce91d70c431:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:41,688::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:41,689::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:42,690::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:f0d5d670-e595-4b3c-af10-5ce91d70c431:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:42,690::ssh.py::wait_for_<a class="moz-txt-link-freetext" href="ssh::129::lago.ssh::DEBUG::Got">ssh::129::lago.ssh::DEBUG::Got</a> exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:43,692::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:d151f339-0f76-4f42-869f-c08a533a3e54:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:43,692::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:43,692::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-prevrelease-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.203.2
2017-11-01 12:54:44,694::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:d151f339-0f76-4f42-869f-c08a533a3e54:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:44,694::ssh.py::wait_for_<a class="moz-txt-link-freetext" href="ssh::129::lago.ssh::DEBUG::Got">ssh::129::lago.ssh::DEBUG::Got</a> exception while sshing to lago-upgrade-from-prevrelease-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-prevrelease-suite-4-1-engine
2017-11-01 12:54:45,695::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:8646939f-b59a-42b1-8ebc-1dec54e5d138:Get ssh client for lago-upgrade-from-prevrelease-suite-4-1-engine:
2017-11-01 12:54:45,695::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-prevrelease-suite-4-1-engine
Installing : ovirt-vmconsole-1.0.4-1.el7.centos.noarch 154/322
Failed to resolve booleanif statement at /etc/selinux/targeted/tmp/modules/400/ovirt_vmconsole/cil:588
semodule: Failed!
Installing : ovirt-vmconsole-proxy-1.0.4-1.el7.centos.noarch 155/322
libsemanage.semanage_read_policydb: Could not open kernel policy /etc/selinux/targeted/active/policy.kern for reading. (No such file or directory).
OSError: No such file or directory
134/141): vdsm-xmlrpc-4.19.35-2.gitc1d5a55.el7.centos.noa | 25 kB 00:00
(135/141): vdsm-yajsonrpc-4.19.35-2.gitc1d5a55.el7.centos. | 28 kB 00:00
ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64: [Errno 256] No more mirrors to try.
ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
unboundid-ldapsdk-3.2.0-1.el7.noarch: [Errno 256] No more mirrors to try.
2017-11-01 12:59:18,385::log_utils.py::__exit__::611::lago.utils::DEBUG::end task:da936503-c329-49b1-acfc-7658379be2fc:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-prevrelease-suite-4.1/reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_i77mEt/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:59:18,385::reposetup.py::sync_rpm_repository::200::ovirtlago.reposetup::INFO:: - repo: ovirt-4.1-el7: failed, re-running.
2017-11-01 12:59:18,409::reposetup.py::_fix_reposync_issues::146::ovirtlago.reposetup::DEBUG::detected package errors in reposync output in repo_path:/var/lib/lago/ovirt-4.1-el7: ovirt-engine-setup-plugin-live,ovirt-provider-ovn-1.0-8.el7.c,unboundid-ldapsdk-3.2.0-1.el7.,ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch,ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch,ovirt-engine-yarn-0.19.1-2.el7,ovirt-provider-ovn-1.0-8.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.,unboundid-ldapsdk-3.2.0-1.el7.noarch,ovirt-provider-ovn-driver-1.0-,ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64
2017-11-01 12:59:18,411::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:59:18,412::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/unboundid-ldapsdk-3.2.0-1.el7.noarch.rpm
2017-11-01 12:59:18,413::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch.rpm
2017-11-01 12:59:18,415::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/x86_64/ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64.rpm
2017-11-01 12:59:18,416::reposetup.py::_fix_reposync_issues::167::ovirtlago.reposetup::DEBUG::removed 6 conflicting packages, see https://bugzilla.redhat.com//show_bug.cgi?id=1399235 for more details.
2017-11-01 12:59:18,416::log_utils.py::__enter__::600::lago.utils::DEBUG::start task:6e42e37e-deca-4754-9009-17b063f931a2:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-prevrelease-suite-4.1/reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_i77mEt/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:59:20,628::utils.py::_run_command::189::lago.utils::DEBUG::6e42e37e-deca-4754-9009-17b063f931a2: command exit with return code: 0
2017-11-01 12:59:20,628::utils.py::_run_command::192::lago.utils::DEBUG::6e42e37e-deca-4754-9009-17b063f931a2: command stdout: Package ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm is not signed
2017-11-01 13:02:29,826::ssh.py::ssh::96::lago.ssh::DEBUG::Command eaba52c4 on lago-upgrade-from-prevrelease-suite-4-1-engine errors:
cat: /root/multipath.txt: No such file or directory
</pre>
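For context, the "removing conflicting RPM" lines above come from ovirtlago's workaround for the reposync mirror bug (BZ 1399235): on a failed sync it parses the output for "[Errno 256] No more mirrors to try" errors, deletes the stale local copies of those RPMs, and re-runs reposync. A minimal sketch of that detect-and-remove step (function names here are assumptions for illustration, not the actual ovirtlago API):

```python
import glob
import os
import re

# Matches lines like:
#   ovirt-provider-ovn-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ERRNO_256 = re.compile(r"^(\S+): \[Errno 256\] No more mirrors to try\.$", re.MULTILINE)


def parse_conflicting_packages(reposync_output):
    """Return the package names reposync reported as unfetchable."""
    return ERRNO_256.findall(reposync_output)


def remove_conflicting_rpms(repo_path, packages):
    """Delete stale local copies of the reported packages so the next
    reposync run fetches fresh ones (sketch; mirrors the log's
    /var/lib/lago/<repo>/<arch>/<pkg>.rpm layout)."""
    removed = []
    for pkg in packages:
        for rpm in glob.glob(os.path.join(repo_path, "*", pkg + ".rpm")):
            os.remove(rpm)
            removed.append(rpm)
    return removed
```

After the cleanup, the tool simply re-runs the same reposync command, which is why the log shows a second "start task ... reposync" entry that exits with return code 0.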
<p>upgrade-from-release-suite-4.1-el7</p>
<pre>
ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64: [Errno 256] No more mirrors to try.
ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch: [Errno 256] No more mirrors to try.
ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch: [Errno 256] No more mirrors to try.
unboundid-ldapsdk-3.2.0-1.el7.noarch: [Errno 256] No more mirrors to try.
2017-11-01 12:53:12,198::utils.py::_run_command::194::lago.utils::DEBUG::07b8a1a2-7149-49f1-b837-cba126d81521: command stderr: BDB2053 Freeing read locks for locker 0x8: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xa: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xb: 49578/140137252079680
BDB2053 Freeing read locks for locker 0xc: 49578/140137252079680
2017-11-01 12:53:12,198::log_utils.py::__exit__::611::lago.utils::DEBUG::end task:07b8a1a2-7149-49f1-b837-cba126d81521:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-release-suite-4.1/pre-reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_v99dud/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:53:12,199::reposetup.py::sync_rpm_repository::200::ovirtlago.reposetup::INFO:: - repo: ovirt-4.1-el7: failed, re-running.
2017-11-01 12:53:12,225::reposetup.py::_fix_reposync_issues::146::ovirtlago.reposetup::DEBUG::detected package errors in reposync output in repo_path:/var/lib/lago/ovirt-4.1-el7: ovirt-engine-setup-plugin-live,ovirt-provider-ovn-1.0-8.el7.c,unboundid-ldapsdk-3.2.0-1.el7.,ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch,ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch,ovirt-engine-yarn-0.19.1-2.el7,ovirt-provider-ovn-1.0-8.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch,ovirt-scheduler-proxy-0.1.7-1.,unboundid-ldapsdk-3.2.0-1.el7.noarch,ovirt-provider-ovn-driver-1.0-,ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64
2017-11-01 12:53:12,227::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-scheduler-proxy-0.1.7-1.el7.centos.noarch.rpm
2017-11-01 12:53:12,227::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-driver-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:53:12,228::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm
2017-11-01 12:53:12,228::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/ovirt-provider-ovn-1.0-8.el7.centos.noarch.rpm
2017-11-01 12:53:12,229::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/noarch/unboundid-ldapsdk-3.2.0-1.el7.noarch.rpm
2017-11-01 12:53:12,231::reposetup.py::_fix_reposync_issues::157::ovirtlago.reposetup::INFO:: - removing conflicting RPM: /var/lib/lago/ovirt-4.1-el7/x86_64/ovirt-engine-yarn-0.19.1-2.el7.centos.x86_64.rpm
2017-11-01 12:53:12,232::reposetup.py::_fix_reposync_issues::167::ovirtlago.reposetup::DEBUG::removed 6 conflicting packages, see https://bugzilla.redhat.com//show_bug.cgi?id=1399235 for more details.
2017-11-01 12:53:12,232::log_utils.py::__enter__::600::lago.utils::DEBUG::start task:4c36637d-e4d7-488d-a0e5-590e4b1e28e6:Run command: "reposync" "--config" "/home/jenkins/workspace/ovirt-4.1_change-queue-tester/ovirt-system-tests/upgrade-from-release-suite-4.1/pre-reposync-config.repo" "--download_path" "/var/lib/lago" "--newest-only" "--delete" "--cachedir" "/tmp/reposync_v99dud/cache" "--repoid" "ovirt-4.1-el7":
2017-11-01 12:53:14,457::utils.py::_run_command::189::lago.utils::DEBUG::4c36637d-e4d7-488d-a0e5-590e4b1e28e6: command exit with return code: 0
2017-11-01 12:53:14,457::utils.py::_run_command::192::lago.utils::DEBUG::4c36637d-e4d7-488d-a0e5-590e4b1e28e6: command stdout: Package ovirt-engine-setup-plugin-live-4.1.0-1.el7.centos.noarch.rpm is not signed
2017-11-01 12:54:52,287::log_utils.py::__enter__::600::lago.prefix::INFO::
2017-11-01 12:54:52,288::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:1cfeda92-2104-4c35-8868-37f1b3e175ed:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:52,288::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:55,294::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:54:56,295::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:1cfeda92-2104-4c35-8868-37f1b3e175ed:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:56,296::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 4 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:57,297::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:09223161-d627-4306-abb0-218678b6eb1f:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:57,297::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:57,298::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:54:58,299::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:09223161-d627-4306-abb0-218678b6eb1f:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:58,299::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:59,300::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:0981c19b-77bf-43fd-8e2f-f5519f74dbf0:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:54:59,301::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:54:59,301::ssh.py::get_ssh_client::354::lago.ssh::DEBUG::Socket error connecting to lago-upgrade-from-release-suite-4-1-engine: [Errno None] Unable to connect to port 22 on 192.168.200.2
2017-11-01 12:55:00,302::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:0981c19b-77bf-43fd-8e2f-f5519f74dbf0:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:00,303::ssh.py::wait_for_ssh::129::lago.ssh::DEBUG::Got exception while sshing to lago-upgrade-from-release-suite-4-1-engine: Timed out (in 1 s) trying to ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,304::log_utils.py::__enter__::600::lago.ssh::DEBUG::start task:80c0243d-c589-496d-a4c0-75e62e37e17e:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:01,304::ssh.py::get_ssh_client::339::lago.ssh::DEBUG::Still got 1 tries for lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,471::log_utils.py::__exit__::611::lago.ssh::DEBUG::end task:80c0243d-c589-496d-a4c0-75e62e37e17e:Get ssh client for lago-upgrade-from-release-suite-4-1-engine:
2017-11-01 12:55:01,588::ssh.py::ssh::58::lago.ssh::DEBUG::Running dfa2bbf2 on lago-upgrade-from-release-suite-4-1-engine: true
2017-11-01 12:55:01,611::ssh.py::ssh::81::lago.ssh::DEBUG::Command dfa2bbf2 on lago-upgrade-from-release-suite-4-1-engine returned with 0
2017-11-01 12:55:01,611::ssh.py::wait_for_ssh::153::lago.ssh::DEBUG::Wait succeeded for ssh to lago-upgrade-from-release-suite-4-1-engine
2017-11-01 12:55:01,611::log_utils.py::__exit__::611::lago.prefix::INFO::Success (in 0:00:09)
2017-11-01 12:55:01,612::log_utils.py::__enter__::600::lago.prefix::INFO::
2017-11-01 12:55:
</pre>
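The retry loop above (get_ssh_client failing with socket errors until the engine VM's sshd comes up, then a trivial `true` command confirming the connection) is lago's standard wait-for-ssh behavior. A minimal, self-contained sketch of that polling pattern (function name and parameters are assumptions for illustration, not lago's actual API):

```python
import socket
import time


def wait_for_ssh(host, port=22, timeout=180, interval=2.0):
    """Poll until the host accepts TCP connections on the ssh port.

    Mirrors the log's behavior: each failed attempt logs a socket error
    and the loop retries until the deadline.  In lago, a successful
    connection is then confirmed by running `true` over ssh.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True  # port is open; sshd is (probably) up
        except OSError:
            # Connection refused / timed out -- VM still booting; retry.
            time.sleep(interval)
    return False
```

The point for the OST race question is that this loop only proves sshd is reachable, not that any later-started service or VM state has settled, which is consistent with the "interface plugged to a running VM" failure being a timing issue rather than a product bug.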
</error>
</body>
</html>
--------------4543E9AA1C72328340227CEF--