[vdsm] stable branch ovirt-4.2 created
by Francesco Romani
Hi all,
With the help of Sandro (many thanks @sbonazzo!), we created the
ovirt-4.2 stable branch minutes ago.
Steps performed:
1. merged https://gerrit.ovirt.org/#/c/87070/
2. branched out ovirt-4.2 from git master
3. merged https://gerrit.ovirt.org/#/c/87181/ to add support for 4.3 level
4. created and pushed the tag v4.30.0 from master, to make sure the
version number is greater than the stable versions, and to (somehow :))
align with oVirt versioning
5. tested make dist/make rpm on both the new ovirt-4.2 branch and master;
both look good and use the right version
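The version-ordering rationale behind step 4 can be sketched with a version-aware sort. This is a minimal illustration only: the v4.20.* stable tag names below are made up for the example, not taken from the repository.

```shell
# With version-aware sorting, the new master tag v4.30.0 stays above any
# 4.2-era stable tag, so master builds always carry the greater version.
# (The v4.20.* tag names are hypothetical examples.)
newest=$(printf '%s\n' v4.20.17 v4.20.18 v4.30.0 | sort -V | tail -n 1)
echo "$newest"   # prints v4.30.0
```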
Maintainers, please check it looks right for you before merging any new
patch to master branch.
Please let me know about any issue!
Happy hacking,
--
Francesco Romani
Senior SW Eng., Virtualization R&D
Red Hat
IRC: fromani github: @fromanirh
[ OST Failure Report ] [ oVirt Master (vdsm) ] [ 08-02-2018 ] [ 002_bootstrap.get_host_numa_nodes+ 004_basic_sanity.vm_run]
by Dafna Ron
Hi,
We have a failure on 002_bootstrap.get_host_numa_nodes in the basic suite and
on 004_basic_sanity.vm_run in the upgrade-from-release suite.
Eli, can you please take a look? It seems to be the same reason: the host is
down or not suitable.
Link and headline of suspected patches:
Link to Job:
Link to all logs:
(Relevant) error snippet from the log: <error>
basic suite:
2018-02-08 00:14:23,852-05 DEBUG
[org.ovirt.engine.core.dal.dbbroker.PostgresDbEngineDialect$PostgresSimpleJdbcCall]
(ServerService Thread Pool -- 41) [] SqlCall for procedure
[GetAllFromVdcOption] compiled
2018-02-08 00:14:23,864-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'ConfigDir'
2018-02-08 00:14:23,869-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'DbJustRestored'
2018-02-08 00:14:23,871-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'ConfigDir'
2018-02-08 00:14:23,881-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'DbJustRestored'
2018-02-08 00:14:23,882-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'ConfigDir'
2018-02-08 00:14:23,915-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
41) [] Could not find enum value for option: 'DbJustRestored'
2018-02-08 00:14:23,919-05 INFO
[org.ovirt.engine.core.utils.osinfo.OsInfoPreferencesLoader] (ServerService
Thread Pool -- 41) [] Loading file
'/etc/ovirt-engine/osinfo.conf.d/00-defaults.properties'
2018-02-08 00:14:23,971-05 DEBUG
[org.ovirt.engine.core.utils.OsRepositoryImpl] (ServerService Thread Pool
-- 41) [] Osinfo Repository:
backwardCompatibility
OtherLinux=5
Windows2008R2x64=17
Windows2003=3
Windows2003x64=10
RHEL3x64=15
Windows8x64=21
Windows8=20
Windows7=11
Windows7x64=12
Windows2008=4
RHEL4=8
RHEL6x64=19
RHEL5x64=13
Windows2012x64=23
WindowsXP=1
RHEL4x64=14
Unassigned=0
Windows2008x64=16
RHEL6=18
RHEL5=7
Other=0
REHL3=9
emptyNode
os.debian_7.derivedFrom
value=ubuntu_12_04
os.debian_7.id
value=1300
os.debian_7.name
value=Debian 7
os.freebsd.bus
value=32
os.freebsd.derivedFrom
value=other
os.freebsd.id
value=1500
os.freebsd.name
value=FreeBSD 9.2
os.freebsdx64.bus
value=64
os.freebsdx64.derivedFrom
:
upgrade suite:
2018-02-08 00:12:07,451-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
51) [] Could not find enum value for option: 'DbJustRestored'
2018-02-08 00:12:07,451-05 WARN
[org.ovirt.engine.core.utils.ConfigUtilsBase] (ServerService Thread Pool --
51) [] Could not find enum value for option: 'ConfigDir'
2018-02-08 00:12:07,452-05 INFO
[org.ovirt.engine.core.utils.osinfo.OsInfoPreferencesLoader] (ServerService
Thread Pool -- 51) [] Loading file
'/etc/ovirt-engine/osinfo.conf.d/00-defaults.properties'
2018-02-08 00:12:07,505-05 DEBUG
[org.ovirt.engine.core.utils.OsRepositoryImpl] (ServerService Thread Pool
-- 51) [] Osinfo Repository:
backwardCompatibility
OtherLinux=5
Windows2008R2x64=17
Windows2003=3
Windows2003x64=10
RHEL3x64=15
Windows8x64=21
Windows8=20
Windows7=11
Windows7x64=12
Windows2008=4
RHEL4=8
RHEL6x64=19
RHEL5x64=13
Windows2012x64=23
WindowsXP=1
RHEL4x64=14
Unassigned=0
Windows2008x64=16
RHEL6=18
RHEL5=7
Other=0
REHL3=9
emptyNode
os.debian_7.derivedFrom
value=ubuntu_12_04
os.debian_7.id
value=1300
os.debian_7.name
value=Debian 7
os.freebsd.bus
value=32
os.freebsd.derivedFrom
value=other
os.freebsd.id
value=1500
os.freebsd.name
value=FreeBSD 9.2
os.freebsdx64.bus
value=64
os.freebsdx64.derivedFrom
value=freebsd
os.freebsdx64.id
value=1501
os.freebsdx64.name
value=FreeBSD 9.2 x64
os.other.bus
value=64
os.other.cpu.hotplugSupport
value=true
os.other.cpu.hotunplugSupport
value=false
os.other.cpu.unsupported
value=
</error>
[ OST Failure Report ] [ oVirt Master (ovirt-engine) ] [ 08-02-2018 ] [ 001_initialize_engine.initialize_engine ]
by Dafna Ron
Hi,
We are failing test 001_initialize_engine.initialize_engine
It seems that the issue is related to the ovirt-engine-dwhd.service not
starting.
Link and headline of suspected patches:
https://gerrit.ovirt.org/#/c/86920/ - core: adding key to bundles/AuditLogMessages
Link to Job:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/5365/
Link to all logs:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/5365/artifact/
(Relevant) error snippet from the log:
<error>
Feb 7 22:43:23 lago-basic-suite-master-engine dbus[525]: [system] Reloaded configuration
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'beyond-scope' is not supported by the kernel for ipv6.
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: beyond-scope: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'failed-policy' is not supported by the kernel for ipv6.
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: failed-policy: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'reject-route' is not supported by the kernel for ipv6.
Feb 7 22:43:24 lago-basic-suite-master-engine firewalld[565]: WARNING: reject-route: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'beyond-scope' is not supported by the kernel for ipv6.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: beyond-scope: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'failed-policy' is not supported by the kernel for ipv6.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: failed-policy: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: ICMP type 'reject-route' is not supported by the kernel for ipv6.
Feb 7 22:43:28 lago-basic-suite-master-engine firewalld[565]: WARNING: reject-route: INVALID_ICMPTYPE: No supported ICMP type., ignoring for run-time.
Feb 7 22:43:28 lago-basic-suite-master-engine systemd: Starting oVirt Engine fence_kdump listener...
Feb 7 22:43:28 lago-basic-suite-master-engine systemd: Started oVirt Engine fence_kdump listener.
Feb 7 22:43:28 lago-basic-suite-master-engine systemd: Reloading.
Feb 7 22:43:28 lago-basic-suite-master-engine systemd: Starting oVirt Engine...
Feb 7 22:43:28 lago-basic-suite-master-engine journal: 2018-02-07 22:43:28,908-0500 ovirt-engine: INFO _detectJBossVersion:187
Detecting JBoss version. Running: /usr/lib/jvm/jre/bin/java
['ovirt-engine-version', '-server', '-XX:+TieredCompilation', '-Xms1024M',
'-Xmx1024M', '-Djava.awt.headless=true',
'-Dsun.rmi.dgc.client.gcInterval=3600000',
'-Dsun.rmi.dgc.server.gcInterval=3600000',
'-Djsse.enableSNIExtension=false', '-XX:+HeapDumpOnOutOfMemoryError',
'-XX:HeapDumpPath=/var/log/ovirt-engine/dump',
'-Djava.util.logging.manager=org.jboss.logmanager',
'-Dlogging.configuration=file:///var/lib/ovirt-engine/jboss_runtime/config/ovirt-engine-logging.properties',
'-Dorg.jboss.resolver.warning=true',
'-Djboss.modules.system.pkgs=org.jboss.byteman',
'-Djboss.server.default.config=ovirt-engine',
'-Djboss.home.dir=/usr/share/ovirt-engine-wildfly',
'-Djboss.server.base.dir=/usr/share/ovirt-engine',
'-Djboss.server.data.dir=/var/lib/ovirt-engine',
'-Djboss.server.log.dir=/var/log/ovirt-engine',
'-Djboss.server.config.dir=/var/lib/ovirt-engine/jboss_runtime/config',
'-Djboss.server.temp.dir=/var/lib/ovirt-engine/jboss_runtime/tmp',
'-Djboss.controller.temp.dir=/var/lib/ovirt-engine/jboss_runtime/tmp',
'-jar', '/usr/share/ovirt-engine-wildfly/jboss-modules.jar', '-mp',
'/usr/share/ovirt-engine-wildfly-overlay/modules:/usr/share/ovirt-engine/modules/common:/usr/share/ovirt-engine-extension-aaa-jdbc/modules:/usr/share/ovirt-engine-extension-aaa-ldap/modules:/usr/share/ovirt-engine-wildfly/modules',
'-jaxpmodule', 'javax.xml.jaxp-provider', 'org.jboss.as.standalone', '-v']
Feb 7 22:43:29 lago-basic-suite-master-engine journal: 2018-02-07 22:43:29,481-0500 ovirt-engine: INFO _detectJBossVersion:207
Return code: 0, | stdout: '[u'WildFly Full 11.0.0.Final (WildFly Core 3.0.8.Final)'], | stderr: '[]'
Feb 7 22:43:29 lago-basic-suite-master-engine systemd: Started oVirt Engine.
Feb 7 22:43:29 lago-basic-suite-master-engine systemd: Reloading.
Feb 7 22:43:29 lago-basic-suite-master-engine systemd: Starting oVirt Engine Data Warehouse...
Feb 7 22:43:29 lago-basic-suite-master-engine systemd: Started oVirt Engine Data Warehouse.
Feb 7 22:43:29 lago-basic-suite-master-engine systemd: Reloading.
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: Starting oVirt ImageIO Proxy...
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: Started oVirt ImageIO Proxy.
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: Reloading.
Feb 7 22:43:30 lago-basic-suite-master-engine journal: 2018-02-07 22:43:30,301-0500 ovirt-engine-dwhd: ERROR run:554 Error: process terminated with status code 1
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: ovirt-engine-dwhd.service: main process exited, code=exited, status=1/FAILURE
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: Unit ovirt-engine-dwhd.service entered failed state.
Feb 7 22:43:30 lago-basic-suite-master-engine systemd: ovirt-engine-dwhd.service failed.
</error>
ATTN: oVirt Engine 4.2.z has been branched from master
by Tal Nisan
As the subject says, today we branched out the ovirt-engine-4.2 branch; every
change that is required for 4.2.2 and onward should be backported to this
branch.
Note that we chose this time so there will be no confusion about whether your
patches are included: if your patch got merged into master before 21:45 IST
(20:45 CET), then no backport is needed.
Please note that even though we have branched out, the main focus should
still be 4.2.2.
[vdsm][RFC] reconsidering branching out ovirt-4.2
by Francesco Romani
Hi all,
It is time again to reconsider branching out the 4.2 stable branch.
So far we have decided to *not* branch out, and we are taking the tags for
oVirt 4.2 releases from the master branch.
This means we are merging only safe and/or stabilization patches into master.
I think it is time to reconsider this decision and branch out for 4.2,
because of two reasons:
1. it sends a clearer signal that 4.2 is going into stabilization mode
2. we have requests from the virt team, which wants to start working on the
next cycle's features.
If we decide to branch out, I'd start the new branch on Monday, February
5 (1 week from now).
The discussion is open, please share your acks/nacks for branching out,
and for the branching date.
I myself am inclined to branch out, so if no one chimes in (!!) I'll
execute the above plan.
--
Francesco Romani
Senior SW Eng., Virtualization R&D
Red Hat
IRC: fromani github: @fromanirh
[ACTION-REQUIRED] Making accurate CI for oVirt 4.2
by Barak Korren
Hi,
This message is aimed at project maintainers. Other developers may
also find it interesting to get a glimpse of the oVirt-wide test and
composition processes.
TL;DR: To get accurate CI for oVirt 4.2, most projects
need to add 4.2 jobs in YAML.
Before I can explain what the current issue is and which action is
required, I need to provide a brief overview of how oVirt CI
works.
oVirt CI has two major components to it:
1. The STDCI component, which is used to build and test individual
projects. Most developers interact with this on a daily basis, as it
responds to GitHub and Gerrit events.
2. The "change-queue" (CQ) component which is used to automatically
compose the whole of oVirt from its sub projects and run system tests
(OST) on it. This component is used to gather the information that is
used by the infra team to compose the "OST failure report" you can
occasionally see being sent to this list. The change queue is also
used to automatically maintain the 'tested' and '*-snapshot' (AKA
nightly) repositories.
The way the CQ composes oVirt is by looking at the post-merge STDCI
'build-artifacts' jobs, and collecting together artifacts built by
jobs that target a specific oVirt version into that version's change
queue. Essentially the '*_master_build-artifacts-*' jobs go into the
'ovirt-master' change queue, the '*_4.1_build-artifacts-*' jobs go
into the 'ovirt-4.1' change queue etc.
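The naming convention above can be sketched as a small shell function. This is only an illustration of the convention: the function name and the example job names are made up, and the real mapping logic lives in the oVirt CI infra, not here.

```shell
# Map a post-merge build-artifacts job name to the change queue that
# collects its artifacts, per the naming convention described above.
# Hypothetical helper, not the actual CQ implementation.
queue_for_job() {
    case "$1" in
        *_master_build-artifacts-*) echo 'ovirt-master' ;;
        *_4.2_build-artifacts-*)    echo 'ovirt-4.2' ;;
        *_4.1_build-artifacts-*)    echo 'ovirt-4.1' ;;
        *)                          echo 'none' ;;   # not a build-artifacts job
    esac
}

queue_for_job 'vdsm_master_build-artifacts-el7-x86_64'   # prints ovirt-master
```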
Over the course of oVirt 4.2 development, most projects used their
'master' branch, which is typically mapped to '*_master_*' jobs, for
developing 4.2 code. So the 'ovirt-master' CQ provided a good indication
of the state of the 4.2 code.
As projects started adding 4.2 branches, we created an
'ovirt-4.2' CQ to gather them. We did so under the assumption that
most projects would branch soon after. That assumption turned out to be
wrong, as most projects have not yet forked and may not do so in the near
future.
As some projects did fork, the end result is that currently:
___there is no accurate representation of oVirt 4.2 in CI___
'ovirt-master' CQ no longer represents oVirt 4.2 as some projects
already have some 4.3 code in their 'master' branches.
'ovirt-4.2' CQ does not represent oVirt 4.2 as most projects do not
push artifacts into it.
To get any benefit from CI, we need to have it represent what we are
going to release. This means that at this point we need all projects
to have '*_4.2_build-artifacts-*' jobs that map to the code that is
intended to be included in oVirt 4.2. Projects can either:
1. Create 4.2 branches and map the new jobs to them.
2. Keep 4.2 development in 'master' and create '4.2' jobs that map to it.
Taking route #2 means a commitment to not add any 4.3 code to the
'master' branch. Please keep that commitment, as rolling back "too new"
builds from the various repos and caches we have is very difficult.
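For projects taking route #2, the shape of the change is roughly the following. This is a hypothetical sketch only: the exact schema and keys come from the YAML in the oVirt 'jenkins' repo, and 'my-project' is a placeholder.

```yaml
# Illustrative only: add a '4.2' version entry that still builds from
# 'master', producing my-project_4.2_build-artifacts-* jobs.
- project:
    name: my-project
    version:
      - master:
          branch: master
      - 4.2:
          branch: master   # route #2: 4.2 jobs map to the master branch
```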
--
Barak Korren
RHV DevOps team , RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
6 years, 9 months
Re: [ovirt-devel] fcraw Jenkins build fails on import error
by Eyal Edri
Also adding devel; it looks like more of a dev issue than an infra one.
On Mon, Feb 5, 2018 at 2:59 PM, Eyal Edri <eedri(a)redhat.com> wrote:
> Adding Sandro,
> Are we officially supporting fcraw for engine?
>
> On Mon, Feb 5, 2018 at 2:26 PM, Tal Nisan <tnisan(a)redhat.com> wrote:
>
>>
>> *http://jenkins.ovirt.org/job/ovirt-engine_master_check-patch-fcraw-x86_64/29/consoleFull <http://jenkins.ovirt.org/job/ovirt-engine_master_check-patch-fcraw-x86_64...>*
>>
>>
>>
>> 10:53:31 ==================================== ERRORS ====================================
>> 10:53:31 ERROR collecting packaging/setup/tests/ovirt_engine_setup/engine_common/test_database.py
>> 10:53:31 ImportError while importing test module '/home/jenkins/workspace/ovirt-engine_master_check-patch-fcraw-x86_64/ovirt-engine/packaging/setup/tests/ovirt_engine_setup/engine_common/test_database.py'.
>> 10:53:31 Hint: make sure your test modules/packages have valid Python names.
>> 10:53:31 Traceback:
>> 10:53:31 packaging/setup/tests/ovirt_engine_setup/engine_common/test_database.py:19: in <module>
>> 10:53:31     import ovirt_engine_setup.engine_common.database as under_test  # isort:skip # noqa: E402
>> 10:53:31 packaging/setup/ovirt_engine_setup/engine_common/database.py:30: in <module>
>> 10:53:31     from otopi import base
>> 10:53:31 E   ImportError: No module named otopi
>> 10:53:31 !!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!
>> 10:53:31 =========================== 1 error in 0.43 seconds ============================
>>
>>
>>
>>
>>
>>
>> _______________________________________________
>> Infra mailing list
>> Infra(a)ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/infra
>>
>>
>
>
> --
>
> Eyal edri
>
>
> MANAGER
>
> RHV DevOps
>
> EMEA VIRTUALIZATION R&D
>
>
> Red Hat EMEA <https://www.redhat.com/>
> <https://red.ht/sig> TRIED. TESTED. TRUSTED. <https://redhat.com/trusted>
> phone: +972-9-7692018 <+972%209-769-2018>
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
--
Eyal edri
MANAGER
RHV DevOps
EMEA VIRTUALIZATION R&D
Red Hat EMEA <https://www.redhat.com/>
<https://red.ht/sig> TRIED. TESTED. TRUSTED. <https://redhat.com/trusted>
phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)