Dafna Ron <dron(a)redhat.com> writes:
On Fri, May 10, 2019 at 12:04 PM Milan Zamazal
<mzamazal(a)redhat.com> wrote:
> Marcin Sobczyk <msobczyk(a)redhat.com> writes:
>
> > Hi,
> >
> > the mentioned patch only touches our yum plugin - I don't think it's
> > related to the failure.
> >
> > VDSM fails on 'VM.create' call - Milan, could you please take a quick
> > look at it?
>
> I merged a patch to master yesterday that disables legacy pre-XML Engine
> (< 4.2) support. Engines and clusters < 4.2 are no longer supported in
> master.
>
> I can see the VM is run with a legacy configuration rather than with the
> `xml' parameter. I wonder why -- is the cluster level perhaps < 4.2?
> If so, the cluster level must be updated before the VM is run.
>
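To illustrate what I mean by a legacy configuration: with a cluster level
< 4.2 the Engine sends VM.create a flat parameter dict instead of the
libvirt domain XML under the `xml' key, and that code path has just been
removed. A minimal sketch of the effect (illustrative only, not the actual
vdsm code; `start_vm_from_xml' is a made-up placeholder):

def start_vm_from_xml(domain_xml):
    # Stand-in for the real VM startup path; only here so the sketch is complete.
    print("would define the VM from %d bytes of domain XML" % len(domain_xml))

def create(vm_params):
    # Cluster level >= 4.2: the Engine passes the full domain XML in vm_params['xml'].
    # Older cluster levels send only the legacy flat dict, which is no longer handled,
    # so the call surfaces as "Unexpected exception" (code 16), as in the logs below.
    if 'xml' not in vm_params:
        raise RuntimeError("legacy (pre-4.2) VM configuration is not supported")
    return start_vm_from_xml(vm_params['xml'])
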
> This is on the master branch, but it is happening in the upgrade suite.
> However, the suite is upgrading from 4.3 to 4.4:
> pre-reposync-config.repo -> ../common/yum-repos/ovirt-4.3-cq.repo
> reposync-config.repo -> ../common/yum-repos/ovirt-master-cq.repo
I can see that DC_VER_MAJ, DC_VER_MIN are indeed set to [4, 1] in
upgrade-from-release-suite-master/test-scenarios-before-upgrade/002_bootstrap.py.
I don't think an upgrade to 4.4 is possible with 4.1 clusters. I believe
the cluster version should be changed to [4, 2] (or perhaps [4, 3]?),
but someone should confirm that is the right thing to do.
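Roughly, the change I have in mind is just bumping those constants; this is
a sketch only, reusing the names from the suite (the real diff is in the
patch below):

# test-scenarios-before-upgrade/002_bootstrap.py -- sketch, not the exact file contents
# The DC/cluster created before the upgrade currently uses compatibility 4.1:
#   DC_VER_MAJ = 4
#   DC_VER_MIN = 1
# Proposed: start from a level the post-upgrade master Engine still supports:
DC_VER_MAJ = 4
DC_VER_MIN = 2   # or perhaps 3, pending confirmation
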
Patch created, let's discuss and test it:
https://gerrit.ovirt.org/99899
>> > Regards, Marcin
>> >
>> > On 5/10/19 11:36 AM, Dafna Ron wrote:
>> >> Hi,
>> >>
>> >> We are failing upgrade-from-release-suite.el7.x86_64 /
>> >> 004_basic_sanity.vm_run
>> >>
>> >> The issue is an unexpected exception in vdsm.
>> >>
>> >> The root cause, based on CQ, is this patch:
>> >>
>> >> https://gerrit.ovirt.org/#/c/99854/ - yum: Allow downloading only 'vdsm' package
>> >>
>> >> Logs can be found here:
>> >>
>> >> http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/14079/artif...
>> >>
>> >> Marcin, can you please take a look?
>> >>
>> >> error:
>> >>
>> >> vdsm:
>> >>
>> >> 2019-05-10 05:06:38,329-0400 ERROR (jsonrpc/0) [api] FINISH create error=Unexpected exception (api:131)
>> >> Traceback (most recent call last):
>> >>   File "/usr/lib/python2.7/site-packages/vdsm/common/api.py", line 124, in method
>> >>     ret = func(*args, **kwargs)
>> >>   File "/usr/lib/python2.7/site-packages/vdsm/API.py", line 245, in create
>> >>     raise exception.UnexpectedError()
>> >> UnexpectedError: Unexpected exception
>> >> 2019-05-10 05:06:38,330-0400 INFO (jsonrpc/0) [api.virt] FINISH create return={'status': {'message': 'Unexpected exception', 'code': 16}} from=::ffff:192.168.201.2,39486, flow_id=83e4f0c4-a39a-45aa-891f-4022765e1a87, vmId=397013a0-b7d4-4d38-86c3-e944cebd75a7 (api:54)
>> >> 2019-05-10 05:06:38,331-0400 INFO (jsonrpc/0) [jsonrpc.JsonRpcServer] RPC call VM.create failed (error 16) in 0.00 seconds (__init__:312)
>> >> 2019-05-10 05:06:38,629-0400 INFO (jsonrpc/7) [vdsm.api] FINISH getStorageDomainInfo return={'info': {'uuid': '24498263-8985-46a9-9161-f65ee776cb7f', 'vgMetadataDevice': '360014051f333158820a4cc6ab3be5b55', 'vguuid': 'rXTLNc-JwEz-0dtL-Z2EC-ASqn-gzkg-jDJa1b', 'metadataDevice': '360014051f333158820a4cc6ab3be5b55', 'state': 'OK', 'version': '4', 'role': 'Master', 'type': 'ISCSI', 'class': 'Data', 'pool': ['3e330122-c587-4078-b9a4-13dbb697c5cc'], 'name': 'iscsi'}} from=::ffff:192.168.201.2,39486, flow_id=3f76d32a, task_id=084e7ee5-3c43-4905-833f-9a522043a862 (api:54)
>> >>
>> >> engine:
>> >>
>> >> 2019-05-10 05:06:38,328-04 ERROR [org.ovirt.engine.core.vdsbroker.CreateVDSCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] VDS::create Failed creating vm 'vm0' in vds = '566eda0f-3ef0-4791-a618-1e649af4b0be' error = 'org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to CreateBrokerVDS, error = Unexpected exception, code = 16'
>> >> 2019-05-10 05:06:38,328-04 INFO  [org.ovirt.engine.core.vdsbroker.CreateVDSCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] FINISH, CreateVDSCommand, return: Down, log id: 3cb7c62d
>> >> 2019-05-10 05:06:38,328-04 WARN  [org.ovirt.engine.core.bll.RunVmCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Failed to run VM 'vm0': EngineException: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to CreateBrokerVDS, error = Unexpected exception, code = 16 (Failed with error unexpected and code 16)
>> >> 2019-05-10 05:06:38,328-04 INFO  [org.ovirt.engine.core.bll.RunVmOnceCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Lock freed to object 'EngineLock:{exclusiveLocks='[397013a0-b7d4-4d38-86c3-e944cebd75a7=VM]', sharedLocks=''}'
>> >> 2019-05-10 05:06:38,329-04 INFO  [org.ovirt.engine.core.bll.RunVmCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Trying to rerun VM 'vm0'
>> >> 2019-05-10 05:06:38,373-04 WARN  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] EVENT_ID: USER_INITIATED_RUN_VM_FAILED(151), Failed to run VM vm0 on Host lago-upgrade-from-release-suite-master-host-0.
>> >> 2019-05-10 05:06:38,381-04 INFO  [org.ovirt.engine.core.bll.RunVmOnceCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Lock Acquired to object 'EngineLock:{exclusiveLocks='[397013a0-b7d4-4d38-86c3-e944cebd75a7=VM]', sharedLocks=''}'
>> >> 2019-05-10 05:06:38,393-04 INFO  [org.ovirt.engine.core.vdsbroker.IsVmDuringInitiatingVDSCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] START, IsVmDuringInitiatingVDSCommand( IsVmDuringInitiatingVDSCommandParameters:{vmId='397013a0-b7d4-4d38-86c3-e944cebd75a7'}), log id: 96be328
>> >> 2019-05-10 05:06:38,393-04 INFO  [org.ovirt.engine.core.vdsbroker.IsVmDuringInitiatingVDSCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] FINISH, IsVmDuringInitiatingVDSCommand, return: false, log id: 96be328
>> >> 2019-05-10 05:06:38,403-04 WARN  [org.ovirt.engine.core.bll.RunVmOnceCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Validation of action 'RunVmOnce' failed for user admin@internal-authz. Reasons: VAR__ACTION__RUN,VAR__TYPE__VM,VAR__ACTION__RUN,VAR__TYPE__VM,SCHEDULING_NO_HOSTS
>> >> 2019-05-10 05:06:38,404-04 INFO  [org.ovirt.engine.core.bll.RunVmOnceCommand] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] Lock freed to object 'EngineLock:{exclusiveLocks='[397013a0-b7d4-4d38-86c3-e944cebd75a7=VM]', sharedLocks=''}'
>> >> 2019-05-10 05:06:38,413-04 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-1) [83e4f0c4-a39a-45aa-891f-4022765e1a87] EVENT_ID: USER_FAILED_RUN_VM(54), Failed to run VM vm0 (User: admin@internal-authz).
>> >> 2019-05-10 05:06:38,462-04 ERROR [org.ovirt.engine.api.restapi.resource.AbstractBackendResource] (default task-1) [] Operation Failed: [There are no hosts to use. Check that the cluster contains at least one host in Up state.]
>> >> 2019-05-10 05:06:38,466-04 INFO  [org.ovirt.engine.core.bll.ProcessDownVmCommand] (EE-ManagedThreadFactory-engine-Thread-35) [3f40b3e8] Running command: ProcessDownVmCommand internal: true.
>> >>
>> >>
>> >> Thanks,
>> >> Dafna
>> >>
>>