resources.ovirt.org migration
by Denis Volkov
Hello
Server `resources.ovirt.org` hosting repositories for ovirt is going to be
migrated to new hardware in a different datacenter within the next couple
of hours.
No service interruption is expected during the migration. I will follow up
when the process is complete.
--
Denis Volkov
3 years
CentOS 8 Stream: hosted-engine deploy fails with "CPU type is not supported"
by matyi.szabolcs@internetx.com
Hi all,
On CentOS 8 Stream I get the following error during "hosted-engine --deploy":
[ INFO ] The host has been set in non_operational status, deployment errors: code 156: Host HOST moved to Non-Operational state as host CPU type is not supported in this cluster compatibility version or is not supported at all, code 519
CPU:
[root@ovirt-test ~]# lscpu
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Byte Order: Little Endian
CPU(s): 24
On-line CPU(s) list: 0-23
Thread(s) per core: 2
Core(s) per socket: 6
Socket(s): 2
NUMA node(s): 2
Vendor ID: GenuineIntel
BIOS Vendor ID: Intel
CPU family: 6
Model: 63
Model name: Intel(R) Xeon(R) CPU E5-2620 v3 @ 2.40GHz
I adjusted the cluster compatibility version, but it did not solve the problem.
On a normal CentOS 8.4 system, the error is not present.
Does anyone have an idea?
Or has support for older CPU models been removed in the Stream version?
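For reference, the CPU models that libvirt and VDSM actually report on the node
can be checked with a couple of read-only commands (a minimal sketch; the exact
output differs between versions):
[root@ovirt-test ~]# virsh -r domcapabilities | grep -i model
[root@ovirt-test ~]# vdsm-client Host getCapabilities | grep -i cpu
If the Haswell model family (which this E5-2620 v3 belongs to) shows up there as
missing or unusable, the problem is on the host side rather than in the cluster
settings.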
Thanks
3 years
When we commit a snapshot, any subsequent snapshots are erased.
by Bailey Adolph
Dear all,
I have a question.
When we commit a snapshot, any subsequent snapshots are erased.
What is the reason for this design?
If I don't want the subsequent snapshots to be erased, is there a way to configure that?
My requirement is that after an earlier snapshot is committed, the subsequent snapshots can still be used.
Can anyone help me?
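For illustration, a disk's snapshots form a chain of qcow2 overlays, and oVirt
keeps a single linear chain per disk. Committing back to an earlier snapshot
starts a new active layer on top of it, so the later overlays end up on a branch
the VM no longer follows, which is why they are discarded. A minimal qemu-img
sketch with made-up file names:
# chain before commit:  base.qcow2 <- snap1.qcow2 <- snap2.qcow2 <- active.qcow2
# reverting/committing to snap1 means writing into a fresh layer on top of it:
qemu-img create -f qcow2 -b snap1.qcow2 -F qcow2 new_active.qcow2
# snap2.qcow2 and active.qcow2 now describe changes relative to a state the VM
# has abandoned, so keeping them would require a branching chain.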
3 years
Viewing and, hopefully, modifying the VM's qemu command line
by Gilboa Davara
Hello all,
I'm setting up a fairly (?) complex oVirt over Gluster setup built around 3
Xeon servers-turned-into-workstations, each doubling as oVirt node + one
primary Fedora VM w/ a dedicated passthrough GPU (+audio and a couple of
USB root devices).
One of the servers has some weird issue w/ the passthrough nVidia GPU that
seems to require me to edit the VM's iommu [1] and passthrough device [2]
command line.
I tried using the qemu-cmdline addon to add the missing parameters, but it
seems that qemu treats the added parameters as an additional device / iommu
instead of editing the existing parameters.
So:
1. How can I view the VM qemu command line?
2. Can I somehow manually edit the qemu command line, either directly or by
adding parameters in the HE XML file?
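For (1), a minimal way to see what was generated, run directly on the host that
is running the VM (read-only virsh normally works without extra credentials;
<vm-name> is a placeholder):
virsh -r list --all
virsh -r dumpxml <vm-name>
less /var/log/libvirt/qemu/<vm-name>.log
The per-VM libvirt log contains the full qemu-kvm command line from the last
start. For (2), a vdsm before_vm_start hook on the host can rewrite the
generated libvirt XML before the VM starts, though the exact edit needed here
is not shown.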
- Gilboa.
[1] iommu: VM XXX is down with error. Exit message: internal error: qemu
unexpectedly closed the monitor: 2021-11-05T14:59:44.499366Z qemu-kvm: We
need to set caching-mode=on for intel-iommu to enable device assignment
with IOMMU protection.
[2] GPU: May need to add x-vga=off,
3 years
Re: after restore check for upgrade fails
by Martin Necas
Hmm, that doesn't look like any major changes apart from the move to Stream. Do
you have any errors in httpd?
/var/log/httpd/error_log
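A quick set of checks on the engine host might be (service unit and log names
assumed for a stock 4.4 setup):
systemctl status ansible-runner-service
journalctl -u ansible-runner-service --since today
tail -n 50 /var/log/httpd/error_log
An HTML error page returned by httpd instead of JSON would explain the
"Unexpected character ('<' (code 60))" message below.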
Martin Necas
On Tue, Nov 9, 2021 at 12:12 PM Staniforth, Paul <
P.Staniforth(a)leedsbeckett.ac.uk> wrote:
> ansible-runner-service-1.0.7-1.el8.noarch
> python3-ansible-runner-1.4.6-1.el8.noarch
>
> The original system was on 4.4.8.5
>
> I'm not sure what minor version the system I restored to was on.
>
> Yes, I ran engine-setup after the restore.
>
> I then did an upgrade to 4.4.9.4
>
> I also upgraded the new system to CentOS 8 Stream.
>
>
> Regards,
>
> Paul S.
>
> ------------------------------
> *From:* Martin Necas <mnecas(a)redhat.com>
> *Sent:* 09 November 2021 10:31
> *To:* Staniforth, Paul <P.Staniforth(a)leedsbeckett.ac.uk>
> *Cc:* users <users(a)ovirt.org>
> *Subject:* Re: [ovirt-users] after restore check for upgrade fails
>
>
> Hi,
>
> what is your version of runner and runner service?
> rpm -qa "*ansible-runner*"
> It is a bit strange that the
> /var/log/ovirt-engine/ansible-runner-service.log does not exist.
>
> From which version did you upgrade the engine?
> Have you run the engine-setup after the restore?
>
> Martin Necas
>
> On Mon, Nov 8, 2021 at 4:13 PM <p.staniforth(a)leedsbeckett.ac.uk> wrote:
>
> Hello
>
> After doing a backup and restoring to a new oVirt management server,
> the check for upgrade of hosts fails.
>
> Reinstall and enroll certificate also fail.
>
> The error message is
>
> Failed to check for available updates on host node1.example.com
> with message 'Failed to run check-update of host 'node1.example.com'.
> Error: Failed to read the runner-service response. Unexpected character
> ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object
> or token 'null', 'true' or 'false')
> at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column:
> 2]'.
>
> Also removing and adding a host back fails
>
> The error message is
> Host node3.example.com
> installation failed. Failed to execute Ansible host-deploy role: Failed to
> read the runner-service response. Unexpected character ('<' (code 60)):
> expected a valid value (JSON String, Number, Array, Object or token 'null',
> 'true' or 'false')
> at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column:
> 2]. Please check logs for more details:
> /var/log/ovirt-engine/ansible-runner-service.log.
>
> Unfortunately the file /var/log/ovirt-engine/ansible-runner-service.log
> doesn't exist.
>
> Regards,
> Paul S.
> _______________________________________________
> Users mailing list -- users(a)ovirt.org
> To unsubscribe send an email to users-leave(a)ovirt.org
> Privacy Statement: https://www.ovirt.org/privacy-policy.html
> oVirt Code of Conduct:
> https://www.ovirt.org/community/about/community-guidelines/
> List Archives:
> https://lists.ovirt.org/archives/list/users@ovirt.org/message/7FVU4GYB4HX...
>
> To view the terms under which this email is distributed, please go to:-
> https://leedsbeckett.ac.uk/disclaimer/email
>
>
3 years
Missing snapshot in the engine
by francesco@shellrent.com
Hi,
I have an issue with a VM (Windows Server 2016) running on CentOS 8, oVirt host 4.4.8, oVirt engine 4.4.5. I used to take regular snapshots of this VM (deleting the previous one each time), but starting from 25/10 the task fails with the errors attached at the bottom. The volume ID mentioned in the error...:
[...] vdsm.storage.exception.prepareIllegalVolumeError: Cannot prepare illegal volume: ('5cb3fe58-3e01-4d32-bc7c-5907a4f858a8',) [...]
... refers to a snapshot volume, because the current volume is a different (and larger) one, shown in the engine UI with ID 5aad30c7-96f0-433d-95c8-2317e5f80045:
[root@ovirt-host44 4d79c1da-34f0-44e3-8b92-c4bcb8524d83]# ls -lh
total 163G
-rw-rw---- 1 vdsm kvm 154G Nov 8 10:32 5aad30c7-96f0-433d-95c8-2317e5f80045
-rw-rw---- 1 vdsm kvm 1.0M Aug 31 11:49 5aad30c7-96f0-433d-95c8-2317e5f80045.lease
-rw-r--r-- 1 vdsm kvm 360 Nov 8 10:19 5aad30c7-96f0-433d-95c8-2317e5f80045.meta
-rw-rw---- 1 vdsm kvm 8.2G Oct 25 05:16 5cb3fe58-3e01-4d32-bc7c-5907a4f858a8
-rw-rw---- 1 vdsm kvm 1.0M Oct 23 05:15 5cb3fe58-3e01-4d32-bc7c-5907a4f858a8.lease
-rw-r--r-- 1 vdsm kvm 254 Oct 25 05:16 5cb3fe58-3e01-4d32-bc7c-5907a4f858a8.meta
It seems that the last working snapshot, performed on 25/10, was not completely deleted and is now used as the base for a new snapshot on the host side, but it is not listed in the engine.
Any idea? Should I manually merge the snapshot on the host side? If yes, any indications on how to do that?
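One way to inspect the chain and the volume metadata on the host, reusing the
IDs that appear in the logs below (vdsm-client parameter names assumed for 4.4):
cd /rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83
qemu-img info --backing-chain 5aad30c7-96f0-433d-95c8-2317e5f80045
cat 5cb3fe58-3e01-4d32-bc7c-5907a4f858a8.meta
vdsm-client Volume getInfo storagepoolID=609ff8db-09c5-435b-b2e5-023d57003138 storagedomainID=e25db7d0-060a-4046-94b5-235f38097cd8 imageID=4d79c1da-34f0-44e3-8b92-c4bcb8524d83 volumeID=5cb3fe58-3e01-4d32-bc7c-5907a4f858a8
The .meta file and the getInfo output both show the LEGALITY flag that the
"Cannot prepare illegal volume" error refers to.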
Thank you for your time,
Francesco
--- Engine log during snapshot removal:
2021-11-08 10:19:25,751+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateSnapshotForVmCommand] (default task-63) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Lock Acquired to object 'EngineLock:{exclusiveLocks='[f1d56493-b5e0-480f-87a3-5e7f373712fa=VM]', sharedLocks=''}'
2021-11-08 10:19:26,306+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateSnapshotForVmCommand] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Running command: CreateSnapshotForVmCommand internal: false. Entities affected : ID: f1d56493-b5e0-480f-87a3-5e7f373712fa Type: VMAction group MANIPULATE_VM_SNAPSHOTS with role type USER
2021-11-08 10:19:26,383+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateSnapshotDiskCommand] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Running command: CreateSnapshotDiskCommand internal: true. Entities affected : ID: f1d56493-b5e0-480f-87a3-5e7f373712fa Type: VMAction group MANIPULATE_VM_SNAPSHOTS with role type USER
2021-11-08 10:19:26,503+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateSnapshotCommand] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Running command: CreateSnapshotCommand internal: true. Entities affected : ID: 00000000-0000-0000-0000-000000000000 Type: Storage
2021-11-08 10:19:26,616+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.CreateVolumeVDSCommand] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, CreateVolumeVDSCommand( CreateVolumeVDSCommandParameters:{storagePoolId='609ff8db-09c5-435b-b2e5-023d57003138', ignoreFailoverLimit='false', storageDomainId='e25db7d0-060a-4046-94b5-235f38097cd8', imageGroupId='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', imageSizeInBytes='214748364800', volumeFormat='COW', newImageId='74e7188d-3727-4ed6-a2e5-dfa73b9e7da3', imageType='Sparse', newImageDescription='', imageInitialSizeInBytes='0', imageId='5aad30c7-96f0-433d-95c8-2317e5f80045', sourceImageGroupId='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', shouldAddBitmaps='false'}), log id: 514e7f02
2021-11-08 10:19:26,768+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.CreateVolumeVDSCommand] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, CreateVolumeVDSCommand, return: 74e7188d-3727-4ed6-a2e5-dfa73b9e7da3, log id: 514e7f02
2021-11-08 10:19:26,805+01 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] CommandAsyncTask::Adding CommandMultiAsyncTasks object for command 'eb1f1fdd-a46e-45e1-a6f0-3a97fe1f6e28'
2021-11-08 10:19:26,805+01 INFO [org.ovirt.engine.core.bll.CommandMultiAsyncTasks] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] CommandMultiAsyncTasks::attachTask: Attaching task '4bb54004-f96c-4f14-abca-bea477d866ea' to command 'eb1f1fdd-a46e-45e1-a6f0-3a97fe1f6e28'.
2021-11-08 10:19:27,033+01 INFO [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Adding task '4bb54004-f96c-4f14-abca-bea477d866ea' (Parent Command 'CreateSnapshot', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters'), polling hasn't started yet..
2021-11-08 10:19:27,282+01 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] BaseAsyncTask::startPollingTask: Starting to poll task '4bb54004-f96c-4f14-abca-bea477d866ea'.
2021-11-08 10:19:27,533+01 INFO [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engine-Thread-49) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] EVENT_ID: USER_CREATE_SNAPSHOT(45), Snapshot 'test' creation for VM 'VM.NAME' was initiated by admin@internal-authz.
2021-11-08 10:19:29,099+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateSnapshotCommand] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command [id=eb1f1fdd-a46e-45e1-a6f0-3a97fe1f6e28]: Updating status to 'SUCCEEDED', The command end method logic will be executed by one of its parent commands.
2021-11-08 10:19:29,114+01 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] CommandAsyncTask::HandleEndActionResult [within thread]: endAction for action type 'CreateSnapshot' completed, handling the result.
2021-11-08 10:19:29,114+01 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] CommandAsyncTask::HandleEndActionResult [within thread]: endAction for action type 'CreateSnapshot' succeeded, clearing tasks.
2021-11-08 10:19:29,114+01 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] SPMAsyncTask::ClearAsyncTask: Attempting to clear task '4bb54004-f96c-4f14-abca-bea477d866ea'
2021-11-08 10:19:29,115+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, SPMClearTaskVDSCommand( SPMTaskGuidBaseVDSCommandParameters:{storagePoolId='609ff8db-09c5-435b-b2e5-023d57003138', ignoreFailoverLimit='false', taskId='4bb54004-f96c-4f14-abca-bea477d866ea'}), log id: 52494e42
2021-11-08 10:19:29,115+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, HSMClearTaskVDSCommand(HostName = OVIRT-HOST-44, HSMTaskGuidBaseVDSCommandParameters:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', taskId='4bb54004-f96c-4f14-abca-bea477d866ea'}), log id: 180e0ad2
2021-11-08 10:19:29,143+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, HSMClearTaskVDSCommand, return: , log id: 180e0ad2
2021-11-08 10:19:29,143+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, SPMClearTaskVDSCommand, return: , log id: 52494e42
2021-11-08 10:19:29,188+01 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] BaseAsyncTask::removeTaskFromDB: Removed task '4bb54004-f96c-4f14-abca-bea477d866ea' from DataBase
2021-11-08 10:19:29,188+01 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-27) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] CommandAsyncTask::HandleEndActionResult [within thread]: Removing CommandMultiAsyncTasks object for entity 'eb1f1fdd-a46e-45e1-a6f0-3a97fe1f6e28'
2021-11-08 10:19:29,190+01 INFO [org.ovirt.engine.core.bll.SerialChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-62) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command 'CreateSnapshotForVm' (id: '3210f39a-a664-4211-a22f-4173aa5bce78') waiting on child command id: 'f5413897-9ab7-4651-9b04-3dd82dd77064' type:'CreateSnapshotDisk' to complete
2021-11-08 10:19:29,191+01 INFO [org.ovirt.engine.core.bll.ConcurrentChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-62) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command 'CreateSnapshotDisk' id: 'f5413897-9ab7-4651-9b04-3dd82dd77064' child commands '[eb1f1fdd-a46e-45e1-a6f0-3a97fe1f6e28]' executions were completed, status 'SUCCEEDED'
2021-11-08 10:19:29,192+01 INFO [org.ovirt.engine.core.bll.ConcurrentChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-62) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command 'CreateSnapshotDisk' id: 'f5413897-9ab7-4651-9b04-3dd82dd77064' Updating status to 'SUCCEEDED', The command end method logic will be executed by one of its parent commands.
2021-11-08 10:19:31,605+01 INFO [org.ovirt.engine.core.bll.snapshots.CreateLiveSnapshotForVmCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Running command: CreateLiveSnapshotForVmCommand internal: true. Entities affected : ID: f1d56493-b5e0-480f-87a3-5e7f373712fa Type: VMAction group MANIPULATE_VM_SNAPSHOTS with role type USER
2021-11-08 10:19:31,634+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, SnapshotVDSCommand(HostName = OVIRT-HOST-44, SnapshotVDSCommandParameters:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', vmId='f1d56493-b5e0-480f-87a3-5e7f373712fa'}), log id: 341652b5
2021-11-08 10:19:31,650+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, SnapshotVDSCommand, return: 40886d7f-adad-414e-9488-ab23e36d3b0c, log id: 341652b5
2021-11-08 10:19:31,753+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.GetHostJobsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, GetHostJobsVDSCommand(HostName = OVIRT-HOST-44, GetHostJobsVDSCommandParameters:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', type='virt', jobIds='[40886d7f-adad-414e-9488-ab23e36d3b0c]'}), log id: 2cdb7b5e
2021-11-08 10:19:31,768+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.GetHostJobsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, GetHostJobsVDSCommand, return: {40886d7f-adad-414e-9488-ab23e36d3b0c=HostJobInfo:{id='40886d7f-adad-414e-9488-ab23e36d3b0c', type='virt', description='snapshot_vm', status='running', progress='null', error='null'}}, log id: 2cdb7b5e
2021-11-08 10:19:31,768+01 INFO [org.ovirt.engine.core.bll.VirtJobCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-80) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command CreateLiveSnapshotForVm id: '34da993b-2d86-40c7-933a-f8e67be9a7a2': waiting for job '40886d7f-adad-414e-9488-ab23e36d3b0c' on host 'OVIRT-HOST-44' (id: 'c0e7a0c5-8048-4f30-af08-cbd17d797e3b') to complete
2021-11-08 10:19:34,164+01 INFO [org.ovirt.engine.core.bll.SerialChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command 'CreateSnapshotForVm' (id: '3210f39a-a664-4211-a22f-4173aa5bce78') waiting on child command id: '34da993b-2d86-40c7-933a-f8e67be9a7a2' type:'CreateLiveSnapshotForVm' to complete
2021-11-08 10:19:35,447+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.GetHostJobsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, GetHostJobsVDSCommand(HostName = OVIRT-HOST-44, GetHostJobsVDSCommandParameters:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', type='virt', jobIds='[40886d7f-adad-414e-9488-ab23e36d3b0c]'}), log id: 37efbc35
2021-11-08 10:19:35,463+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.GetHostJobsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, GetHostJobsVDSCommand, return: {40886d7f-adad-414e-9488-ab23e36d3b0c=HostJobInfo:{id='40886d7f-adad-414e-9488-ab23e36d3b0c', type='virt', description='snapshot_vm', status='failed', progress='null', error='VDSError:{code='SNAPSHOT_FAILED', message='Snapshot failed'}'}}, log id: 37efbc35
2021-11-08 10:19:35,464+01 INFO [org.ovirt.engine.core.bll.VirtJobCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command CreateLiveSnapshotForVm id: '34da993b-2d86-40c7-933a-f8e67be9a7a2': job '40886d7f-adad-414e-9488-ab23e36d3b0c' execution was completed with VDSM job status 'failed'
2021-11-08 10:19:35,476+01 INFO [org.ovirt.engine.core.bll.VirtJobCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-89) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command CreateLiveSnapshotForVm id: '34da993b-2d86-40c7-933a-f8e67be9a7a2': execution was completed, the command status is 'FAILED'
2021-11-08 10:19:36,496+01 ERROR [org.ovirt.engine.core.bll.snapshots.CreateLiveSnapshotForVmCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-65) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Ending command 'org.ovirt.engine.core.bll.snapshots.CreateLiveSnapshotForVmCommand' with failure.
2021-11-08 10:19:37,695+01 INFO [org.ovirt.engine.core.bll.SerialChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-18) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Command 'CreateSnapshotForVm' id: '3210f39a-a664-4211-a22f-4173aa5bce78' child commands '[f5413897-9ab7-4651-9b04-3dd82dd77064, 34da993b-2d86-40c7-933a-f8e67be9a7a2]' executions were completed, status 'FAILED'
2021-11-08 10:19:39,204+01 ERROR [org.ovirt.engine.core.bll.snapshots.CreateSnapshotForVmCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Ending command 'org.ovirt.engine.core.bll.snapshots.CreateSnapshotForVmCommand' with failure.
2021-11-08 10:19:39,211+01 ERROR [org.ovirt.engine.core.bll.snapshots.CreateSnapshotDiskCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Ending command 'org.ovirt.engine.core.bll.snapshots.CreateSnapshotDiskCommand' with failure.
2021-11-08 10:19:39,224+01 ERROR [org.ovirt.engine.core.bll.snapshots.CreateSnapshotCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Ending command 'org.ovirt.engine.core.bll.snapshots.CreateSnapshotCommand' with failure.
2021-11-08 10:19:39,246+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMRevertTaskVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, SPMRevertTaskVDSCommand( SPMTaskGuidBaseVDSCommandParameters:{storagePoolId='609ff8db-09c5-435b-b2e5-023d57003138', ignoreFailoverLimit='false', taskId='4bb54004-f96c-4f14-abca-bea477d866ea'}), log id: 3ab88656
2021-11-08 10:19:39,247+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMRevertTaskVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, HSMRevertTaskVDSCommand(HostName = OVIRT-HOST-44, HSMTaskGuidBaseVDSCommandParameters:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', taskId='4bb54004-f96c-4f14-abca-bea477d866ea'}), log id: 185dcbfc
2021-11-08 10:19:39,270+01 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMRevertTaskVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Trying to revert unknown task '4bb54004-f96c-4f14-abca-bea477d866ea'
2021-11-08 10:19:39,270+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMRevertTaskVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, HSMRevertTaskVDSCommand, return: , log id: 185dcbfc
2021-11-08 10:19:39,270+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMRevertTaskVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, SPMRevertTaskVDSCommand, return: , log id: 3ab88656
2021-11-08 10:19:39,387+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.DumpXmlsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, DumpXmlsVDSCommand(HostName = OVIRT-HOST-44, Params:{hostId='c0e7a0c5-8048-4f30-af08-cbd17d797e3b', vmIds='[f1d56493-b5e0-480f-87a3-5e7f373712fa]'}), log id: e48849e
2021-11-08 10:19:39,420+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.DumpXmlsVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, DumpXmlsVDSCommand, return: {f1d56493-b5e0-480f-87a3-5e7f373712fa=<domain type='kvm' id='7' xmlns:qemu='http://libvirt.org/schemas/domain/qemu/1.0'>
2021-11-08 10:19:39,537+01 INFO [org.ovirt.engine.core.bll.storage.disk.image.DestroyImageCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Running command: DestroyImageCommand internal: true. Entities affected : ID: e25db7d0-060a-4046-94b5-235f38097cd8 Type: Storage
2021-11-08 10:19:39,570+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.DestroyImageVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] START, DestroyImageVDSCommand( DestroyImageVDSCommandParameters:{storagePoolId='609ff8db-09c5-435b-b2e5-023d57003138', ignoreFailoverLimit='false', storageDomainId='e25db7d0-060a-4046-94b5-235f38097cd8', imageGroupId='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', imageId='00000000-0000-0000-0000-000000000000', imageList='[74e7188d-3727-4ed6-a2e5-dfa73b9e7da3]', postZero='false', force='false'}), log id: 396d738
2021-11-08 10:19:39,649+01 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.DestroyImageVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] FINISH, DestroyImageVDSCommand, return: , log id: 396d738
2021-11-08 10:19:39,954+01 INFO [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Adding task '18850219-9586-4648-b3fb-be7edd4b6b28' (Parent Command 'Unknown', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters'), polling hasn't started yet..
2021-11-08 10:19:39,954+01 INFO [org.ovirt.engine.core.bll.storage.disk.image.DestroyImageCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] Successfully started task to remove orphaned volumes
2021-11-08 10:19:40,089+01 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] BaseAsyncTask::startPollingTask: Starting to poll task '18850219-9586-4648-b3fb-be7edd4b6b28'.
2021-11-08 10:19:40,089+01 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-16) [469dbfd8-2e2f-4cb3-84b1-d456acc78fd9] BaseAsyncTask::startPollingTask: Starting to poll task '18850219-9586-4648-b3fb-be7edd4b6b28'.
--- Host log during snapshot removal:
2021-11-08 10:19:26,757+0100 INFO (tasks/5) [storage.ThreadPool.WorkerThread] START task 4bb54004-f96c-4f14-abca-bea477d866ea (cmd=<bound method Task.commit of <vdsm.storage.task.Task object at 0x7f78e6d93550>>, args=None) (threadPool:146)
2021-11-08 10:19:26,794+0100 INFO (tasks/5) [storage.Volume] Creating volume 74e7188d-3727-4ed6-a2e5-dfa73b9e7da3 (volume:1232)
2021-11-08 10:19:26,870+0100 INFO (tasks/5) [storage.Volume] Request to create snapshot 4d79c1da-34f0-44e3-8b92-c4bcb8524d83/74e7188d-3727-4ed6-a2e5-dfa73b9e7da3 of volume 4d79c1da-34f0-44e3-8b92-c4bcb8524d83/5aad30c7-96f0-433d-95c8-2317e5f80045 with capacity 214748364800 (fileVolume:528)
2021-11-08 10:19:26,904+0100 INFO (tasks/5) [storage.Volume] Changing volume '/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/74e7188d-3727-4ed6-a2e5-dfa73b9e7da3' permission to 0660 (fileVolume:587)
2021-11-08 10:19:26,973+0100 INFO (tasks/5) [storage.ThreadPool.WorkerThread] FINISH task 4bb54004-f96c-4f14-abca-bea477d866ea (threadPool:148)
2021-11-08 10:19:27,447+0100 INFO (jsonrpc/6) [vdsm.api] START getSpmStatus(spUUID='609ff8db-09c5-435b-b2e5-023d57003138') from=::ffff:HOST.IP.ADDRESS,36340, task_id=c86c0220-9103-4267-848d-ef1cb1ee69b0 (api:48)
2021-11-08 10:19:27,459+0100 INFO (jsonrpc/6) [vdsm.api] FINISH getSpmStatus return={'spm_st': {'spmStatus': 'SPM', 'spmLver': 4, 'spmId': 1}} from=::ffff:HOST.IP.ADDRESS,36340, task_id=c86c0220-9103-4267-848d-ef1cb1ee69b0 (api:54)
2021-11-08 10:19:27,495+0100 INFO (jsonrpc/1) [vdsm.api] START getStoragePoolInfo(spUUID='609ff8db-09c5-435b-b2e5-023d57003138') from=::ffff:HOST.IP.ADDRESS,36356, task_id=1c8dc779-dc03-443f-8e87-610c3bb1775a (api:48)
2021-11-08 10:19:27,498+0100 INFO (jsonrpc/1) [vdsm.api] FINISH getStoragePoolInfo return={'info': {'domains': 'dd1ac97a-20d9-4232-88cc-fbf53410ed5a:Active,e25db7d0-060a-4046-94b5-235f38097cd8:Active', 'isoprefix': '', 'lver': 4, 'master_uuid': 'e25db7d0-060a-4046-94b5-235f38097cd8', 'master_ver': 1, 'name': 'No Description', 'pool_status': 'connected', 'spm_id': 1, 'type': 'NFS', 'version': '5'}, 'dominfo': {'dd1ac97a-20d9-4232-88cc-fbf53410ed5a': {'status': 'Active', 'alerts': [], 'isoprefix': '', 'version': 5, 'disktotal': '1999421571072', 'diskfree': '1182995054592'}, 'e25db7d0-060a-4046-94b5-235f38097cd8': {'status': 'Active', 'alerts': [], 'isoprefix': '', 'version': 5, 'disktotal': '1924279566336', 'diskfree': '1728397180928'}}} from=::ffff:HOST.IP.ADDRESS,36356, task_id=1c8dc779-dc03-443f-8e87-610c3bb1775a (api:54)
2021-11-08 10:19:28,910+0100 INFO (jsonrpc/2) [vdsm.api] START getAllTasksStatuses() from=::ffff:HOST.IP.ADDRESS,36340, task_id=6b1aa4e7-51d0-4444-abdd-d53703e34605 (api:48)
2021-11-08 10:19:28,911+0100 INFO (jsonrpc/2) [vdsm.api] FINISH getAllTasksStatuses return={'allTasksStatus': {'4bb54004-f96c-4f14-abca-bea477d866ea': {'taskID': '4bb54004-f96c-4f14-abca-bea477d866ea', 'taskState': 'finished', 'taskResult': 'success', 'code': 0, 'message': '1 jobs completed successfully'}}} from=::ffff:HOST.IP.ADDRESS,36340, task_id=6b1aa4e7-51d0-4444-abdd-d53703e34605 (api:54)
2021-11-08 10:19:29,124+0100 INFO (jsonrpc/3) [vdsm.api] START clearTask(taskID='4bb54004-f96c-4f14-abca-bea477d866ea') from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=c23a844a-bf91-48d5-ae8c-d0fb5f118812 (api:48)
2021-11-08 10:19:29,128+0100 INFO (jsonrpc/3) [vdsm.api] FINISH clearTask return=None from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=c23a844a-bf91-48d5-ae8c-d0fb5f118812 (api:54)
2021-11-08 10:19:30,768+0100 INFO (periodic/1) [vdsm.api] START getVolumeSize(sdUUID='e25db7d0-060a-4046-94b5-235f38097cd8', spUUID='609ff8db-09c5-435b-b2e5-023d57003138', imgUUID='72b67a6a-0ea3-4101-90cc-a18bcf774717', volUUID='4506da8b-d73a-46ba-a91e-07e786ae934b') from=internal, task_id=5e76433f-eaae-459f-a806-0d2ce2a5d4db (api:48)
2021-11-08 10:19:30,768+0100 INFO (periodic/0) [vdsm.api] START getVolumeSize(sdUUID='e25db7d0-060a-4046-94b5-235f38097cd8', spUUID='609ff8db-09c5-435b-b2e5-023d57003138', imgUUID='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', volUUID='5aad30c7-96f0-433d-95c8-2317e5f80045') from=internal, task_id=7d1a51e0-2ce4-4d77-a4b8-d6c6d48566cb (api:48)
2021-11-08 10:19:30,769+0100 INFO (periodic/0) [vdsm.api] FINISH getVolumeSize return={'apparentsize': '165236113408', 'truesize': '165235134464'} from=internal, task_id=7d1a51e0-2ce4-4d77-a4b8-d6c6d48566cb (api:54)
2021-11-08 10:19:30,769+0100 INFO (periodic/1) [vdsm.api] FINISH getVolumeSize return={'apparentsize': '8427077632', 'truesize': '8427077632'} from=internal, task_id=5e76433f-eaae-459f-a806-0d2ce2a5d4db (api:54)
2021-11-08 10:19:30,770+0100 INFO (jsonrpc/4) [api.host] START getAllVmStats() from=::ffff:HOST.IP.ADDRESS,36340 (api:48)
2021-11-08 10:19:30,772+0100 INFO (jsonrpc/4) [api.host] FINISH getAllVmStats return={'status': {'code': 0, 'message': 'Done'}, 'statsList': (suppressed)} from=::ffff:HOST.IP.ADDRESS,36340 (api:54)
2021-11-08 10:19:31,643+0100 INFO (jsonrpc/7) [api.virt] START snapshot(snapDrives=[{'imageID': '4d79c1da-34f0-44e3-8b92-c4bcb8524d83', 'baseVolumeID': '5aad30c7-96f0-433d-95c8-2317e5f80045', 'volumeID': '74e7188d-3727-4ed6-a2e5-dfa73b9e7da3', 'domainID': 'e25db7d0-060a-4046-94b5-235f38097cd8'}], snapMemory=None, frozen=False, jobUUID='40886d7f-adad-414e-9488-ab23e36d3b0c', timeout=30) from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, vmId=f1d56493-b5e0-480f-87a3-5e7f373712fa (api:48)
2021-11-08 10:19:31,644+0100 INFO (jsonrpc/7) [api.virt] FINISH snapshot return={'status': {'code': 0, 'message': 'Done'}} from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, vmId=f1d56493-b5e0-480f-87a3-5e7f373712fa (api:54)
2021-11-08 10:19:31,644+0100 INFO (virt/40886d7f) [root] Running job '40886d7f-adad-414e-9488-ab23e36d3b0c'... (jobs:185)
2021-11-08 10:19:31,645+0100 INFO (snap_abort/40886d7f) [virt.vm] (vmId='f1d56493-b5e0-480f-87a3-5e7f373712fa') Starting snapshot abort job, with check interval 60 (snapshot:628)
2021-11-08 10:19:31,762+0100 INFO (jsonrpc/5) [api.host] START getJobs(job_type='virt', job_ids=['40886d7f-adad-414e-9488-ab23e36d3b0c']) from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:48)
2021-11-08 10:19:31,762+0100 INFO (jsonrpc/5) [api.host] FINISH getJobs return={'jobs': {'40886d7f-adad-414e-9488-ab23e36d3b0c': {'id': '40886d7f-adad-414e-9488-ab23e36d3b0c', 'status': 'running', 'description': 'snapshot_vm', 'job_type': 'virt'}}, 'status': {'code': 0, 'message': 'Done'}} from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:54)
2021-11-08 10:19:32,711+0100 INFO (virt/40886d7f) [vdsm.api] START prepareImage(sdUUID='e25db7d0-060a-4046-94b5-235f38097cd8', spUUID='609ff8db-09c5-435b-b2e5-023d57003138', imgUUID='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', leafUUID='74e7188d-3727-4ed6-a2e5-dfa73b9e7da3', allowIllegal=False) from=internal, task_id=03e28303-b333-4132-84ce-9b24b4b931f4 (api:48)
2021-11-08 10:19:32,718+0100 INFO (virt/40886d7f) [vdsm.api] FINISH prepareImage error=Cannot prepare illegal volume: ('5cb3fe58-3e01-4d32-bc7c-5907a4f858a8',) from=internal, task_id=03e28303-b333-4132-84ce-9b24b4b931f4 (api:52)
2021-11-08 10:19:32,718+0100 ERROR (virt/40886d7f) [storage.TaskManager.Task] (Task='03e28303-b333-4132-84ce-9b24b4b931f4') Unexpected error (task:877)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/storage/task.py", line 884, in _run
return fn(*args, **kargs)
File "<decorator-gen-167>", line 2, in prepareImage
File "/usr/lib/python3.6/site-packages/vdsm/common/api.py", line 50, in method
ret = func(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/vdsm/storage/hsm.py", line 3178, in prepareImage
raise se.prepareIllegalVolumeError(volUUID)
vdsm.storage.exception.prepareIllegalVolumeError: Cannot prepare illegal volume: ('5cb3fe58-3e01-4d32-bc7c-5907a4f858a8',)
2021-11-08 10:19:32,718+0100 INFO (virt/40886d7f) [storage.TaskManager.Task] (Task='03e28303-b333-4132-84ce-9b24b4b931f4') aborting: Task is aborted: "value=Cannot prepare illegal volume: ('5cb3fe58-3e01-4d32-bc7c-5907a4f858a8',) abortedcode=227" (task:1182)
2021-11-08 10:19:32,718+0100 ERROR (virt/40886d7f) [storage.Dispatcher] FINISH prepareImage error=Cannot prepare illegal volume: ('5cb3fe58-3e01-4d32-bc7c-5907a4f858a8',) (dispatcher:83)
2021-11-08 10:19:32,718+0100 ERROR (virt/40886d7f) [virt.vm] (vmId='f1d56493-b5e0-480f-87a3-5e7f373712fa') unable to prepare the volume path for disk sda (snapshot:392)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/virt/jobs/snapshot.py", line 389, in snapshot
self._vm.cif.prepareVolumePath(new_drives[vm_dev_name])
File "/usr/lib/python3.6/site-packages/vdsm/clientIF.py", line 430, in prepareVolumePath
raise vm.VolumeError(drive)
vdsm.virt.vm.VolumeError: Bad volume specification {'device': 'disk', 'domainID': 'e25db7d0-060a-4046-94b5-235f38097cd8', 'imageID': '4d79c1da-34f0-44e3-8b92-c4bcb8524d83', 'volumeID': '74e7188d-3727-4ed6-a2e5-dfa73b9e7da3', 'type': 'disk', 'diskType': 'file', 'poolID': '609ff8db-09c5-435b-b2e5-023d57003138', 'name': 'sda', 'format': 'cow'}
2021-11-08 10:19:32,719+0100 INFO (virt/40886d7f) [vdsm.api] START teardownImage(sdUUID='e25db7d0-060a-4046-94b5-235f38097cd8', spUUID='609ff8db-09c5-435b-b2e5-023d57003138', imgUUID='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', volUUID=None) from=internal, task_id=0b042aba-7f85-4de4-98f0-e4dce1df5f34 (api:48)
2021-11-08 10:19:32,720+0100 INFO (virt/40886d7f) [storage.StorageDomain] Removing image rundir link '/run/vdsm/storage/e25db7d0-060a-4046-94b5-235f38097cd8/4d79c1da-34f0-44e3-8b92-c4bcb8524d83' (fileSD:601)
2021-11-08 10:19:32,720+0100 INFO (virt/40886d7f) [vdsm.api] FINISH teardownImage return=None from=internal, task_id=0b042aba-7f85-4de4-98f0-e4dce1df5f34 (api:54)
2021-11-08 10:19:33,694+0100 ERROR (snap_abort/40886d7f) [virt.vm] (vmId='f1d56493-b5e0-480f-87a3-5e7f373712fa') Snapshot job didn't start on the domain (snapshot:639)
2021-11-08 10:19:33,695+0100 ERROR (virt/40886d7f) [root] Job '40886d7f-adad-414e-9488-ab23e36d3b0c' failed (jobs:223)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/virt/jobs/snapshot.py", line 389, in snapshot
self._vm.cif.prepareVolumePath(new_drives[vm_dev_name])
File "/usr/lib/python3.6/site-packages/vdsm/clientIF.py", line 430, in prepareVolumePath
raise vm.VolumeError(drive)
vdsm.virt.vm.VolumeError: Bad volume specification {'device': 'disk', 'domainID': 'e25db7d0-060a-4046-94b5-235f38097cd8', 'imageID': '4d79c1da-34f0-44e3-8b92-c4bcb8524d83', 'volumeID': '74e7188d-3727-4ed6-a2e5-dfa73b9e7da3', 'type': 'disk', 'diskType': 'file', 'poolID': '609ff8db-09c5-435b-b2e5-023d57003138', 'name': 'sda', 'format': 'cow'}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/virt/jobs/snapshot.py", line 122, in _run
snap.snapshot()
File "/usr/lib/python3.6/site-packages/vdsm/virt/jobs/snapshot.py", line 394, in snapshot
raise exception.SnapshotFailed()
vdsm.common.exception.SnapshotFailed: Snapshot failed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/jobs.py", line 159, in run
self._run()
File "/usr/lib/python3.6/site-packages/vdsm/virt/jobs/snapshot.py", line 132, in _run
raise exception.SnapshotFailed()
vdsm.common.exception.SnapshotFailed: Snapshot failed
2021-11-08 10:19:33,696+0100 INFO (virt/40886d7f) [root] Job '40886d7f-adad-414e-9488-ab23e36d3b0c' will be deleted in 3600 seconds (jobs:251)
2021-11-08 10:19:34,197+0100 INFO (jsonrpc/0) [api.host] START getStats() from=::ffff:HOST.IP.ADDRESS,36340 (api:48)
2021-11-08 10:19:34,210+0100 INFO (jsonrpc/0) [vdsm.api] START repoStats(domains=()) from=::ffff:HOST.IP.ADDRESS,36340, task_id=a6a26818-4a4f-4b31-93be-f50dccce445f (api:48)
2021-11-08 10:19:34,210+0100 INFO (jsonrpc/0) [vdsm.api] FINISH repoStats return={'e25db7d0-060a-4046-94b5-235f38097cd8': {'code': 0, 'lastCheck': '0.5', 'delay': '0.000138172', 'valid': True, 'version': 5, 'acquired': True, 'actual': True}, 'dd1ac97a-20d9-4232-88cc-fbf53410ed5a': {'code': 0, 'lastCheck': '0.5', 'delay': '0.00012691', 'valid': True, 'version': 5, 'acquired': True, 'actual': True}} from=::ffff:HOST.IP.ADDRESS,36340, task_id=a6a26818-4a4f-4b31-93be-f50dccce445f (api:54)
2021-11-08 10:19:34,210+0100 INFO (jsonrpc/0) [vdsm.api] START multipath_health() from=::ffff:HOST.IP.ADDRESS,36340, task_id=b03ffa07-1b63-4d99-a283-4f4b887fb362 (api:48)
2021-11-08 10:19:34,211+0100 INFO (jsonrpc/0) [vdsm.api] FINISH multipath_health return={} from=::ffff:HOST.IP.ADDRESS,36340, task_id=b03ffa07-1b63-4d99-a283-4f4b887fb362 (api:54)
2021-11-08 10:19:34,215+0100 INFO (jsonrpc/0) [api.host] FINISH getStats return={'status': {'code': 0, 'message': 'Done'}, 'info': (suppressed)} from=::ffff:HOST.IP.ADDRESS,36340 (api:54)
2021-11-08 10:19:34,891+0100 INFO (periodic/0) [vdsm.api] START repoStats(domains=()) from=internal, task_id=339a38e1-2524-4281-8a0b-a6ffd9bad9ad (api:48)
2021-11-08 10:19:34,891+0100 INFO (periodic/0) [vdsm.api] FINISH repoStats return={'e25db7d0-060a-4046-94b5-235f38097cd8': {'code': 0, 'lastCheck': '1.2', 'delay': '0.000138172', 'valid': True, 'version': 5, 'acquired': True, 'actual': True}, 'dd1ac97a-20d9-4232-88cc-fbf53410ed5a': {'code': 0, 'lastCheck': '1.2', 'delay': '0.00012691', 'valid': True, 'version': 5, 'acquired': True, 'actual': True}} from=internal, task_id=339a38e1-2524-4281-8a0b-a6ffd9bad9ad (api:54)
2021-11-08 10:19:35,456+0100 INFO (jsonrpc/6) [api.host] START getJobs(job_type='virt', job_ids=['40886d7f-adad-414e-9488-ab23e36d3b0c']) from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:48)
2021-11-08 10:19:35,456+0100 INFO (jsonrpc/6) [api.host] FINISH getJobs return={'jobs': {'40886d7f-adad-414e-9488-ab23e36d3b0c': {'id': '40886d7f-adad-414e-9488-ab23e36d3b0c', 'status': 'failed', 'description': 'snapshot_vm', 'job_type': 'virt', 'error': {'code': 48, 'message': 'Snapshot failed'}}}, 'status': {'code': 0, 'message': 'Done'}} from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:54)
2021-11-08 10:19:37,609+0100 INFO (jsonrpc/1) [vdsm.api] START getSpmStatus(spUUID='609ff8db-09c5-435b-b2e5-023d57003138') from=::ffff:HOST.IP.ADDRESS,36340, task_id=85cf07e8-f8c3-4b8d-bdc7-2275d41eeffd (api:48)
2021-11-08 10:19:37,612+0100 INFO (jsonrpc/1) [vdsm.api] FINISH getSpmStatus return={'spm_st': {'spmStatus': 'SPM', 'spmLver': 4, 'spmId': 1}} from=::ffff:HOST.IP.ADDRESS,36340, task_id=85cf07e8-f8c3-4b8d-bdc7-2275d41eeffd (api:54)
2021-11-08 10:19:37,648+0100 INFO (jsonrpc/2) [vdsm.api] START getStoragePoolInfo(spUUID='609ff8db-09c5-435b-b2e5-023d57003138') from=::ffff:HOST.IP.ADDRESS,36356, task_id=42e7e39f-990b-4b6d-9eac-69d6da773d85 (api:48)
2021-11-08 10:19:37,653+0100 INFO (jsonrpc/2) [vdsm.api] FINISH getStoragePoolInfo return={'info': {'domains': 'dd1ac97a-20d9-4232-88cc-fbf53410ed5a:Active,e25db7d0-060a-4046-94b5-235f38097cd8:Active', 'isoprefix': '', 'lver': 4, 'master_uuid': 'e25db7d0-060a-4046-94b5-235f38097cd8', 'master_ver': 1, 'name': 'No Description', 'pool_status': 'connected', 'spm_id': 1, 'type': 'NFS', 'version': '5'}, 'dominfo': {'dd1ac97a-20d9-4232-88cc-fbf53410ed5a': {'status': 'Active', 'alerts': [], 'isoprefix': '', 'version': 5, 'disktotal': '1999421571072', 'diskfree': '1182995054592'}, 'e25db7d0-060a-4046-94b5-235f38097cd8': {'status': 'Active', 'alerts': [], 'isoprefix': '', 'version': 5, 'disktotal': '1924279566336', 'diskfree': '1728395083776'}}} from=::ffff:HOST.IP.ADDRESS,36356, task_id=42e7e39f-990b-4b6d-9eac-69d6da773d85 (api:54)
2021-11-08 10:19:38,225+0100 INFO (jsonrpc/3) [api.host] START getAllVmStats() from=::1,36296 (api:48)
2021-11-08 10:19:38,226+0100 INFO (jsonrpc/3) [api.host] FINISH getAllVmStats return={'status': {'code': 0, 'message': 'Done'}, 'statsList': (suppressed)} from=::1,36296 (api:54)
2021-11-08 10:19:38,231+0100 INFO (jsonrpc/4) [api.host] START getAllVmIoTunePolicies() from=::1,36296 (api:48)
2021-11-08 10:19:38,232+0100 INFO (jsonrpc/4) [api.host] FINISH getAllVmIoTunePolicies return={'status': {'code': 0, 'message': 'Done'}, 'io_tune_policies_dict': {'fceee8e2-b6c5-4e4f-ad4d-b4a866a3992d': {'policy': [], 'current_values': [{'name': 'vda', 'path': '/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/72b67a6a-0ea3-4101-90cc-a18bcf774717/4506da8b-d73a-46ba-a91e-07e786ae934b', 'ioTune': {'total_bytes_sec': 0, 'read_bytes_sec': 0, 'write_bytes_sec': 0, 'total_iops_sec': 0, 'write_iops_sec': 0, 'read_iops_sec': 0}}]}, 'f1d56493-b5e0-480f-87a3-5e7f373712fa': {'policy': [], 'current_values': [{'name': 'sda', 'path': '/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/5aad30c7-96f0-433d-95c8-2317e5f80045', 'ioTune': {'total_bytes_sec': 0, 'read_bytes_sec': 0, 'write_bytes_sec': 0, 'total_iops_sec': 0, 'write_iops_sec': 0, 'read_iops_sec': 0}}]}}} from=::1,36296 (api:54)
2021-11-08 10:19:39,256+0100 INFO (jsonrpc/7) [vdsm.api] START revertTask(taskID='4bb54004-f96c-4f14-abca-bea477d866ea') from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=858e0bc1-cb25-4555-b387-320f11720071 (api:48)
2021-11-08 10:19:39,256+0100 INFO (jsonrpc/7) [vdsm.api] FINISH revertTask error=Task id unknown: ('4bb54004-f96c-4f14-abca-bea477d866ea',) from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=858e0bc1-cb25-4555-b387-320f11720071 (api:52)
2021-11-08 10:19:39,256+0100 ERROR (jsonrpc/7) [storage.TaskManager.Task] (Task='858e0bc1-cb25-4555-b387-320f11720071') Unexpected error (task:877)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/storage/task.py", line 884, in _run
return fn(*args, **kargs)
File "<decorator-gen-113>", line 2, in revertTask
File "/usr/lib/python3.6/site-packages/vdsm/common/api.py", line 50, in method
ret = func(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/vdsm/storage/hsm.py", line 2267, in revertTask
return self.taskMng.revertTask(taskID=taskID)
File "/usr/lib/python3.6/site-packages/vdsm/storage/taskManager.py", line 161, in revertTask
t = self._getTask(taskID)
File "/usr/lib/python3.6/site-packages/vdsm/storage/taskManager.py", line 85, in _getTask
raise se.UnknownTask(taskID)
vdsm.storage.exception.UnknownTask: Task id unknown: ('4bb54004-f96c-4f14-abca-bea477d866ea',)
2021-11-08 10:19:39,256+0100 INFO (jsonrpc/7) [storage.TaskManager.Task] (Task='858e0bc1-cb25-4555-b387-320f11720071') aborting: Task is aborted: "value=Task id unknown: ('4bb54004-f96c-4f14-abca-bea477d866ea',) abortedcode=100" (task:1182)
2021-11-08 10:19:39,256+0100 ERROR (jsonrpc/7) [storage.Dispatcher] FINISH revertTask error=Task id unknown: ('4bb54004-f96c-4f14-abca-bea477d866ea',) (dispatcher:83)
2021-11-08 10:19:39,256+0100 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer] RPC call Task.revert failed (error 401) in 0.00 seconds (__init__:312)
2021-11-08 10:19:39,403+0100 INFO (jsonrpc/5) [api.host] START dumpxmls(vmList=['f1d56493-b5e0-480f-87a3-5e7f373712fa']) from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:48)
2021-11-08 10:19:39,403+0100 INFO (jsonrpc/5) [api.host] FINISH dumpxmls return={'domxmls': {'f1d56493-b5e0-480f-87a3-5e7f373712fa': '<domain type=\'kvm\' id=\'7\' xmlns:qemu=\'http://libvirt.org/schemas/domain/qemu/1.0\'>\n <name>VM.NAME.COM</name>\n <uuid>f1d56493-b5e0-480f-87a3-5e7f373712fa</uuid>\n <metadata xmlns:ns1="http://ovirt.org/vm/tune/1.0" xmlns:ovirt-vm="http://ovirt.org/vm/1.0">\n <ns1:qos/>\n <ovirt-vm:vm xmlns:ovirt-vm="http://ovirt.org/vm/1.0">\n <ovirt-vm:balloonTarget type="int">60817408</ovirt-vm:balloonTarget>\n <ovirt-vm:clusterVersion>4.4</ovirt-vm:clusterVersion>\n <ovirt-vm:destroy_on_reboot type="bool">False</ovirt-vm:destroy_on_reboot>\n <ovirt-vm:jobs>{}</ovirt-vm:jobs>\n <ovirt-vm:launchPaused>false</ovirt-vm:launchPaused>\n <ovirt-vm:memGuaranteedSize type="int">59392</ovirt-vm:memGuaranteedSize>\n <ovirt-vm:minGuaranteedMemoryMb type="int">59392</ovirt-vm:minGuaranteedMemoryMb>\n <ovirt-vm:resumeBehavior>auto_resume</o
virt-vm:resumeBehavior>\n <ovirt-vm:snapshot_job>{"startTime": "6020275.110714603", "timeout": "1800", "abort": true, "completed": false, "jobUUID": "b05d072b-374f-4321-8949-d160d9797e17", "frozen": false, "memoryParams": {}}</ovirt-vm:snapshot_job>\n <ovirt-vm:startTime type="float">1630936802.2939482</ovirt-vm:startTime>\n <ovirt-vm:device mac_address="56:6f:96:b1:00:4f">\n <ovirt-vm:network>onb6abac0adf5e4</ovirt-vm:network>\n <ovirt-vm:custom>\n <ovirt-vm:plugin_type>OVIRT_PROVIDER_OVN</ovirt-vm:plugin_type>\n <ovirt-vm:provider_type>EXTERNAL_NETWORK</ovirt-vm:provider_type>\n <ovirt-vm:queues>4</ovirt-vm:queues>\n <ovirt-vm:vnic_id>0e0836da-1679-49c7-9fdb-1ba556ee7ece</ovirt-vm:vnic_id>\n </ovirt-vm:custom>\n </ovirt-vm:device>\n <ovirt-vm:device mac_address="02:00:00:b8:80:c5">\n <ovirt-vm:network>ovirtmgmt</ovirt-vm:network>\n <ovirt-vm:custom>\n <ovirt-vm:queues>4</ovirt-
vm:queues>\n </ovirt-vm:custom>\n </ovirt-vm:device>\n <ovirt-vm:device devtype="disk" name="sda">\n <ovirt-vm:domainID>e25db7d0-060a-4046-94b5-235f38097cd8</ovirt-vm:domainID>\n <ovirt-vm:guestName>\\\\.\\PHYSICALDRIVE0</ovirt-vm:guestName>\n <ovirt-vm:imageID>4d79c1da-34f0-44e3-8b92-c4bcb8524d83</ovirt-vm:imageID>\n <ovirt-vm:managed type="bool">False</ovirt-vm:managed>\n <ovirt-vm:poolID>609ff8db-09c5-435b-b2e5-023d57003138</ovirt-vm:poolID>\n <ovirt-vm:volumeID>5aad30c7-96f0-433d-95c8-2317e5f80045</ovirt-vm:volumeID>\n <ovirt-vm:volumeChain>\n <ovirt-vm:volumeChainNode>\n <ovirt-vm:domainID>e25db7d0-060a-4046-94b5-235f38097cd8</ovirt-vm:domainID>\n <ovirt-vm:imageID>4d79c1da-34f0-44e3-8b92-c4bcb8524d83</ovirt-vm:imageID>\n <ovirt-vm:leaseOffset type="int">0</ovirt-vm:leaseOffset>\n <ovirt-vm:leasePath>/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db
7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/5aad30c7-96f0-433d-95c8-2317e5f80045.lease</ovirt-vm:leasePath>\n <ovirt-vm:path>/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/5aad30c7-96f0-433d-95c8-2317e5f80045</ovirt-vm:path>\n <ovirt-vm:volumeID>5aad30c7-96f0-433d-95c8-2317e5f80045</ovirt-vm:volumeID>\n </ovirt-vm:volumeChainNode>\n </ovirt-vm:volumeChain>\n </ovirt-vm:device>\n <ovirt-vm:device devtype="disk" name="hdc">\n <ovirt-vm:managed type="bool">False</ovirt-vm:managed>\n </ovirt-vm:device>\n</ovirt-vm:vm>\n </metadata>\n <memory unit=\'KiB\'>60817408</memory>\n <currentMemory unit=\'KiB\'>60817408</currentMemory>\n <vcpu placement=\'static\' current=\'7\'>112</vcpu>\n <iothreads>1</iothreads>\n <resource>\n <partition>/machine</partition>\n </resource>\n <sysinfo type=\'smbios\'>\n <system>\
n <entry name=\'manufacturer\'>oVirt</entry>\n <entry name=\'product\'>RHEL</entry>\n <entry name=\'version\'>8.4-1.2105.el8</entry>\n <entry name=\'serial\'>00000000-0000-0000-0000-0cc47ada59e8</entry>\n <entry name=\'uuid\'>f1d56493-b5e0-480f-87a3-5e7f373712fa</entry>\n <entry name=\'family\'>oVirt</entry>\n </system>\n </sysinfo>\n <os>\n <type arch=\'x86_64\' machine=\'pc-i440fx-rhel7.6.0\'>hvm</type>\n <bios useserial=\'yes\'/>\n <smbios mode=\'sysinfo\'/>\n </os>\n <features>\n <acpi/>\n </features>\n <cpu mode=\'custom\' match=\'exact\' check=\'full\'>\n <model fallback=\'forbid\'>Nehalem</model>\n <topology sockets=\'16\' dies=\'1\' cores=\'7\' threads=\'1\'/>\n <feature policy=\'require\' name=\'vme\'/>\n <feature policy=\'require\' name=\'x2apic\'/>\n <feature policy=\'require\' name=\'hypervisor\'/>\n <numa>\n <cell id=\'0\' cpus=\'0-111\' memory=\'60817408\' unit=\'KiB\'/>\n </numa>\n </cpu>\n
<clock offset=\'variable\' adjustment=\'3551\' basis=\'utc\'>\n <timer name=\'rtc\' tickpolicy=\'catchup\'/>\n <timer name=\'pit\' tickpolicy=\'delay\'/>\n <timer name=\'hpet\' present=\'no\'/>\n </clock>\n <on_poweroff>destroy</on_poweroff>\n <on_reboot>restart</on_reboot>\n <on_crash>destroy</on_crash>\n <pm>\n <suspend-to-mem enabled=\'no\'/>\n <suspend-to-disk enabled=\'no\'/>\n </pm>\n <devices>\n <emulator>/usr/libexec/qemu-kvm</emulator>\n <disk type=\'file\' device=\'cdrom\'>\n <driver name=\'qemu\' error_policy=\'report\'/>\n <source startupPolicy=\'optional\'/>\n <target dev=\'hdc\' bus=\'ide\'/>\n <readonly/>\n <alias name=\'ua-c3981d65-ac78-491e-b6ba-58511e19dcd9\'/>\n <address type=\'drive\' controller=\'0\' bus=\'1\' target=\'0\' unit=\'0\'/>\n </disk>\n <disk type=\'file\' device=\'disk\' snapshot=\'no\'>\n <driver name=\'qemu\' type=\'qcow2\' cache=\'none\' error_policy=\'stop\' io=\'threads\'/>\n
<source file=\'/rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/5aad30c7-96f0-433d-95c8-2317e5f80045\' index=\'1\'>\n <seclabel model=\'dac\' relabel=\'no\'/>\n </source>\n <backingStore/>\n <target dev=\'sda\' bus=\'scsi\'/>\n <serial>4d79c1da-34f0-44e3-8b92-c4bcb8524d83</serial>\n <boot order=\'1\'/>\n <alias name=\'ua-4d79c1da-34f0-44e3-8b92-c4bcb8524d83\'/>\n <address type=\'drive\' controller=\'0\' bus=\'0\' target=\'0\' unit=\'0\'/>\n </disk>\n <controller type=\'virtio-serial\' index=\'0\' ports=\'16\'>\n <alias name=\'ua-74456cae-661e-4fc9-abbf-426c8805d8ce\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x03\' function=\'0x0\'/>\n </controller>\n <controller type=\'scsi\' index=\'0\' model=\'virtio-scsi\'>\n <driver iothread=\'1\'/>\n <alias name=\'ua-96d2cae7-2954-4f05-a4a2-b9dd701daf9a\'/>\n <addres
s type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x04\' function=\'0x0\'/>\n </controller>\n <controller type=\'usb\' index=\'0\' model=\'piix3-uhci\'>\n <alias name=\'ua-aa367f25-227d-46fb-9b1a-b1e98e680404\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x01\' function=\'0x2\'/>\n </controller>\n <controller type=\'pci\' index=\'0\' model=\'pci-root\'>\n <alias name=\'pci.0\'/>\n </controller>\n <controller type=\'ide\' index=\'0\'>\n <alias name=\'ide\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x01\' function=\'0x1\'/>\n </controller>\n <interface type=\'bridge\'>\n <mac address=\'02:00:00:b8:80:c5\'/>\n <source bridge=\'ovirtmgmt\'/>\n <target dev=\'vnet5\'/>\n <model type=\'virtio\'/>\n <driver name=\'vhost\' queues=\'4\'/>\n <filterref filter=\'vdsm-no-mac-spoofing\'/>\n <link state=\'up\'/>\n <mtu size=\'1500\'/>\n <alias name=\'ua-3779fa3
5-e0b0-421d-a1c9-2642b52a1add\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x06\' function=\'0x0\'/>\n </interface>\n <interface type=\'bridge\'>\n <mac address=\'56:6f:96:b1:00:4f\'/>\n <source bridge=\'br-int\'/>\n <virtualport type=\'openvswitch\'>\n <parameters interfaceid=\'0e0836da-1679-49c7-9fdb-1ba556ee7ece\'/>\n </virtualport>\n <target dev=\'vnet6\'/>\n <model type=\'virtio\'/>\n <driver name=\'vhost\' queues=\'4\'/>\n <link state=\'up\'/>\n <mtu size=\'1442\'/>\n <alias name=\'ua-07ea2855-879d-4f35-a361-40cb8adac18c\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x08\' function=\'0x0\'/>\n </interface>\n <serial type=\'unix\'>\n <source mode=\'bind\' path=\'/var/run/ovirt-vmconsole-console/f1d56493-b5e0-480f-87a3-5e7f373712fa.sock\'/>\n <target type=\'isa-serial\' port=\'0\'>\n <model name=\'isa-serial\'/>\n </target>\n <alias n
ame=\'serial0\'/>\n </serial>\n <console type=\'unix\'>\n <source mode=\'bind\' path=\'/var/run/ovirt-vmconsole-console/f1d56493-b5e0-480f-87a3-5e7f373712fa.sock\'/>\n <target type=\'serial\' port=\'0\'/>\n <alias name=\'serial0\'/>\n </console>\n <channel type=\'unix\'>\n <source mode=\'bind\' path=\'/var/lib/libvirt/qemu/channels/f1d56493-b5e0-480f-87a3-5e7f373712fa.ovirt-guest-agent.0\'/>\n <target type=\'virtio\' name=\'ovirt-guest-agent.0\' state=\'connected\'/>\n <alias name=\'channel0\'/>\n <address type=\'virtio-serial\' controller=\'0\' bus=\'0\' port=\'1\'/>\n </channel>\n <channel type=\'unix\'>\n <source mode=\'bind\' path=\'/var/lib/libvirt/qemu/channels/f1d56493-b5e0-480f-87a3-5e7f373712fa.org.qemu.guest_agent.0\'/>\n <target type=\'virtio\' name=\'org.qemu.guest_agent.0\' state=\'disconnected\'/>\n <alias name=\'channel1\'/>\n <address type=\'virtio-serial\' controller=\'0\' bus=\'0\' port=\'2\'/
>\n </channel>\n <channel type=\'spicevmc\'>\n <target type=\'virtio\' name=\'com.redhat.spice.0\' state=\'connected\'/>\n <alias name=\'channel2\'/>\n <address type=\'virtio-serial\' controller=\'0\' bus=\'0\' port=\'3\'/>\n </channel>\n <input type=\'tablet\' bus=\'usb\'>\n <alias name=\'input0\'/>\n <address type=\'usb\' bus=\'0\' port=\'1\'/>\n </input>\n <input type=\'mouse\' bus=\'ps2\'>\n <alias name=\'input1\'/>\n </input>\n <input type=\'keyboard\' bus=\'ps2\'>\n <alias name=\'input2\'/>\n </input>\n <graphics type=\'spice\' port=\'5903\' tlsPort=\'5904\' autoport=\'yes\' listen=\'51.255.71.19\' passwdValidTo=\'1970-01-01T00:00:01\'>\n <listen type=\'network\' address=\'51.255.71.19\' network=\'vdsm-ovirtmgmt\'/>\n <channel name=\'main\' mode=\'secure\'/>\n <channel name=\'display\' mode=\'secure\'/>\n <channel name=\'inputs\' mode=\'secure\'/>\n <channel name=\'cursor\' mode=\'secure\
'/>\n <channel name=\'playback\' mode=\'secure\'/>\n <channel name=\'record\' mode=\'secure\'/>\n <channel name=\'smartcard\' mode=\'secure\'/>\n <channel name=\'usbredir\' mode=\'secure\'/>\n </graphics>\n <graphics type=\'vnc\' port=\'5905\' autoport=\'yes\' listen=\'51.255.71.19\' keymap=\'it\' passwdValidTo=\'2021-09-01T15:26:11\'>\n <listen type=\'network\' address=\'51.255.71.19\' network=\'vdsm-ovirtmgmt\'/>\n </graphics>\n <video>\n <model type=\'qxl\' ram=\'65536\' vram=\'8192\' vgamem=\'16384\' heads=\'1\' primary=\'yes\'/>\n <alias name=\'ua-064c88b3-28f1-4eb1-b27d-3daee11c8f86\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x02\' function=\'0x0\'/>\n </video>\n <watchdog model=\'i6300esb\' action=\'reset\'>\n <alias name=\'ua-4155692c-8795-4eb8-996f-eea2ae89ae77\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x07\' function=\'0x0\'/>\n </watchdog>\n <memballoon
model=\'virtio\'>\n <stats period=\'5\'/>\n <alias name=\'ua-4b200768-8edc-418b-9f9e-f830dc1678ea\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x05\' function=\'0x0\'/>\n </memballoon>\n <rng model=\'virtio\'>\n <backend model=\'random\'>/dev/urandom</backend>\n <alias name=\'ua-3c380bd8-2263-4fdb-86fc-8d6f24687f84\'/>\n <address type=\'pci\' domain=\'0x0000\' bus=\'0x00\' slot=\'0x09\' function=\'0x0\'/>\n </rng>\n </devices>\n <seclabel type=\'dynamic\' model=\'dac\' relabel=\'yes\'>\n <label>+107:+107</label>\n <imagelabel>+107:+107</imagelabel>\n </seclabel>\n <qemu:capabilities>\n <qemu:add capability=\'blockdev\'/>\n <qemu:add capability=\'incremental-backup\'/>\n </qemu:capabilities>\n</domain>\n'}, 'status': {'code': 0, 'message': 'Done'}} from=::ffff:HOST.IP.ADDRESS,36340, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9 (api:54)
2021-11-08 10:19:39,585+0100 INFO (jsonrpc/0) [vdsm.api] START deleteVolume(sdUUID='e25db7d0-060a-4046-94b5-235f38097cd8', spUUID='609ff8db-09c5-435b-b2e5-023d57003138', imgUUID='4d79c1da-34f0-44e3-8b92-c4bcb8524d83', volumes=['74e7188d-3727-4ed6-a2e5-dfa73b9e7da3'], postZero='false', force='false', discard=False) from=::ffff:HOST.IP.ADDRESS,36356, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=18850219-9586-4648-b3fb-be7edd4b6b28 (api:48)
2021-11-08 10:19:39,625+0100 INFO (jsonrpc/0) [vdsm.api] FINISH deleteVolume return=None from=::ffff:HOST.IP.ADDRESS,36356, flow_id=469dbfd8-2e2f-4cb3-84b1-d456acc78fd9, task_id=18850219-9586-4648-b3fb-be7edd4b6b28 (api:54)
2021-11-08 10:19:39,643+0100 INFO (tasks/7) [storage.ThreadPool.WorkerThread] START task 18850219-9586-4648-b3fb-be7edd4b6b28 (cmd=<bound method Task.commit of <vdsm.storage.task.Task object at 0x7f78e6db80b8>>, args=None) (threadPool:146)
2021-11-08 10:19:39,675+0100 INFO (tasks/7) [storage.Volume] Request to delete volume 74e7188d-3727-4ed6-a2e5-dfa73b9e7da3 (fileVolume:600)
2021-11-08 10:19:39,688+0100 INFO (tasks/7) [storage.VolumeManifest] sdUUID=e25db7d0-060a-4046-94b5-235f38097cd8 imgUUID=4d79c1da-34f0-44e3-8b92-c4bcb8524d83 volUUID = 74e7188d-3727-4ed6-a2e5-dfa73b9e7da3 legality = ILLEGAL (volume:404)
2021-11-08 10:19:39,715+0100 INFO (tasks/7) [storage.VolumeManifest] Removing: /rhev/data-center/mnt/OVIRT-HOST-44:_data/e25db7d0-060a-4046-94b5-235f38097cd8/images/4d79c1da-34f0-44e3-8b92-c4bcb8524d83/74e7188d-3727-4ed6-a2e5-dfa73b9e7da3.meta (fileVolume:286)
2021-11-08 10:19:39,739+0100 INFO (tasks/7) [storage.ThreadPool.WorkerThread] FINISH task 18850219-9586-4648-b3fb-be7edd4b6b28 (threadPool:148)
2021-11-08 10:19:45,963+0100 INFO (jsonrpc/6) [api.host] START getAllVmStats() from=::ffff:HOST.IP.ADDRESS,36340 (api:48)
3 years
Re: Change host IP on VM
by Derek Atkins
When I did it, I set the SpiceProxyDefault Engine configuration option.
(Use engine-config -s SpiceProxyDefault=http://public-ip:port)
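A minimal sketch of the full sequence on the engine machine (the proxy
address and port below are placeholders; engine-config changes only take
effect after the engine service is restarted):

engine-config -s SpiceProxyDefault=http://PUBLIC.IP:3128   # placeholder proxy URL and port
engine-config -g SpiceProxyDefault                         # confirm the stored value
systemctl restart ovirt-engine                             # pick up the new setting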
-derek
On Tue, November 9, 2021 7:04 am, Staniforth, Paul wrote:
>
> Hello,
> you could create a separate display network or install a spice
> proxy.
>
> Regards,
>
> Paul S.
> ________________________________
> From: sekevgeniyig(a)gmail.com <sekevgeniyig(a)gmail.com>
> Sent: 09 November 2021 11:43
> To: users(a)ovirt.org <users(a)ovirt.org>
> Subject: [ovirt-users] Change host IP on VM
>
> Hi, can you please help me?
> I deployed oVirt on an AWS instance and created a VM, but when the .vv
> file is generated, the private IP is put in the host field instead of the
> public one, which is why remote-viewer does not connect to the VM. If I
> change the host from the private IP to the public one, everything works.
> Where and how can I change this field?
--
Derek Atkins 617-623-3745
derek(a)ihtfp.com www.ihtfp.com
Computer and Internet Security Consultant
3 years
Change host IP on VM
by sekevgeniyig@gmail.com
Hi, can you please help me?
I deployed oVirt on an AWS instance and created a VM, but when the .vv file is generated, the private IP is put in the host field instead of the public one, which is why remote-viewer does not connect to the VM. If I change the host from the private IP to the public one, everything works. Where and how can I change this field?
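For context, the host being asked about is the one the engine writes into
the [virt-viewer] section of the downloaded .vv file. A trimmed, hypothetical
example of what such a file looks like (all values are placeholders):

[virt-viewer]
type=spice
# the engine fills in its own view of the host address here; with a private
# AWS address, a remote-viewer outside the VPC cannot reach it
host=10.0.0.12
port=5903
tls-port=5904
password=...

If a SPICE proxy is configured on the engine (see the SpiceProxyDefault reply
above), the generated file should also gain a proxy= entry pointing at a
reachable address, which is usually cleaner than editing the host field by
hand.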
3 years
after restore check for upgrade fails
by p.staniforth@leedsbeckett.ac.uk
Hello
After doing a backup and restoring to a new oVirt management server, the check for upgrade of hosts fails.
Reinstalling a host and enrolling its certificate also fail.
The error message is:
Failed to check for available updates on host node1.example.com with message 'Failed to run check-update of host 'node1.example.com'. Error: Failed to read the runner-service response. Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]'.
Removing a host and adding it back also fails.
The error message is:
Host node3.example.com installation failed. Failed to execute Ansible host-deploy role: Failed to read the runner-service response. Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]. Please check logs for more details: /var/log/ovirt-engine/ansible-runner-service.log.
Unfortunately the file /var/log/ovirt-engine/ansible-runner-service.log doesn't exist.
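The '<' at line 1, column 2 usually means the engine received an HTML page
(for example a web-server error page or a redirect) where it expected JSON
from the runner service. A rough first check on the new engine host, assuming
a default install (service and path names below may differ between 4.4.x
builds):

systemctl status ansible-runner-service      # is the runner service running at all?
tail -n 50 /var/log/httpd/error_log          # web-server errors around the failure, if proxied
ls /etc/ansible-runner-service/ 2>/dev/null  # runner-service config, if the package laid it down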
Regards,
Paul S.
3 years