Installed Packages
qemu-kvm.x86_64    2:0.12.1.2-2.415.el6_5.6    @updates
Available Packages
qemu-kvm.x86_64    2:0.12.1.2-2.415.el6_5.7    updates
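(For context, the listing above looks like plain yum output for qemu-kvm. A hedged sketch, assuming the RPMs from the Jenkins job Douglas asks about below have been downloaded to the host, of how to check for and switch to the qemu-kvm-rhev build; plain EL6 qemu-kvm generally lacks live-snapshot support:)

rpm -qa 'qemu-kvm*'                        # shows whether plain qemu-kvm or qemu-kvm-rhev is installed
# if only plain qemu-kvm shows up, something like this would install the downloaded -rhev RPMs:
yum localinstall qemu-kvm-rhev-*.rpm qemu-img-rhev-*.rpm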
2014-04-04 17:06 GMT+02:00 Kevin Tibi <kevintibi(a)hotmail.com>:
It's CentOS 6.5. Do I need to change my repos? I only have the EPEL and
oVirt repos.
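(A hedged way to double-check, using only the stock yum tools, which repos are actually enabled and which of them offer qemu-kvm builds:)

yum repolist enabled                   # lists every repo the host will pull from
yum --showduplicates list 'qemu-kvm*'  # shows which repo provides which qemu-kvm version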
2014-04-04 16:23 GMT+02:00 Douglas Schilling Landgraf <
dougsland(a)redhat.com>:
> Hi,
>
>
> On 04/04/2014 10:04 AM, Kevin Tibi wrote:
>
>> Yes, it's a live snapshot. Normal snapshots work.
>>
>
> Question:
> Is it an EL6 host? If yes, are you using qemu-kvm from:
> jenkins.ovirt.org/view/Packaging/job/qemu-kvm-rhev_create_rpms_el6/ ?
>
>
> Thanks!
>
>
>> How do I enable debug in vdsm?
>>
>> mom.conf :
>>
>> log: /var/log/vdsm/mom.log
>>
>> verbosity: info
>>
>> vdsm.conf :
>>
>> [root@host02 ~]# cat /etc/vdsm/vdsm.conf
>> [addresses]
>> management_port = 54321
>>
>> [vars]
>> ssl = true
>>
>>
>>
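(On the debug question above: a hedged sketch, assuming a stock EL6 vdsm install where the log levels are set in /etc/vdsm/logger.conf rather than vdsm.conf:)

# set level=DEBUG under [logger_root] and [logger_vds] in /etc/vdsm/logger.conf, then:
service vdsmd restart
tail -f /var/log/vdsm/vdsm.log   # reproduce the snapshot while this runs and attach the output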
>> 2014-04-04 15:27 GMT+02:00 Dafna Ron <dron(a)redhat.com>:
>>
>>
>> is this a live snapshot (while the VM is running)?
>> can you please make sure your vdsm log is in debug and attach the
>> full log?
>>
>> Thanks,
>> Dafna
>>
>>
>>
>> On 04/04/2014 02:23 PM, Michal Skrivanek wrote:
>>
>> On 4 Apr 2014, at 12:45, Kevin Tibi wrote:
>>
>> Hi,
>>
>> I have a problem when I try to snapshot a VM.
>>
>> are you running the right qemu/libvirt from the virt-preview repo?
>>
>> oVirt Engine self-hosted 3.4. Two nodes (host01 and host02).
>>
>> my engine.log :
>>
>> 2014-04-04 12:30:03,013 INFO  [org.ovirt.engine.core.bll.CreateAllSnapshotsFromVmCommand]
>> (org.ovirt.thread.pool-6-thread-24) Ending command successfully:
>> org.ovirt.engine.core.bll.CreateAllSnapshotsFromVmCommand
>>
>> 2014-04-04 12:30:03,028 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) START, SnapshotVDSCommand(HostName = host01,
>> HostId = fcb9a5cf-2064-42a5-99fe-dc56ea39ed81, vmId=cb038ccf-6c6f-475c-872f-ea812ff795a1),
>> log id: 36463977
>>
>> 2014-04-04 12:30:03,075 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) Failed in SnapshotVDS method
>>
>> 2014-04-04 12:30:03,076 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) Command
>> org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand return value
>> StatusOnlyReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=48, mMessage=Snapshot failed]]
>>
>> 2014-04-04 12:30:03,077 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) HostName = host01
>>
>> 2014-04-04 12:30:03,078 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) Command SnapshotVDSCommand(HostName = host01,
>> HostId = fcb9a5cf-2064-42a5-99fe-dc56ea39ed81, vmId=cb038ccf-6c6f-475c-872f-ea812ff795a1)
>> execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException:
>> Failed to SnapshotVDS, error = Snapshot failed, code = 48
>>
>> 2014-04-04 12:30:03,080 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SnapshotVDSCommand]
>> (org.ovirt.thread.pool-6-thread-24) FINISH, SnapshotVDSCommand, log id: 36463977
>>
>> 2014-04-04 12:30:03,083 WARN  [org.ovirt.engine.core.bll.CreateAllSnapshotsFromVmCommand]
>> (org.ovirt.thread.pool-6-thread-24) Wasnt able to live snapshot due to error:
>> VdcBLLException: VdcBLLException:
>> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException:
>> VDSErrorException: Failed to SnapshotVDS, error = Snapshot failed, code = 48
>> (Failed with error SNAPSHOT_FAILED and code 48).
>> VM will still be configured to the new created snapshot
>>
>> 2014-04-04 12:30:03,097 INFO  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
>> (org.ovirt.thread.pool-6-thread-24) Correlation ID: 5650b99f, Job ID:
>> c1b2d861-2a52-49f1-9eaa-1b63aa8b4fba, Call Stack:
>> org.ovirt.engine.core.common.errors.VdcBLLException: VdcBLLException:
>> org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException:
>> VDSErrorException: Failed to SnapshotVDS, error = Snapshot failed, code = 48
>> (Failed with error SNAPSHOT_FAILED and code 48)
>>
>>
>>
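(A hedged aside on reading the two sides together, assuming the default log locations: the Correlation ID from engine.log and the vmId can be used to line up the engine failure with the matching vdsm call:)

# on the engine VM
grep 5650b99f /var/log/ovirt-engine/engine.log
# on host01, around the same timestamp
grep 'cb038ccf-6c6f-475c-872f-ea812ff795a1' /var/log/vdsm/vdsm.log | grep -i snapshot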
>> My /var/log/messages
>>
>> Apr 4 12:30:04 host01 vdsm vm.Vm ERROR vmId=`cb038ccf-6c6f-475c-872f-ea812ff795a1`::
>> The base volume doesn't exist: {'device': 'disk',
>> 'domainID': '5ae613a4-44e4-42cb-89fc-7b5d34c1f30f',
>> 'volumeID': '3b6cbb5d-beed-428d-ac66-9db3dd002e2f',
>> 'imageID': '646df162-5c6d-44b1-bc47-b63c3fdab0e2'}
>>
>>
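(One hedged way to check whether that base volume really exists on the storage domain, assuming the vdsm CLI is installed on the host; <spUUID> stands for the data-center/pool UUID, which isn't shown above:)

vdsClient -s 0 getVolumeInfo 5ae613a4-44e4-42cb-89fc-7b5d34c1f30f <spUUID> \
    646df162-5c6d-44b1-bc47-b63c3fdab0e2 3b6cbb5d-beed-428d-ac66-9db3dd002e2f
# an error reply here (rather than volume details) would suggest the volume
# really is missing on storage, not just stale in the engine database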
>> My /var/log/libvirt/libvirt.log
>>
>> 2014-04-04 10:40:13.886+0000: 8234: debug : qemuMonitorIOWrite:462 :
>> QEMU_MONITOR_IO_WRITE: mon=0x7f77ec0ccce0
>> buf={"execute":"query-blockstats","id":"libvirt-20842"}
>> len=53 ret=53 errno=11
>> 2014-04-04 10:40:13.888+0000: 8234: debug : qemuMonitorIOProcess:354 :
>> QEMU_MONITOR_IO_PROCESS: mon=0x7f77ec0ccce0 buf={"return": [{"device": "drive-ide0-1-0",
>> "parent": {"stats": {"flush_total_time_ns": 0, "wr_highest_offset": 0,
>> "wr_total_time_ns": 0, "wr_bytes": 0, "rd_total_time_ns": 0, "flush_operations": 0,
>> "wr_operations": 0, "rd_bytes": 0, "rd_operations": 0}}, "stats":
>> {"flush_total_time_ns": 0, "wr_highest_offset": 0, "wr_total_time_ns": 0, "wr_bytes": 0,
>> "rd_total_time_ns": 11929902, "flush_operations": 0, "wr_operations": 0,
>> "rd_bytes": 135520, "rd_operations": 46}}, {"device": "drive-virtio-disk0",
>> "parent": {"stats": {"flush_total_time_ns": 0, "wr_highest_offset": 22184332800,
>> "wr_total_time_ns": 0, "wr_bytes": 0, "rd_total_time_ns": 0, "flush_operations": 0,
>> "wr_operations": 0, "rd_bytes": 0, "rd_operations": 0}}, "stats":
>> {"flush_total_time_ns": 34786515034, "wr_highest_offset": 22184332800,
>> "wr_total_time_ns": 5131205369094, "wr_bytes": 5122065408,
>> "rd_total_time_ns": 12987633373, "flush_operations": 285398, "wr_operations": 401232,
>> "rd_bytes": 392342016, "rd_operations": 15069}}], "id": "libvirt-20842"}
>> len=1021
>> 2014-04-04 10:40:13.888+0000: 8263: debug : qemuMonitorGetBlockStatsInfo:1478 :
>> mon=0x7f77ec0ccce0 dev=ide0-1-0
>> 2014-04-04 10:40:13.889+0000: 8263: debug : qemuMonitorSend:904 :
>> QEMU_MONITOR_SEND_MSG: mon=0x7f77ec0ccce0
>> msg={"execute":"query-blockstats","id":"libvirt-20843"}
>>
>> /var/log/vdsm/vdsm.log
>> Thread-4732::DEBUG::2014-04-04 12:43:34,439::BindingXMLRPC::1067::vds::(wrapper)
>> client [192.168.99.104]::call vmSnapshot with
>> ('cb038ccf-6c6f-475c-872f-ea812ff795a1',
>> [{'baseVolumeID': 'b62232fc-4e02-41ce-ae10-5dff9e2f7bbe',
>> 'domainID': '5ae613a4-44e4-42cb-89fc-7b5d34c1f30f',
>> 'volumeID': 'f5fc4fed-4acd-46e8-9980-90a9c3985840',
>> 'imageID': '646df162-5c6d-44b1-bc47-b63c3fdab0e2'}],
>> '5ae613a4-44e4-42cb-89fc-7b5d34c1f30f,00000002-0002-0002-0002-000000000076,4fb31c32-8467-4d4a-b817-977643a462e3,ceb881f3-9a46-4ebc-b82e-c4c91035f807,2c06b4da-2743-4422-ba94-74da2c709188,02804da9-34f8-438f-9e8a-9689bc94790c')
>> {}
>> Thread-4732::ERROR::2014-04-04 12:43:34,440::vm::3910::vm.Vm::(snapshot)
>> vmId=`cb038ccf-6c6f-475c-872f-ea812ff795a1`::The base volume doesn't exist:
>> {'device': 'disk', 'domainID': '5ae613a4-44e4-42cb-89fc-7b5d34c1f30f',
>> 'volumeID': 'b62232fc-4e02-41ce-ae10-5dff9e2f7bbe',
>> 'imageID': '646df162-5c6d-44b1-bc47-b63c3fdab0e2'}
>> Thread-4732::DEBUG::2014-04-04 12:43:34,440::BindingXMLRPC::1074::vds::(wrapper)
>> return vmSnapshot with {'status': {'message': 'Snapshot failed', 'code': 48}}
>> Thread-299::DEBUG::2014-04-04 12:43:35,423::fileSD::225::Storage.Misc.excCmd::(getReadDelay)
>> '/bin/dd iflag=direct if=/rhev/data-center/mnt/host01.ovirt.lan:_home_export/ff98d346-4515-4349-8437-fb2f5e9eaadf/dom_md/metadata bs=4096 count=1' (cwd None)
>>
>>
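(If that data domain is file-based NFS, a hedged way to see which volume files actually exist for the image, with <server:_export_path> standing in for the domain's mount directory under /rhev/data-center/mnt/:)

ls -l /rhev/data-center/mnt/<server:_export_path>/5ae613a4-44e4-42cb-89fc-7b5d34c1f30f/images/646df162-5c6d-44b1-bc47-b63c3fdab0e2/
# the baseVolumeID b62232fc-4e02-41ce-ae10-5dff9e2f7bbe from the call above should be listed
# here alongside its .meta and .lease files; if it isn't, vdsm's complaint is accurate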
>> Thx;)
>>
>> _______________________________________________
>> Users mailing list
>> Users(a)ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/users
>>
>> _______________________________________________
>> Users mailing list
>> Users(a)ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/users
>>
>>
>>
>> --
>> Dafna Ron
>>
>>
>> _______________________________________________
>> Users mailing list
>> Users(a)ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/users
>>
>>
>>
>>
>> _______________________________________________
>> Users mailing list
>> Users(a)ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/users
>>
>>
>
> --
> Cheers
> Douglas
>
> _______________________________________________
> Users mailing list
> Users(a)ovirt.org
> http://lists.ovirt.org/mailman/listinfo/users
>