[Users] Help: Fatal error during migration

Hello all,

I have an oVirt engine with two nodes:

172.30.0.229 ovirt-engine.local
172.30.0.231 ovirt-node-1.local
172.30.0.232 ovirt-node-2.local

When I use live migration to move a VM from node2 to node1, it fails with error code 12 and the error message 'Fatal error during migration'. I don't know why; could anybody help me? Thanks.

Detailed log below (tail -f /var/log/ovirt-engine/engine.log):

2012-04-25 22:59:52,744 INFO [org.ovirt.engine.core.bll.VdsSelector] (http--0.0.0.0-8080-16) Checking for a specific VDS only - id:ae567034-5d8e-11e1-bdc9-a7168ad4d39f, name:ovirt-node-1.local, host_name(ip):172.30.0.231
2012-04-25 22:59:52,745 INFO [org.ovirt.engine.core.bll.MigrateVmToServerCommand] (pool-5-thread-46) Running command: MigrateVmToServerCommand internal: false. Entities affected : ID: f73f17a8-a418-4318-af0e-2cea18ab597a Type: VM
2012-04-25 22:59:52,748 INFO [org.ovirt.engine.core.bll.VdsSelector] (pool-5-thread-46) Checking for a specific VDS only - id:ae567034-5d8e-11e1-bdc9-a7168ad4d39f, name:ovirt-node-1.local, host_name(ip):172.30.0.231
2012-04-25 22:59:52,754 INFO [org.ovirt.engine.core.vdsbroker.MigrateVDSCommand] (pool-5-thread-46) START, MigrateVDSCommand(vdsId = bcee74d4-5d8e-11e1-a03e-8bc4768fc531, vmId=f73f17a8-a418-4318-af0e-2cea18ab597a, srcHost=172.30.0.232, dstVdsId=ae567034-5d8e-11e1-bdc9-a7168ad4d39f, dstHost=172.30.0.231:54321, migrationMethod=ONLINE), log id: 444b3006
2012-04-25 22:59:52,757 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateBrokerVDSCommand] (pool-5-thread-46) VdsBroker::migrate::Entered (vm_guid=f73f17a8-a418-4318-af0e-2cea18ab597a, srcHost=172.30.0.232, dstHost=172.30.0.231:54321, method=online
2012-04-25 22:59:52,757 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateBrokerVDSCommand] (pool-5-thread-46) START, MigrateBrokerVDSCommand(vdsId = bcee74d4-5d8e-11e1-a03e-8bc4768fc531, vmId=f73f17a8-a418-4318-af0e-2cea18ab597a, srcHost=172.30.0.232, dstVdsId=ae567034-5d8e-11e1-bdc9-a7168ad4d39f, dstHost=172.30.0.231:54321, migrationMethod=ONLINE), log id: 2a26d38e
2012-04-25 22:59:52,778 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateBrokerVDSCommand] (pool-5-thread-46) FINISH, MigrateBrokerVDSCommand, log id: 2a26d38e
2012-04-25 22:59:52,817 INFO [org.ovirt.engine.core.vdsbroker.MigrateVDSCommand] (pool-5-thread-46) FINISH, MigrateVDSCommand, return: MigratingFrom, log id: 444b3006
2012-04-25 22:59:53,281 ERROR [org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo] (QuartzScheduler_Worker-40) Rerun vm f73f17a8-a418-4318-af0e-2cea18ab597a. Called from vds ovirt-node-2.local
2012-04-25 22:59:53,286 INFO [org.ovirt.engine.core.vdsbroker.UpdateVdsDynamicDataVDSCommand] (pool-5-thread-46) START, UpdateVdsDynamicDataVDSCommand(vdsId = ae567034-5d8e-11e1-bdc9-a7168ad4d39f, vdsDynamic=org.ovirt.engine.core.common.businessentities.VdsDynamic@34c39cde), log id: 4c2ed94e
2012-04-25 22:59:53,299 INFO [org.ovirt.engine.core.vdsbroker.UpdateVdsDynamicDataVDSCommand] (pool-5-thread-46) FINISH, UpdateVdsDynamicDataVDSCommand, log id: 4c2ed94e
2012-04-25 22:59:53,304 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateStatusVDSCommand] (pool-5-thread-46) START, MigrateStatusVDSCommand(vdsId = bcee74d4-5d8e-11e1-a03e-8bc4768fc531, vmId=f73f17a8-a418-4318-af0e-2cea18ab597a), log id: 69d2528
2012-04-25 22:59:53,320 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase] (pool-5-thread-46) Failed in MigrateStatusVDS method
2012-04-25 22:59:53,320 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase] (pool-5-thread-46) Error code migrateErr and error message VDSGenericException: VDSErrorException: Failed to MigrateStatusVDS, error = Fatal error during migration
2012-04-25 22:59:53,321 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase] (pool-5-thread-46) Command org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateStatusVDSCommand return value Class Name: org.ovirt.engine.core.vdsbroker.vdsbroker.StatusOnlyReturnForXmlRpc mStatus Class Name: org.ovirt.engine.core.vdsbroker.vdsbroker.StatusForXmlRpc mCode 12 mMessage Fatal error during migration
2012-04-25 22:59:53,321 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase] (pool-5-thread-46) Vds: ovirt-node-2.local
2012-04-25 22:59:53,321 ERROR [org.ovirt.engine.core.vdsbroker.VDSCommandBase] (pool-5-thread-46) Command MigrateStatusVDS execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to MigrateStatusVDS, error = Fatal error during migration
2012-04-25 22:59:53,321 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.MigrateStatusVDSCommand] (pool-5-thread-46) FINISH, MigrateStatusVDSCommand, log id: 69d2528
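In case it helps anyone scanning a large engine.log for this failure: the code/message pair sits in the "mCode ... mMessage ..." fragment that MigrateStatusVDSCommand prints, so it can be pulled out with a few lines of Python. This is only a helper sketch based on the log line above; the regex is my own, not part of oVirt.

```python
import re

# Fragment copied from the engine.log excerpt above.
line = ("Class Name: org.ovirt.engine.core.vdsbroker.vdsbroker.StatusForXmlRpc "
        "mCode 12 mMessage Fatal error during migration")

# Pull out the numeric VDS error code and its message text.
match = re.search(r"mCode (\d+) mMessage (.+)", line)
code, message = int(match.group(1)), match.group(2)
print(code, message)  # 12 Fatal error during migration
```

Running it over `grep mCode engine.log` output gives a quick list of every VDS failure code the engine has seen.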

--Apple-Mail=_440FAF08-BE2F-4F2B-933A-11F683AFC866 Content-Transfer-Encoding: quoted-printable Content-Type: text/plain; charset=us-ascii thanks Heim node1 is target node, node2 is source node , there're too many logs , i = put the relative logs to the bellow files=20 --Apple-Mail=_440FAF08-BE2F-4F2B-933A-11F683AFC866 Content-Disposition: attachment; filename=node1-vdsm-log.txt Content-Type: text/plain; x-mac-hide-extension=yes; x-unix-mode=0644; name="node1-vdsm-log.txt" Content-Transfer-Encoding: quoted-printable Thread-110812::DEBUG::2012-04-26 = 10:14:11,538::resourceManager::562::ResourceManager::(releaseResource) = No one is waiting for resource = 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef', Clearing records. Thread-110812::DEBUG::2012-04-26 = 10:14:11,538::task::980::TaskManager.Task::(_decref) = Task=3D`2df4d0ae-084e-4d13-8098-6649ac5c7132`::ref 0 aborting False Thread-110816::DEBUG::2012-04-26 = 10:14:17,886::task::588::TaskManager.Task::(_updateState) = Task=3D`126cc9ca-241d-43a3-ac66-7f0d41c98352`::moving from state init -> = state preparing Thread-110816::INFO::2012-04-26 = 10:14:17,887::logUtils::37::dispatcher::(wrapper) Run and protect: = repoStats(options=3DNone) Thread-110816::INFO::2012-04-26 = 10:14:17,888::logUtils::39::dispatcher::(wrapper) Run and protect: = repoStats, Return response: {'fce72110-71ae-46be-8811-2fbeaa001ed2': = {'delay': '0.00180697441101', 'lastCheck': 1335485552.219619, 'code': 0, = 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': = '0.00167393684387', 'lastCheck': 1335485555.818574, 'code': 0, 'valid': = True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': = '0.00185298919678', 'lastCheck': 1335485555.817463, 'code': 0, 'valid': = True}} Thread-110816::DEBUG::2012-04-26 = 10:14:17,889::task::1174::TaskManager.Task::(prepare) = Task=3D`126cc9ca-241d-43a3-ac66-7f0d41c98352`::finished: = {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00180697441101', = 'lastCheck': 1335485552.219619, 'code': 0, 
'valid': True}, = '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00167393684387', = 'lastCheck': 1335485555.818574, 'code': 0, 'valid': True}, = '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00185298919678', = 'lastCheck': 1335485555.817463, 'code': 0, 'valid': True}} Thread-110816::DEBUG::2012-04-26 = 10:14:17,889::task::588::TaskManager.Task::(_updateState) = Task=3D`126cc9ca-241d-43a3-ac66-7f0d41c98352`::moving from state = preparing -> state finished Thread-110816::DEBUG::2012-04-26 = 10:14:17,890::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources {} Thread-110816::DEBUG::2012-04-26 = 10:14:17,891::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-110816::DEBUG::2012-04-26 = 10:14:17,891::task::980::TaskManager.Task::(_decref) = Task=3D`126cc9ca-241d-43a3-ac66-7f0d41c98352`::ref 0 aborting False Thread-110819::DEBUG::2012-04-26 = 10:14:21,632::clientIF::261::Storage.Dispatcher.Protect::(wrapper) = [172.30.0.229] Thread-110819::DEBUG::2012-04-26 = 10:14:21,634::task::588::TaskManager.Task::(_updateState) = Task=3D`9023ebe8-4202-4119-bb8d-5ad9b1584652`::moving from state init -> = state preparing Thread-110819::INFO::2012-04-26 = 10:14:21,635::logUtils::37::dispatcher::(wrapper) Run and protect: = getSpmStatus(spUUID=3D'524a7003-edec-4f52-a38e-b15cadfbe3ef', = options=3DNone) Thread-110819::INFO::2012-04-26 = 10:14:21,636::logUtils::39::dispatcher::(wrapper) Run and protect: = getSpmStatus, Return response: {'spm_st': {'spmId': 1, 'spmStatus': = 'SPM', 'spmLver': 10}} Thread-110819::DEBUG::2012-04-26 = 10:14:21,636::task::1174::TaskManager.Task::(prepare) = Task=3D`9023ebe8-4202-4119-bb8d-5ad9b1584652`::finished: {'spm_st': = {'spmId': 1, 'spmStatus': 'SPM', 'spmLver': 10}} Thread-110819::DEBUG::2012-04-26 = 10:14:21,637::task::588::TaskManager.Task::(_updateState) = Task=3D`9023ebe8-4202-4119-bb8d-5ad9b1584652`::moving from state = preparing -> state 
finished Thread-110819::DEBUG::2012-04-26 = 10:14:21,637::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources {} Thread-110819::DEBUG::2012-04-26 = 10:14:21,638::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-110819::DEBUG::2012-04-26 = 10:14:21,639::task::980::TaskManager.Task::(_decref) = Task=3D`9023ebe8-4202-4119-bb8d-5ad9b1584652`::ref 0 aborting False Thread-110820::DEBUG::2012-04-26 = 10:14:21,655::clientIF::261::Storage.Dispatcher.Protect::(wrapper) = [172.30.0.229] Thread-110820::DEBUG::2012-04-26 = 10:14:21,656::task::588::TaskManager.Task::(_updateState) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::moving from state init -> = state preparing Thread-110820::INFO::2012-04-26 = 10:14:21,657::logUtils::37::dispatcher::(wrapper) Run and protect: = getStoragePoolInfo(spUUID=3D'524a7003-edec-4f52-a38e-b15cadfbe3ef', = options=3DNone) Thread-110820::DEBUG::2012-04-26 = 10:14:21,658::resourceManager::175::ResourceManager.Request::(__init__) = ResName=3D`Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef`ReqID=3D`239079f4-= 2f57-4ca6-8a4f-f0944666cfed`::Request was made in = '/usr/share/vdsm/storage/hsm.py' line '1878' at 'getStoragePoolInfo' Thread-110820::DEBUG::2012-04-26 = 10:14:21,659::resourceManager::483::ResourceManager::(registerResource) = Trying to register resource = 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef' for lock type 'shared' Thread-110820::DEBUG::2012-04-26 = 10:14:21,660::resourceManager::525::ResourceManager::(registerResource) = Resource 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef' is free. 
Now = locking as 'shared' (1 active user) Thread-110820::DEBUG::2012-04-26 = 10:14:21,661::resourceManager::212::ResourceManager.Request::(grant) = ResName=3D`Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef`ReqID=3D`239079f4-= 2f57-4ca6-8a4f-f0944666cfed`::Granted request Thread-110820::DEBUG::2012-04-26 = 10:14:21,662::task::819::TaskManager.Task::(resourceAcquired) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::_resourcesAcquired: = Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef (shared) Thread-110820::DEBUG::2012-04-26 = 10:14:21,662::task::980::TaskManager.Task::(_decref) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::ref 1 aborting False Thread-110820::INFO::2012-04-26 = 10:14:21,666::logUtils::39::dispatcher::(wrapper) Run and protect: = getStoragePoolInfo, Return response: {'info': {'spm_id': 1, = 'master_uuid': '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'name': 'BLC', = 'version': '0', 'domains': = 'fce72110-71ae-46be-8811-2fbeaa001ed2:Active,5e2ac537-6a73-4faf-8379-68f3f= f26a75d:Active,6d931187-2c83-4c33-8693-ca2570f7c136:Active', = 'pool_status': 'connected', 'isoprefix': = '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/fce72110-71ae-46be= -8811-2fbeaa001ed2/images/11111111-1111-1111-1111-111111111111', 'type': = 'NFS', 'master_ver': 1, 'lver': 10}, 'dominfo': = {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'status': 'Active', = 'diskfree': '212638531584', 'alerts': [], 'disktotal': '1733743181824'}, = '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'status': 'Active', 'diskfree': = '212638531584', 'alerts': [], 'disktotal': '1733743181824'}, = '6d931187-2c83-4c33-8693-ca2570f7c136': {'status': 'Active', 'diskfree': = '212638531584', 'alerts': [], 'disktotal': '1733743181824'}}} Thread-110820::DEBUG::2012-04-26 = 10:14:21,666::task::1174::TaskManager.Task::(prepare) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::finished: {'info': = {'spm_id': 1, 'master_uuid': '5e2ac537-6a73-4faf-8379-68f3ff26a75d', = 'name': 'BLC', 'version': '0', 'domains': = 
'fce72110-71ae-46be-8811-2fbeaa001ed2:Active,5e2ac537-6a73-4faf-8379-68f3f= f26a75d:Active,6d931187-2c83-4c33-8693-ca2570f7c136:Active', = 'pool_status': 'connected', 'isoprefix': = '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/fce72110-71ae-46be= -8811-2fbeaa001ed2/images/11111111-1111-1111-1111-111111111111', 'type': = 'NFS', 'master_ver': 1, 'lver': 10}, 'dominfo': = {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'status': 'Active', = 'diskfree': '212638531584', 'alerts': [], 'disktotal': '1733743181824'}, = '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'status': 'Active', 'diskfree': = '212638531584', 'alerts': [], 'disktotal': '1733743181824'}, = '6d931187-2c83-4c33-8693-ca2570f7c136': {'status': 'Active', 'diskfree': = '212638531584', 'alerts': [], 'disktotal': '1733743181824'}}} Thread-110820::DEBUG::2012-04-26 = 10:14:21,667::task::588::TaskManager.Task::(_updateState) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::moving from state = preparing -> state finished Thread-110820::DEBUG::2012-04-26 = 10:14:21,667::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources = {'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef': < ResourceRef = 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef', isValid: 'True' obj: = 'None'>} Thread-110820::DEBUG::2012-04-26 = 10:14:21,667::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-110820::DEBUG::2012-04-26 = 10:14:21,667::resourceManager::535::ResourceManager::(releaseResource) = Trying to release resource = 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef' Thread-110820::DEBUG::2012-04-26 = 10:14:21,668::resourceManager::550::ResourceManager::(releaseResource) = Released resource 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef' (0 = active users) Thread-110820::DEBUG::2012-04-26 = 10:14:21,668::resourceManager::555::ResourceManager::(releaseResource) = Resource 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef' is free, finding = out if anyone is 
waiting for it. Thread-110820::DEBUG::2012-04-26 = 10:14:21,668::resourceManager::562::ResourceManager::(releaseResource) = No one is waiting for resource = 'Storage.524a7003-edec-4f52-a38e-b15cadfbe3ef', Clearing records. Thread-110820::DEBUG::2012-04-26 = 10:14:21,668::task::980::TaskManager.Task::(_decref) = Task=3D`1601d4ff-ab77-424a-a677-6df3aa703c4d`::ref 0 aborting False Thread-110822::DEBUG::2012-04-26 = 10:14:23,457::clientIF::76::vds::(wrapper) [172.30.0.232]::call = getVmStats with ('f73f17a8-a418-4318-af0e-2cea18ab597a',) {} Thread-110822::DEBUG::2012-04-26 = 10:14:23,458::clientIF::81::vds::(wrapper) return getVmStats with = {'status': {'message': 'Virtual machine does not exist', 'code': 1}} Thread-110823::DEBUG::2012-04-26 = 10:14:23,518::clientIF::76::vds::(wrapper) [172.30.0.232]::call = migrationCreate with ({'bridge': 'ovirtmgmt', 'acpiEnable': 'true', = 'emulatedMachine': 'pc-0.14', 'afterMigrationStatus': 'Up', 'vmId': = 'f73f17a8-a418-4318-af0e-2cea18ab597a', 'transparentHugePages': 'true', = 'displaySecurePort': '5902', 'spiceSslCipherSuite': 'DEFAULT', = 'cpuType': 'Opteron_G3', 'smp': '1', 'pauseCode': 'NOERR', = 'migrationDest': 'libvirt', 'macAddr': '00:1a:4a:1e:00:00', 'boot': 'c', = 'custom': {}, 'vmType': 'kvm', '_srcDomXML': "<domain type=3D'kvm' = id=3D'20'>\n <name>test1</name>\n = <uuid>f73f17a8-a418-4318-af0e-2cea18ab597a</uuid>\n = <memory>1048576</memory>\n <currentMemory>1048576</currentMemory>\n = <vcpu>1</vcpu>\n <cputune>\n <shares>1020</shares>\n = <period>100000</period>\n <quota>-1</quota>\n </cputune>\n <sysinfo = type=3D'smbios'>\n <system>\n <entry name=3D'manufacturer'>Red = Hat</entry>\n <entry name=3D'product'>RHEV Hypervisor</entry>\n = <entry name=3D'version'>16-1</entry>\n <entry = name=3D'serial'>44454C4C-5900-105A-804B-B7C04F563258_00:1D:09:FD:8B:80</en= try>\n <entry = name=3D'uuid'>f73f17a8-a418-4318-af0e-2cea18ab597a</entry>\n = </system>\n </sysinfo>\n <os>\n <type arch=3D'x86_64' = 
machine=3D'pc-0.14'>hvm</type>\n <boot dev=3D'hd'/>\n <smbios = mode=3D'sysinfo'/>\n </os>\n <features>\n <acpi/>\n </features>\n = <cpu match=3D'exact'>\n <model>Opteron_G3</model>\n <topology = sockets=3D'1' cores=3D'1' threads=3D'1'/>\n </cpu>\n <clock = offset=3D'variable' adjustment=3D'0'>\n <timer name=3D'rtc' = tickpolicy=3D'catchup'/>\n </clock>\n = <on_poweroff>destroy</on_poweroff>\n <on_reboot>restart</on_reboot>\n = <on_crash>destroy</on_crash>\n <devices>\n = <emulator>/usr/bin/qemu-kvm</emulator>\n <disk type=3D'file' = device=3D'disk'>\n <driver name=3D'qemu' type=3D'raw' cache=3D'none' = error_policy=3D'stop' io=3D'threads'/>\n <source = file=3D'/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a= 73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e8= 4476-26ae-4ac2-8ce7-4fdb3e3225b0'/>\n <target dev=3D'vda' = bus=3D'virtio'/>\n = <serial>7338426e-3e31-4ffd-ac37-14c22feea80a</serial>\n <alias = name=3D'virtio-disk0'/>\n <address type=3D'pci' domain=3D'0x0000' = bus=3D'0x00' slot=3D'0x05' function=3D'0x0'/>\n </disk>\n <disk = type=3D'file' device=3D'cdrom'>\n <driver name=3D'qemu' = type=3D'raw'/>\n <target dev=3D'hdc' bus=3D'ide'/>\n = <readonly/>\n <alias name=3D'ide0-1-0'/>\n <address = type=3D'drive' controller=3D'0' bus=3D'1' unit=3D'0'/>\n </disk>\n = <controller type=3D'virtio-serial' index=3D'0' ports=3D'16'>\n = <alias name=3D'virtio-serial0'/>\n <address type=3D'pci' = domain=3D'0x0000' bus=3D'0x00' slot=3D'0x04' function=3D'0x0'/>\n = </controller>\n <controller type=3D'ide' index=3D'0'>\n <alias = name=3D'ide0'/>\n <address type=3D'pci' domain=3D'0x0000' = bus=3D'0x00' slot=3D'0x01' function=3D'0x1'/>\n </controller>\n = <interface type=3D'bridge'>\n <mac address=3D'00:1a:4a:1e:00:00'/>\n = <source bridge=3D'ovirtmgmt'/>\n <target dev=3D'vnet1'/>\n = <model type=3D'virtio'/>\n <alias name=3D'net0'/>\n <address = type=3D'pci' domain=3D'0x0000' bus=3D'0x00' slot=3D'0x03' = function=3D'0x0'/>\n 
</interface>\n <channel type=3D'unix'>\n = <source mode=3D'bind' = path=3D'/var/lib/libvirt/qemu/channels/test1.com.redhat.rhevm.vdsm'/>\n = <target type=3D'virtio' name=3D'com.redhat.rhevm.vdsm'/>\n = <alias name=3D'channel0'/>\n <address type=3D'virtio-serial' = controller=3D'0' bus=3D'0' port=3D'1'/>\n </channel>\n <channel = type=3D'spicevmc'>\n <target type=3D'virtio' = name=3D'com.redhat.spice.0'/>\n <alias name=3D'channel1'/>\n = <address type=3D'virtio-serial' controller=3D'0' bus=3D'0' port=3D'2'/>\n = </channel>\n <input type=3D'mouse' bus=3D'ps2'/>\n <graphics = type=3D'spice' port=3D'5901' tlsPort=3D'5902' autoport=3D'yes' = listen=3D'0' keymap=3D'en-us' passwdValidTo=3D'2012-04-25T07:18:06' = connected=3D'disconnect'>\n <listen type=3D'address' = address=3D'0'/>\n <channel name=3D'main' mode=3D'secure'/>\n = <channel name=3D'inputs' mode=3D'secure'/>\n </graphics>\n = <video>\n <model type=3D'qxl' vram=3D'65536' heads=3D'1'/>\n = <alias name=3D'video0'/>\n <address type=3D'pci' domain=3D'0x0000' = bus=3D'0x00' slot=3D'0x02' function=3D'0x0'/>\n </video>\n = <memballoon model=3D'none'>\n <alias name=3D'balloon0'/>\n = </memballoon>\n </devices>\n <seclabel type=3D'dynamic' = model=3D'selinux' relabel=3D'yes'>\n = <label>system_u:system_r:svirt_t:s0:c39,c972</label>\n = <imagelabel>system_u:object_r:svirt_image_t:s0:c39,c972</imagelabel>\n = </seclabel>\n</domain>\n", 'memSize': 1024, 'smpCoresPerSocket': '1', = 'vmName': 'test1', 'spiceMonitors': '1', 'nice': '0', 'status': 'Up', = 'clientIp': '', 'displayIp': '0', 'drives': [{'index': '0', 'domainID': = '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'reqsize': '0', 'format': 'raw', = 'boot': 'true', 'volumeID': '27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', = 'needExtend': False, 'imageID': '7338426e-3e31-4ffd-ac37-14c22feea80a', = 'blockDev': False, 'iface': 'virtio', 'propagateErrors': 'off', = 'poolID': '524a7003-edec-4f52-a38e-b15cadfbe3ef', 'apparentsize': = '21474836480', 'path': = 
'/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf= -8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26= ae-4ac2-8ce7-4fdb3e3225b0', 'truesize': '21495832576', 'type': 'disk', = 'if': 'virtio'}, {'index': 2, 'iface': 'ide', 'format': 'raw', = 'blockDev': False, 'truesize': 0, 'device': 'cdrom', 'path': '', = 'propagateErrors': 'off', 'type': 'disk'}], 'keyboardLayout': 'en-us', = 'displayPort': '5901', 'spiceSecureChannels': 'smain,sinputs', = 'nicModel': 'pv', 'elapsedTimeOffset': 98902.67715907097, 'kvmEnable': = 'true', 'pitReinjection': 'false', 'devices': [{'device': 'bridge', = 'nicModel': 'virtio', 'macAddr': '00:1a:4a:1e:00:00', 'type': = 'interface', 'network': 'ovirtmgmt'}, {'index': '0', 'domainID': = '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'reqsize': '0', 'format': 'raw', = 'boot': 'true', 'volumeID': '27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', = 'needExtend': False, 'imageID': '7338426e-3e31-4ffd-ac37-14c22feea80a', = 'blockDev': False, 'iface': 'virtio', 'propagateErrors': 'off', = 'poolID': '524a7003-edec-4f52-a38e-b15cadfbe3ef', 'apparentsize': = '21474836480', 'path': = '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf= -8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26= ae-4ac2-8ce7-4fdb3e3225b0', 'truesize': '21495832576', 'type': 'disk', = 'if': 'virtio'}, {'index': 2, 'iface': 'ide', 'format': 'raw', = 'blockDev': False, 'truesize': 0, 'device': 'cdrom', 'path': '', = 'propagateErrors': 'off', 'type': 'disk'}, {'device': 'qxl', = 'specParams': {'vram': '65536'}, 'type': 'video'}], 'username': = 'Unknown', 'timeOffset': '0', 'guestIPs': '', 'display': 'qxl'},) {} Thread-110823::DEBUG::2012-04-26 = 10:14:23,520::clientIF::829::vds::(migrationCreate) Migration create Thread-110823::INFO::2012-04-26 = 10:14:23,544::clientIF::549::vds::(_getNetworkIp) network None: using 0 Thread-110823::INFO::2012-04-26 = 10:14:23,545::clientIF::712::vds::(create) 
vmContainerLock acquired by = vm f73f17a8-a418-4318-af0e-2cea18ab597a Thread-110824::DEBUG::2012-04-26 = 10:14:23,554::vm::512::vm.Vm::(_startUnderlyingVm) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::Start Thread-110823::DEBUG::2012-04-26 = 10:14:23,555::clientIF::725::vds::(create) Total desktops after creation = of f73f17a8-a418-4318-af0e-2cea18ab597a is 1 Thread-110824::DEBUG::2012-04-26 = 10:14:23,556::vm::516::vm.Vm::(_startUnderlyingVm) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::_ongoingCreations = acquired Thread-110823::DEBUG::2012-04-26 = 10:14:23,556::libvirtvm::1504::vm.Vm::(waitForMigrationDestinationPrepare)= vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::migration destination: = waiting 31s for path preparation Thread-110824::INFO::2012-04-26 = 10:14:23,557::libvirtvm::1105::vm.Vm::(_run) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::VM wrapper has started Thread-110824::DEBUG::2012-04-26 = 10:14:23,559::task::588::TaskManager.Task::(_updateState) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::moving from state init -> = state preparing Thread-110824::INFO::2012-04-26 = 10:14:23,559::logUtils::37::dispatcher::(wrapper) Run and protect: = getVolumeSize(sdUUID=3D'5e2ac537-6a73-4faf-8379-68f3ff26a75d', = spUUID=3D'524a7003-edec-4f52-a38e-b15cadfbe3ef', = imgUUID=3D'7338426e-3e31-4ffd-ac37-14c22feea80a', = volUUID=3D'27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', options=3DNone) Thread-110824::DEBUG::2012-04-26 = 10:14:23,561::resourceManager::175::ResourceManager.Request::(__init__) = ResName=3D`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=3D`845d4f67-= 0bf3-4981-ac88-c7684017cc14`::Request was made in = '/usr/share/vdsm/storage/hsm.py' line '2352' at 'getVolumeSize' Thread-110824::DEBUG::2012-04-26 = 10:14:23,562::resourceManager::483::ResourceManager::(registerResource) = Trying to register resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' for lock type 'shared' Thread-110824::DEBUG::2012-04-26 = 
10:14:23,563::resourceManager::525::ResourceManager::(registerResource) = Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free. Now = locking as 'shared' (1 active user) Thread-110824::DEBUG::2012-04-26 = 10:14:23,563::resourceManager::212::ResourceManager.Request::(grant) = ResName=3D`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=3D`845d4f67-= 0bf3-4981-ac88-c7684017cc14`::Granted request Thread-110824::DEBUG::2012-04-26 = 10:14:23,564::task::819::TaskManager.Task::(resourceAcquired) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::_resourcesAcquired: = Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d (shared) Thread-110824::DEBUG::2012-04-26 = 10:14:23,564::task::980::TaskManager.Task::(_decref) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::ref 1 aborting False Thread-110824::DEBUG::2012-04-26 = 10:14:23,567::fileVolume::537::Storage.Volume::(validateVolumePath) = validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0 Thread-110824::DEBUG::2012-04-26 = 10:14:23,572::fileVolume::537::Storage.Volume::(validateVolumePath) = validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0 Thread-110824::INFO::2012-04-26 = 10:14:23,575::logUtils::39::dispatcher::(wrapper) Run and protect: = getVolumeSize, Return response: {'truesize': '21495832576', = 'apparentsize': '21474836480'} Thread-110824::DEBUG::2012-04-26 = 10:14:23,575::task::1174::TaskManager.Task::(prepare) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::finished: {'truesize': = '21495832576', 'apparentsize': '21474836480'} Thread-110824::DEBUG::2012-04-26 = 10:14:23,576::task::588::TaskManager.Task::(_updateState) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::moving from state = preparing -> state finished Thread-110824::DEBUG::2012-04-26 = 10:14:23,576::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources = {'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d': < ResourceRef = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', isValid: 'True' obj: = 'None'>} 
Thread-110824::DEBUG::2012-04-26 = 10:14:23,577::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-110824::DEBUG::2012-04-26 = 10:14:23,578::resourceManager::535::ResourceManager::(releaseResource) = Trying to release resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' Thread-110824::DEBUG::2012-04-26 = 10:14:23,579::resourceManager::550::ResourceManager::(releaseResource) = Released resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' (0 = active users) Thread-110824::DEBUG::2012-04-26 = 10:14:23,579::resourceManager::555::ResourceManager::(releaseResource) = Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free, finding = out if anyone is waiting for it. Thread-110824::DEBUG::2012-04-26 = 10:14:23,580::resourceManager::562::ResourceManager::(releaseResource) = No one is waiting for resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', Clearing records. Thread-110824::DEBUG::2012-04-26 = 10:14:23,581::task::980::TaskManager.Task::(_decref) = Task=3D`bd60c4e9-877c-4400-9015-c9a6d6218676`::ref 0 aborting False Thread-110824::DEBUG::2012-04-26 = 10:14:23,582::task::588::TaskManager.Task::(_updateState) = Task=3D`860493eb-861b-4ae1-905c-ec0af1d554b6`::moving from state init -> = state preparing Thread-110824::INFO::2012-04-26 = 10:14:23,583::logUtils::37::dispatcher::(wrapper) Run and protect: = getStorageDomainInfo(sdUUID=3D'5e2ac537-6a73-4faf-8379-68f3ff26a75d', = options=3DNone) Thread-110824::INFO::2012-04-26 = 10:14:23,583::fileSD::195::Storage.StorageDomain::(validate) = sdUUID=3D5e2ac537-6a73-4faf-8379-68f3ff26a75d Thread-110824::DEBUG::2012-04-26 = 10:14:23,586::persistentDict::216::Storage.PersistentDict::(refresh) = read lines (FileMetadataRW)=3D['CLASS=3DData', 'DESCRIPTION=3DVMDomain', = 'IOOPTIMEOUTSEC=3D10', 'LEASERETRIES=3D3', 'LEASETIMESEC=3D60', = 'LOCKPOLICY=3D', 'LOCKRENEWALINTERVALSEC=3D5', 'MASTER_VERSION=3D1', = 'POOL_DESCRIPTION=3DBLC', = 
'POOL_DOMAINS=3Dfce72110-71ae-46be-8811-2fbeaa001ed2:Active,5e2ac537-6a73-= 4faf-8379-68f3ff26a75d:Active,6d931187-2c83-4c33-8693-ca2570f7c136:Active'= , 'POOL_SPM_ID=3D1', 'POOL_SPM_LVER=3D10', = 'POOL_UUID=3D524a7003-edec-4f52-a38e-b15cadfbe3ef', = 'REMOTE_PATH=3D192.168.8.28:/export/data/Ovirt-Domain/vmdomain', = 'ROLE=3DMaster', 'SDUUID=3D5e2ac537-6a73-4faf-8379-68f3ff26a75d', = 'TYPE=3DNFS', 'VERSION=3D0', = '_SHA_CKSUM=3D6cea80b04ba091c9ed94475268c7551289bd6394'] Thread-110824::DEBUG::2012-04-26 = 10:14:23,588::resourceManager::175::ResourceManager.Request::(__init__) = ResName=3D`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=3D`9e4cab87-= ac0b-4d5b-a0e0-4016f0d805d7`::Request was made in = '/usr/share/vdsm/storage/hsm.py' line '2032' at 'getStorageDomainInfo' Thread-110824::DEBUG::2012-04-26 = 10:14:23,589::resourceManager::483::ResourceManager::(registerResource) = Trying to register resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' for lock type 'shared' Thread-110824::DEBUG::2012-04-26 = 10:14:23,590::resourceManager::525::ResourceManager::(registerResource) = Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free. 
Now = locking as 'shared' (1 active user) Thread-110824::DEBUG::2012-04-26 = 10:14:23,591::resourceManager::212::ResourceManager.Request::(grant) = ResName=3D`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=3D`9e4cab87-= ac0b-4d5b-a0e0-4016f0d805d7`::Granted request Thread-110824::DEBUG::2012-04-26 = 10:14:23,592::task::819::TaskManager.Task::(resourceAcquired) = Task=3D`860493eb-861b-4ae1-905c-ec0af1d554b6`::_resourcesAcquired: = Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d (shared) Thread-110824::DEBUG::2012-04-26 = 10:14:23,592::task::980::TaskManager.Task::(_decref) = Task=3D`860493eb-861b-4ae1-905c-ec0af1d554b6`::ref 1 aborting False Thread-110824::INFO::2012-04-26 = 10:14:23,605::logUtils::39::dispatcher::(wrapper) Run and protect: = getStorageDomainInfo, Return response: {'info': {'uuid': = '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'master_ver': 1, 'lver': 10, = 'version': '0', 'role': 'Master', 'remotePath': = '192.168.8.28:/export/data/Ovirt-Domain/vmdomain', 'spm_id': 1, 'type': = 'NFS', 'class': 'Data', 'pool': = ['524a7003-edec-4f52-a38e-b15cadfbe3ef'], 'name': 'VMDomain'}} Thread-110824::DEBUG::2012-04-26 = 10:14:23,606::task::1174::TaskManager.Task::(prepare) = Task=3D`860493eb-861b-4ae1-905c-ec0af1d554b6`::finished: {'info': = {'uuid': '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'master_ver': 1, = 'lver': 10, 'version': '0', 'role': 'Master', 'remotePath': = '192.168.8.28:/export/data/Ovirt-Domain/vmdomain', 'spm_id': 1, 'type': = 'NFS', 'class': 'Data', 'pool': = ['524a7003-edec-4f52-a38e-b15cadfbe3ef'], 'name': 'VMDomain'}} Thread-110824::DEBUG::2012-04-26 = 10:14:23,606::task::588::TaskManager.Task::(_updateState) = Task=3D`860493eb-861b-4ae1-905c-ec0af1d554b6`::moving from state = preparing -> state finished Thread-110824::DEBUG::2012-04-26 = 10:14:23,606::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources = {'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d': < ResourceRef = 
'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', isValid: 'True' obj: = 'None'>} Thread-110824::DEBUG::2012-04-26 = 10:14:23,606::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-110824::DEBUG::2012-04-26 = 10:14:23,606::resourceManager::535::ResourceManager::(releaseResource) = Trying to release resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' Thread-110824::DEBUG::2012-04-26 = 10:14:23,607::resourceManager::550::ResourceManager::(releaseResource) = Released resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' (0 = active users) Thread-110824::DEBUG::2012-04-26 = 10:14:23,607::resourceManager::555::ResourceManager::(releaseResource) = Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free, finding = out if anyone is waiting for it. Thread-110824::DEBUG::2012-04-26 = 10:14:23,607::resourceManager::562::ResourceManager::(releaseResource) = No one is waiting for resource = 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', Clearing records. 
Thread-110824::DEBUG::2012-04-26 10:14:23,607::task::980::TaskManager.Task::(_decref) Task=`860493eb-861b-4ae1-905c-ec0af1d554b6`::ref 0 aborting False
Thread-110824::DEBUG::2012-04-26 10:14:23,608::task::588::TaskManager.Task::(_updateState) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::moving from state init -> state preparing
Thread-110824::INFO::2012-04-26 10:14:23,608::logUtils::37::dispatcher::(wrapper) Run and protect: prepareVolume(sdUUID='5e2ac537-6a73-4faf-8379-68f3ff26a75d', spUUID='524a7003-edec-4f52-a38e-b15cadfbe3ef', imgUUID='7338426e-3e31-4ffd-ac37-14c22feea80a', volUUID='27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', rw=True, options=None)
Thread-110824::DEBUG::2012-04-26 10:14:23,608::resourceManager::175::ResourceManager.Request::(__init__) ResName=`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=`6515c65f-33e5-4c6f-a677-8e142c0e3fcf`::Request was made in '/usr/share/vdsm/storage/hsm.py' line '2423' at 'prepareVolume'
Thread-110824::DEBUG::2012-04-26 10:14:23,609::resourceManager::483::ResourceManager::(registerResource) Trying to register resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' for lock type 'shared'
Thread-110824::DEBUG::2012-04-26 10:14:23,609::resourceManager::525::ResourceManager::(registerResource) Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free. Now locking as 'shared' (1 active user)
Thread-110824::DEBUG::2012-04-26 10:14:23,609::resourceManager::212::ResourceManager.Request::(grant) ResName=`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=`6515c65f-33e5-4c6f-a677-8e142c0e3fcf`::Granted request
Thread-110824::DEBUG::2012-04-26 10:14:23,609::task::819::TaskManager.Task::(resourceAcquired) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::_resourcesAcquired: Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d (shared)
Thread-110824::DEBUG::2012-04-26 10:14:23,610::task::980::TaskManager.Task::(_decref) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::ref 1 aborting False
Thread-110824::DEBUG::2012-04-26 10:14:23,610::resourceManager::175::ResourceManager.Request::(__init__) ResName=`5e2ac537-6a73-4faf-8379-68f3ff26a75d_imageNS.7338426e-3e31-4ffd-ac37-14c22feea80a`ReqID=`82e15858-f4c7-481a-9687-740590752b20`::Request was made in '/usr/share/vdsm/storage/hsm.py' line '2428' at 'prepareVolume'
Thread-110824::DEBUG::2012-04-26 10:14:23,610::resourceManager::483::ResourceManager::(registerResource) Trying to register resource '5e2ac537-6a73-4faf-8379-68f3ff26a75d_imageNS.7338426e-3e31-4ffd-ac37-14c22feea80a' for lock type 'exclusive'
Thread-110824::DEBUG::2012-04-26 10:14:23,614::fileVolume::537::Storage.Volume::(validateVolumePath) validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::DEBUG::2012-04-26 10:14:23,619::fileVolume::537::Storage.Volume::(validateVolumePath) validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::DEBUG::2012-04-26 10:14:23,624::fileVolume::537::Storage.Volume::(validateVolumePath) validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::INFO::2012-04-26 10:14:23,629::image::354::Storage.Image::(getChain) sdUUID=5e2ac537-6a73-4faf-8379-68f3ff26a75d imgUUID=7338426e-3e31-4ffd-ac37-14c22feea80a chain=[<storage.fileVolume.FileVolume instance at 0x7f816805ab90>]
Thread-110824::DEBUG::2012-04-26 10:14:23,632::resourceManager::175::ResourceManager.Request::(__init__) ResName=`5e2ac537-6a73-4faf-8379-68f3ff26a75d_volumeNS.27e84476-26ae-4ac2-8ce7-4fdb3e3225b0`ReqID=`07cfdaca-bfdc-46c4-8233-9ef87c928cd1`::Request was made in '/usr/share/vdsm/storage/resourceFactories.py' line '165' at '__getResourceCandidatesList'
Thread-110824::DEBUG::2012-04-26 10:14:23,633::resourceManager::483::ResourceManager::(registerResource) Trying to register resource '5e2ac537-6a73-4faf-8379-68f3ff26a75d_volumeNS.27e84476-26ae-4ac2-8ce7-4fdb3e3225b0' for lock type 'exclusive'
Thread-110824::DEBUG::2012-04-26 10:14:23,634::resourceManager::525::ResourceManager::(registerResource) Resource '5e2ac537-6a73-4faf-8379-68f3ff26a75d_volumeNS.27e84476-26ae-4ac2-8ce7-4fdb3e3225b0' is free. Now locking as 'exclusive' (1 active user)
Thread-110824::DEBUG::2012-04-26 10:14:23,634::resourceManager::212::ResourceManager.Request::(grant) ResName=`5e2ac537-6a73-4faf-8379-68f3ff26a75d_volumeNS.27e84476-26ae-4ac2-8ce7-4fdb3e3225b0`ReqID=`07cfdaca-bfdc-46c4-8233-9ef87c928cd1`::Granted request
Thread-110824::DEBUG::2012-04-26 10:14:23,636::resourceManager::525::ResourceManager::(registerResource) Resource '5e2ac537-6a73-4faf-8379-68f3ff26a75d_imageNS.7338426e-3e31-4ffd-ac37-14c22feea80a' is free. Now locking as 'exclusive' (1 active user)
Thread-110824::DEBUG::2012-04-26 10:14:23,636::resourceManager::212::ResourceManager.Request::(grant) ResName=`5e2ac537-6a73-4faf-8379-68f3ff26a75d_imageNS.7338426e-3e31-4ffd-ac37-14c22feea80a`ReqID=`82e15858-f4c7-481a-9687-740590752b20`::Granted request
Thread-110824::DEBUG::2012-04-26 10:14:23,639::fileVolume::537::Storage.Volume::(validateVolumePath) validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::INFO::2012-04-26 10:14:23,642::volume::534::Storage.Volume::(prepare) Volume: preparing volume 5e2ac537-6a73-4faf-8379-68f3ff26a75d/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::INFO::2012-04-26 10:14:23,654::logUtils::39::dispatcher::(wrapper) Run and protect: prepareVolume, Return response: None
Thread-110824::DEBUG::2012-04-26 10:14:23,655::task::1174::TaskManager.Task::(prepare) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::finished: None
Thread-110824::DEBUG::2012-04-26 10:14:23,656::task::588::TaskManager.Task::(_updateState) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::moving from state preparing -> state finished
Thread-110824::DEBUG::2012-04-26 10:14:23,656::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d': < ResourceRef 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', isValid: 'True' obj: 'None'>}
Thread-110824::DEBUG::2012-04-26 10:14:23,657::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-110824::DEBUG::2012-04-26 10:14:23,658::resourceManager::535::ResourceManager::(releaseResource) Trying to release resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d'
Thread-110824::DEBUG::2012-04-26 10:14:23,659::resourceManager::550::ResourceManager::(releaseResource) Released resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' (0 active users)
Thread-110824::DEBUG::2012-04-26 10:14:23,659::resourceManager::555::ResourceManager::(releaseResource) Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free, finding out if anyone is waiting for it.
Thread-110824::DEBUG::2012-04-26 10:14:23,660::resourceManager::562::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', Clearing records.
Thread-110824::DEBUG::2012-04-26 10:14:23,661::task::980::TaskManager.Task::(_decref) Task=`3ba64588-b7f8-4eb8-a443-26d2b92234c0`::ref 0 aborting False
Thread-110824::DEBUG::2012-04-26 10:14:23,662::task::588::TaskManager.Task::(_updateState) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::moving from state init -> state preparing
Thread-110824::INFO::2012-04-26 10:14:23,663::logUtils::37::dispatcher::(wrapper) Run and protect: getVolumePath(sdUUID='5e2ac537-6a73-4faf-8379-68f3ff26a75d', spUUID='524a7003-edec-4f52-a38e-b15cadfbe3ef', imgUUID='7338426e-3e31-4ffd-ac37-14c22feea80a', volUUID='27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', options=None)
Thread-110824::DEBUG::2012-04-26 10:14:23,664::resourceManager::175::ResourceManager.Request::(__init__) ResName=`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=`ea414466-6d9a-4c67-9277-7de487f5a085`::Request was made in '/usr/share/vdsm/storage/hsm.py' line '2400' at 'getVolumePath'
Thread-110824::DEBUG::2012-04-26 10:14:23,664::resourceManager::483::ResourceManager::(registerResource) Trying to register resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' for lock type 'shared'
Thread-110824::DEBUG::2012-04-26 10:14:23,664::resourceManager::525::ResourceManager::(registerResource) Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free. Now locking as 'shared' (1 active user)
Thread-110824::DEBUG::2012-04-26 10:14:23,664::resourceManager::212::ResourceManager.Request::(grant) ResName=`Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d`ReqID=`ea414466-6d9a-4c67-9277-7de487f5a085`::Granted request
Thread-110824::DEBUG::2012-04-26 10:14:23,665::task::819::TaskManager.Task::(resourceAcquired) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::_resourcesAcquired: Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d (shared)
Thread-110824::DEBUG::2012-04-26 10:14:23,665::task::980::TaskManager.Task::(_decref) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::ref 1 aborting False
Thread-110824::DEBUG::2012-04-26 10:14:23,667::fileVolume::537::Storage.Volume::(validateVolumePath) validate path for 27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::INFO::2012-04-26 10:14:23,669::logUtils::39::dispatcher::(wrapper) Run and protect: getVolumePath, Return response: {'path': '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0'}
Thread-110824::DEBUG::2012-04-26 10:14:23,670::task::1174::TaskManager.Task::(prepare) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::finished: {'path': '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0'}
Thread-110824::DEBUG::2012-04-26 10:14:23,671::task::588::TaskManager.Task::(_updateState) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::moving from state preparing -> state finished
Thread-110824::DEBUG::2012-04-26 10:14:23,671::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d': < ResourceRef 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', isValid: 'True' obj: 'None'>}
Thread-110824::DEBUG::2012-04-26 10:14:23,672::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-110824::DEBUG::2012-04-26 10:14:23,673::resourceManager::535::ResourceManager::(releaseResource) Trying to release resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d'
Thread-110824::DEBUG::2012-04-26 10:14:23,673::resourceManager::550::ResourceManager::(releaseResource) Released resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' (0 active users)
Thread-110824::DEBUG::2012-04-26 10:14:23,674::resourceManager::555::ResourceManager::(releaseResource) Resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d' is free, finding out if anyone is waiting for it.
Thread-110824::DEBUG::2012-04-26 10:14:23,675::resourceManager::562::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.5e2ac537-6a73-4faf-8379-68f3ff26a75d', Clearing records.
Thread-110824::DEBUG::2012-04-26 10:14:23,675::task::980::TaskManager.Task::(_decref) Task=`5907de99-d08d-4802-9d09-f7ab2583efaa`::ref 0 aborting False
Thread-110824::INFO::2012-04-26 10:14:23,676::clientIF::606::vds::(_prepareVolumePath) prepared volume path: /rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0
Thread-110824::INFO::2012-04-26 10:14:23,677::clientIF::606::vds::(_prepareVolumePath) prepared volume path: 
Thread-110824::DEBUG::2012-04-26 10:14:23,678::utils::595::Storage.Misc.excCmd::(execCmd) '/usr/bin/sudo -n /sbin/service ksmtuned retune' (cwd None)
Thread-110823::DEBUG::2012-04-26 10:14:23,679::clientIF::841::vds::(migrationCreate) Destination VM creation succeeded
Thread-110823::DEBUG::2012-04-26 10:14:23,680::clientIF::81::vds::(wrapper) return migrationCreate with {'status': {'message': 'Done', 'code': 0}, 'migrationPort': 0, 'params': {'bridge': 'ovirtmgmt', 'acpiEnable': 'true', 'emulatedMachine':
'pc-0.14', 'afterMigrationStatus': 'Up', 'pid': '0', 'transparentHugePages': 'true', 'displaySecurePort': '-1', 'spiceSslCipherSuite': 'DEFAULT', 'cpuType': 'Opteron_G3', 'custom': {}, 'pauseCode': 'NOERR', 'migrationDest': 'libvirt', 'macAddr': '00:1a:4a:1e:00:00', 'boot': 'c', 'smp': '1', 'vmType': 'kvm', 'memSize': 1024, 'smpCoresPerSocket': '1', 'vmName': 'test1', 'spiceMonitors': '1', 'nice': '0', 'status': 'Migration Destination', 'vmId': 'f73f17a8-a418-4318-af0e-2cea18ab597a', 'displayIp': '0', 'drives': [{'index': '0', 'domainID': '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'reqsize': '0', 'format': 'raw', 'boot': 'true', 'volumeID': '27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', 'needExtend': False, 'imageID': '7338426e-3e31-4ffd-ac37-14c22feea80a', 'blockDev': False, 'iface': 'virtio', 'propagateErrors': 'off', 'poolID': '524a7003-edec-4f52-a38e-b15cadfbe3ef', 'apparentsize': '21474836480', 'path': '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', 'truesize': '21495832576', 'type': 'disk', 'if': 'virtio'}, {'index': 2, 'iface': 'ide', 'format': 'raw', 'blockDev': False, 'truesize': 0, 'device': 'cdrom', 'path': '', 'propagateErrors': 'off', 'type': 'disk'}], 'displayPort': '-1', 'guestIPs': '', 'spiceSecureChannels': 'smain,sinputs', 'nicModel': 'pv', 'keyboardLayout': 'en-us', 'kvmEnable': 'true', 'pitReinjection': 'false', 'devices': [{'device': 'bridge', 'nicModel': 'virtio', 'macAddr': '00:1a:4a:1e:00:00', 'type': 'interface', 'network': 'ovirtmgmt'}, {'index': '0', 'domainID': '5e2ac537-6a73-4faf-8379-68f3ff26a75d', 'reqsize': '0', 'format': 'raw', 'boot': 'true', 'volumeID': '27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', 'needExtend': False, 'imageID': '7338426e-3e31-4ffd-ac37-14c22feea80a', 'blockDev': False, 'iface': 'virtio', 'propagateErrors': 'off', 'poolID': '524a7003-edec-4f52-a38e-b15cadfbe3ef', 'apparentsize': '21474836480', 'path': '/rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0', 'truesize': '21495832576', 'type': 'disk', 'if': 'virtio'}, {'index': 2, 'iface': 'ide', 'format': 'raw', 'blockDev': False, 'truesize': 0, 'device': 'cdrom', 'path': '', 'propagateErrors': 'off', 'type': 'disk'}, {'device': 'qxl', 'specParams': {'vram': '65536'}, 'type': 'video'}], 'username': 'Unknown', 'timeOffset': '0', 'clientIp': '', 'display': 'qxl'}}
Thread-110824::DEBUG::2012-04-26 10:14:23,742::utils::595::Storage.Misc.excCmd::(execCmd) FAILED: <err> = 'Unknown operation retune\n'; <rc> = 1
Thread-110826::DEBUG::2012-04-26 10:14:23,745::clientIF::76::vds::(wrapper) [172.30.0.232]::call destroy with ('f73f17a8-a418-4318-af0e-2cea18ab597a',) {}
Thread-110824::DEBUG::2012-04-26 10:14:23,746::vm::528::vm.Vm::(_startUnderlyingVm) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::_ongoingCreations released
Thread-110826::INFO::2012-04-26 10:14:23,747::clientIF::476::vds::(destroy) vmContainerLock acquired by vm f73f17a8-a418-4318-af0e-2cea18ab597a
Thread-110824::DEBUG::2012-04-26 10:14:23,747::libvirtvm::1199::vm.Vm::(_waitForIncomingMigrationFinish) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Waiting 300 seconds for end of migration
Thread-110826::DEBUG::2012-04-26 10:14:23,748::libvirtvm::1400::vm.Vm::(destroy) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::destroy Called
Thread-110826::INFO::2012-04-26 10:14:23,749::libvirtvm::1360::vm.Vm::(releaseVm) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Release VM resources
Thread-110826::DEBUG::2012-04-26 10:14:23,751::utils::595::Storage.Misc.excCmd::(execCmd) '/usr/bin/sudo -n /sbin/service ksmtuned retune' (cwd None)
Thread-110826::DEBUG::2012-04-26 10:14:23,817::utils::595::Storage.Misc.excCmd::(execCmd) FAILED: <err> = 'Unknown operation retune\n'; <rc> = 1
Thread-110826::DEBUG::2012-04-26 10:14:23,818::vm::858::vm.Vm::(_cleanup) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Drive cleanup
Thread-110826::INFO::2012-04-26 10:14:23,819::clientIF::617::vds::(_teardownVolumePath) Avoiding tear down drive {'index': 2, 'iface': 'ide', 'format': 'raw', 'blockDev': False, 'truesize': 0, 'device': 'cdrom', 'path': '', 'propagateErrors': 'off', 'type': 'disk'}
Thread-110826::DEBUG::2012-04-26 10:14:23,820::vm::858::vm.Vm::(_cleanup) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Drive /rhev/data-center/524a7003-edec-4f52-a38e-b15cadfbe3ef/5e2ac537-6a73-4faf-8379-68f3ff26a75d/images/7338426e-3e31-4ffd-ac37-14c22feea80a/27e84476-26ae-4ac2-8ce7-4fdb3e3225b0 cleanup
Thread-110826::DEBUG::2012-04-26 10:14:23,821::task::588::TaskManager.Task::(_updateState) Task=`d1559e5f-f46c-47c6-8475-d9b17fb05c3e`::moving from state init -> state preparing

[attachment: node2-vdsm-log.txt]

Thread-3203800::DEBUG::2012-04-26 10:14:14,216::task::588::TaskManager.Task::(_updateState) Task=`ce3ebc36-94c2-44ac-ad56-c95fe2d82513`::moving from state preparing -> state finished
Thread-3203800::DEBUG::2012-04-26 10:14:14,217::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3203800::DEBUG::2012-04-26 10:14:14,217::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3203800::DEBUG::2012-04-26 10:14:14,218::task::980::TaskManager.Task::(_decref) Task=`ce3ebc36-94c2-44ac-ad56-c95fe2d82513`::ref 0 aborting False
Thread-3203801::DEBUG::2012-04-26 10:14:14,242::libvirtvm::232::vm.Vm::(_getDiskStats)
vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc stats not available
Thread-3203801::DEBUG::2012-04-26 10:14:14,243::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hda latency not available
Thread-3203801::DEBUG::2012-04-26 10:14:14,244::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc latency not available
Thread-3203801::DEBUG::2012-04-26 10:14:14,245::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not available
Thread-3203801::DEBUG::2012-04-26 10:14:14,246::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not available
Thread-3203801::DEBUG::2012-04-26 10:14:14,246::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not available
Thread-3203806::DEBUG::2012-04-26 10:14:23,709::clientIF::76::vds::(wrapper) [172.30.0.229]::call migrate with ({'src': '172.30.0.232', 'dst': '172.30.0.231:54321', 'vmId': 'f73f17a8-a418-4318-af0e-2cea18ab597a', 'method': 'online'},) {}
Thread-3203806::DEBUG::2012-04-26 10:14:23,710::clientIF::379::vds::(migrate) {'src': '172.30.0.232', 'dst': '172.30.0.231:54321', 'vmId': 'f73f17a8-a418-4318-af0e-2cea18ab597a', 'method': 'online'}
Thread-3203807::DEBUG::2012-04-26 10:14:23,712::vm::116::vm.Vm::(_setupVdsConnection) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Destination server is: 172.30.0.231:54321
Thread-3203806::DEBUG::2012-04-26 10:14:23,712::clientIF::81::vds::(wrapper) return migrate with {'status': {'message': 'Migration process starting', 'code': 0}}
Thread-3203807::DEBUG::2012-04-26 10:14:23,713::vm::118::vm.Vm::(_setupVdsConnection) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Initiating connection with destination
Thread-3203807::DEBUG::2012-04-26 10:14:23,737::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not available
Thread-3203807::DEBUG::2012-04-26 10:14:23,737::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not available
Thread-3203807::DEBUG::2012-04-26 10:14:23,738::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not available
Thread-3203807::DEBUG::2012-04-26 10:14:23,742::vm::164::vm.Vm::(_prepareGuest) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::migration Process begins
Thread-3203807::DEBUG::2012-04-26 10:14:23,749::vm::211::vm.Vm::(run) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::migration semaphore acquired
Thread-3203807::DEBUG::2012-04-26 10:14:23,977::libvirtvm::419::vm.Vm::(_startUnderlyingMigration) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::starting migration to qemu+tls://172.30.0.231/system
Thread-3203808::DEBUG::2012-04-26 10:14:23,978::libvirtvm::317::vm.Vm::(run) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::migration downtime thread started
Thread-3203809::DEBUG::2012-04-26 10:14:23,979::libvirtvm::345::vm.Vm::(run) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::starting migration monitor thread
Thread-3203807::DEBUG::2012-04-26 10:14:23,998::libvirtvm::332::vm.Vm::(cancel) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::canceling migration downtime thread
Thread-3203807::DEBUG::2012-04-26 10:14:23,999::libvirtvm::382::vm.Vm::(stop) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::stopping migration monitor thread
Thread-3203808::DEBUG::2012-04-26 10:14:23,999::libvirtvm::329::vm.Vm::(run) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::migration downtime thread exiting
Thread-3203807::ERROR::2012-04-26 10:14:24,001::vm::170::vm.Vm::(_recover) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::operation failed: Failed to connect to remote libvirt URI qemu+tls://172.30.0.231/system
Thread-3203807::ERROR::2012-04-26 10:14:24,119::vm::234::vm.Vm::(run) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Traceback (most recent call last):
  File "/usr/share/vdsm/vm.py", line 217, in run
  File "/usr/share/vdsm/libvirtvm.py", line 443, in _startUnderlyingMigration
  File "/usr/share/vdsm/libvirtvm.py", line 483, in f
  File "/usr/share/vdsm/libvirtconnection.py", line 79, in wrapper
  File "/usr/lib64/python2.7/site-packages/libvirt.py", line 971, in migrateToURI2
libvirtError: operation failed: Failed to connect to remote libvirt URI qemu+tls://172.30.0.231/system
Thread-3203810::DEBUG::2012-04-26 10:14:24,415::task::588::TaskManager.Task::(_updateState) Task=`ce192aac-27ff-48bc-bef4-451d3e4aad48`::moving from state init -> state preparing
Thread-3203810::INFO::2012-04-26 10:14:24,416::logUtils::37::dispatcher::(wrapper) Run and protect: repoStats(options=None)
Thread-3203810::INFO::2012-04-26 10:14:24,417::logUtils::39::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}}
Thread-3203810::DEBUG::2012-04-26 10:14:24,417::task::1174::TaskManager.Task::(prepare) Task=`ce192aac-27ff-48bc-bef4-451d3e4aad48`::finished: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code':
0, 'valid': True}}
Thread-3203810::DEBUG::2012-04-26 10:14:24,418::task::588::TaskManager.Task::(_updateState) Task=`ce192aac-27ff-48bc-bef4-451d3e4aad48`::moving from state preparing -> state finished
Thread-3203810::DEBUG::2012-04-26 10:14:24,419::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3203810::DEBUG::2012-04-26 10:14:24,420::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3203810::DEBUG::2012-04-26 10:14:24,420::task::980::TaskManager.Task::(_decref) Task=`ce192aac-27ff-48bc-bef4-451d3e4aad48`::ref 0 aborting False
Thread-3203811::DEBUG::2012-04-26 10:14:24,440::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc stats not available
Thread-3203811::DEBUG::2012-04-26 10:14:24,441::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hda latency not available
Thread-3203811::DEBUG::2012-04-26 10:14:24,441::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc latency not available
Thread-3203811::DEBUG::2012-04-26 10:14:24,443::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not available
Thread-3203811::DEBUG::2012-04-26 10:14:24,443::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not available
Thread-3203811::DEBUG::2012-04-26 10:14:24,444::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not available
Thread-3203812::DEBUG::2012-04-26 10:14:24,528::clientIF::76::vds::(wrapper) [172.30.0.229]::call migrateStatus with ('f73f17a8-a418-4318-af0e-2cea18ab597a',) {}
Thread-3203812::DEBUG::2012-04-26 10:14:24,529::clientIF::81::vds::(wrapper) return migrateStatus with {'status': {'message': 'Fatal error during migration', 'code': 12}}
Thread-3203817::DEBUG::2012-04-26 10:14:34,617::task::588::TaskManager.Task::(_updateState) Task=`ee082137-e3f0-4755-a549-65407f551eea`::moving from state init -> state preparing
Thread-3203817::INFO::2012-04-26 10:14:34,618::logUtils::37::dispatcher::(wrapper) Run and protect: repoStats(options=None)
Thread-3203817::INFO::2012-04-26 10:14:34,619::logUtils::39::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}}
Thread-3203817::DEBUG::2012-04-26 10:14:34,619::task::1174::TaskManager.Task::(prepare) Task=`ee082137-e3f0-4755-a549-65407f551eea`::finished: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}}
Thread-3203817::DEBUG::2012-04-26 10:14:34,620::task::588::TaskManager.Task::(_updateState) Task=`ee082137-e3f0-4755-a549-65407f551eea`::moving from state preparing -> state finished
Thread-3203817::DEBUG::2012-04-26 10:14:34,621::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3203817::DEBUG::2012-04-26 10:14:34,622::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3203817::DEBUG::2012-04-26 10:14:34,622::task::980::TaskManager.Task::(_decref)
Task=`ee082137-e3f0-4755-a549-65407f551eea`::ref 0 aborting False
Thread-3203818::DEBUG::2012-04-26 10:14:34,644::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc stats not available
Thread-3203818::DEBUG::2012-04-26 10:14:34,645::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hda latency not available
Thread-3203818::DEBUG::2012-04-26 10:14:34,646::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc latency not available
Thread-3203818::DEBUG::2012-04-26 10:14:34,647::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not available
Thread-3203818::DEBUG::2012-04-26 10:14:34,648::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not available
Thread-3203818::DEBUG::2012-04-26 10:14:34,648::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not available
Thread-3203823::DEBUG::2012-04-26 10:14:44,833::task::588::TaskManager.Task::(_updateState) Task=`22a3d000-1122-497c-91d1-ca032ae2b799`::moving from state init -> state preparing
Thread-3203823::INFO::2012-04-26 10:14:44,834::logUtils::37::dispatcher::(wrapper) Run and protect: repoStats(options=None)
Thread-3203823::INFO::2012-04-26 10:14:44,835::logUtils::39::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}}
Thread-3203823::DEBUG::2012-04-26 10:14:44,836::task::1174::TaskManager.Task::(prepare) Task=`22a3d000-1122-497c-91d1-ca032ae2b799`::finished: {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}}
Thread-3203823::DEBUG::2012-04-26 10:14:44,837::task::588::TaskManager.Task::(_updateState) Task=`22a3d000-1122-497c-91d1-ca032ae2b799`::moving from state preparing -> state finished
Thread-3203823::DEBUG::2012-04-26 10:14:44,838::resourceManager::806::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3203823::DEBUG::2012-04-26 10:14:44,838::resourceManager::841::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3203823::DEBUG::2012-04-26 10:14:44,839::task::980::TaskManager.Task::(_decref) Task=`22a3d000-1122-497c-91d1-ca032ae2b799`::ref 0 aborting False
Thread-3203824::DEBUG::2012-04-26 10:14:44,862::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc stats not available
Thread-3203824::DEBUG::2012-04-26 10:14:44,863::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hda latency not available
Thread-3203824::DEBUG::2012-04-26 10:14:44,864::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc latency not available
Thread-3203824::DEBUG::2012-04-26 10:14:44,865::libvirtvm::232::vm.Vm::(_getDiskStats) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not available
Thread-3203824::DEBUG::2012-04-26 10:14:44,866::libvirtvm::261::vm.Vm::(_getDiskLatency) vmId=`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not
available Thread-3203824::DEBUG::2012-04-26 = 10:14:44,866::libvirtvm::261::vm.Vm::(_getDiskLatency) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not = available Thread-3203829::DEBUG::2012-04-26 = 10:14:55,029::task::588::TaskManager.Task::(_updateState) = Task=3D`59edf306-8af3-410e-b13f-493510719325`::moving from state init -> = state preparing Thread-3203829::INFO::2012-04-26 = 10:14:55,030::logUtils::37::dispatcher::(wrapper) Run and protect: = repoStats(options=3DNone) Thread-3203829::INFO::2012-04-26 = 10:14:55,031::logUtils::39::dispatcher::(wrapper) Run and protect: = repoStats, Return response: {'fce72110-71ae-46be-8811-2fbeaa001ed2': = {'delay': '0.00173997879028', 'lastCheck': 1335477130.119629, 'code': 0, = 'valid': True}, '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': = '0.00190806388855', 'lastCheck': 1335477130.122188, 'code': 0, 'valid': = True}, '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': = '0.00839805603027', 'lastCheck': 1335477130.117298, 'code': 0, 'valid': = True}} Thread-3203829::DEBUG::2012-04-26 = 10:14:55,032::task::1174::TaskManager.Task::(prepare) = Task=3D`59edf306-8af3-410e-b13f-493510719325`::finished: = {'fce72110-71ae-46be-8811-2fbeaa001ed2': {'delay': '0.00173997879028', = 'lastCheck': 1335477130.119629, 'code': 0, 'valid': True}, = '5e2ac537-6a73-4faf-8379-68f3ff26a75d': {'delay': '0.00190806388855', = 'lastCheck': 1335477130.122188, 'code': 0, 'valid': True}, = '6d931187-2c83-4c33-8693-ca2570f7c136': {'delay': '0.00839805603027', = 'lastCheck': 1335477130.117298, 'code': 0, 'valid': True}} Thread-3203829::DEBUG::2012-04-26 = 10:14:55,032::task::588::TaskManager.Task::(_updateState) = Task=3D`59edf306-8af3-410e-b13f-493510719325`::moving from state = preparing -> state finished Thread-3203829::DEBUG::2012-04-26 = 10:14:55,033::resourceManager::806::ResourceManager.Owner::(releaseAll) = Owner.releaseAll requests {} resources {} Thread-3203829::DEBUG::2012-04-26 = 
10:14:55,034::resourceManager::841::ResourceManager.Owner::(cancelAll) = Owner.cancelAll requests {} Thread-3203829::DEBUG::2012-04-26 = 10:14:55,034::task::980::TaskManager.Task::(_decref) = Task=3D`59edf306-8af3-410e-b13f-493510719325`::ref 0 aborting False Thread-3203830::DEBUG::2012-04-26 = 10:14:55,054::libvirtvm::232::vm.Vm::(_getDiskStats) = vmId=3D`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc stats not = available Thread-3203830::DEBUG::2012-04-26 = 10:14:55,055::libvirtvm::261::vm.Vm::(_getDiskLatency) = vmId=3D`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hda latency not = available Thread-3203830::DEBUG::2012-04-26 = 10:14:55,055::libvirtvm::261::vm.Vm::(_getDiskLatency) = vmId=3D`a1f01689-240a-40b8-8719-1a9c815f5642`::Disk hdc latency not = available Thread-3203830::DEBUG::2012-04-26 = 10:14:55,056::libvirtvm::232::vm.Vm::(_getDiskStats) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc stats not = available Thread-3203830::DEBUG::2012-04-26 = 10:14:55,057::libvirtvm::261::vm.Vm::(_getDiskLatency) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk vda latency not = available Thread-3203830::DEBUG::2012-04-26 = 10:14:55,058::libvirtvm::261::vm.Vm::(_getDiskLatency) = vmId=3D`f73f17a8-a418-4318-af0e-2cea18ab597a`::Disk hdc latency not = available= --Apple-Mail=_440FAF08-BE2F-4F2B-933A-11F683AFC866 Content-Disposition: attachment; filename=node2-libvirtd-log.txt Content-Type: text/plain; x-mac-hide-extension=yes; x-unix-mode=0644; name="node2-libvirtd-log.txt" Content-Transfer-Encoding: quoted-printable [root@ovirt-node-2 ~]# grep "2012-04-26 10:14" /var/log/libvirtd.log=20 2012-04-26 10:14:23.740+0000: 16973: debug : virDomainGetXMLDesc:3811 : = dom=3D0x7fafd40e0410, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), flags=3D0 2012-04-26 10:14:23.740+0000: 16973: debug : virDomainFree:2169 : = dom=3D0x7fafd40e0410, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:23.747+0000: 16970: debug : 
virDomainGetXMLDesc:3811 : = dom=3D0x7fafc8015770, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), flags=3D0 2012-04-26 10:14:23.747+0000: 16970: debug : virDomainFree:2169 : = dom=3D0x7fafc8015770, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:23.756+0000: 16971: debug : virDomainGetXMLDesc:3811 : = dom=3D0x7fafdc0d7620, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), flags=3D0 2012-04-26 10:14:23.757+0000: 16971: debug : virDomainFree:2169 : = dom=3D0x7fafdc0d7620, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:23.760+0000: 16969: debug : virDomainGetXMLDesc:3811 : = dom=3D0x7fafe00ab970, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), flags=3D0 2012-04-26 10:14:23.760+0000: 16969: debug : virDomainFree:2169 : = dom=3D0x7fafe00ab970, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:23.980+0000: 16972: debug : virDrvSupportsFeature:1398 = : conn=3D0x7fafdc063080, feature=3D4 2012-04-26 10:14:23.983+0000: 16973: debug : virDrvSupportsFeature:1398 = : conn=3D0x7fafdc063080, feature=3D6 2012-04-26 10:14:23.986+0000: 16970: debug : = virDomainMigratePerform3:5672 : dom=3D0x7fafc8015770, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), xmlin=3D(null) = cookiein=3D(nil), cookieinlen=3D0, cookieout=3D0x7fafebcecb90, = cookieoutlen=3D0x7fafebcecb9c, dconnuri=3Dqemu+tls://172.30.0.231/system, = uri=3Dtcp://172.30.0.231, flags=3D3, dname=3D(null), bandwidth=3D0 2012-04-26 10:14:23.986+0000: 16970: debug : qemuMigrationPerform:2389 : = driver=3D0x7fafe001e990, conn=3D0x7fafdc063080, vm=3D0x7fafdc0ef860, = xmlin=3D(null), dconnuri=3Dqemu+tls://172.30.0.231/system, = uri=3Dtcp://172.30.0.231, cookiein=3D(null), cookieinlen=3D0, = cookieout=3D0x7fafebcecb90, cookieoutlen=3D0x7fafebcecb9c, flags=3D3, = dname=3D(null), resource=3D0, v3proto=3D1 2012-04-26 10:14:23.986+0000: 16970: debug : = 
qemuProcessAutoDestroyActive:3760 : vm=3Dtest1 = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a 2012-04-26 10:14:23.986+0000: 16970: debug : doPeer2PeerMigrate:2118 : = driver=3D0x7fafe001e990, sconn=3D0x7fafdc063080, vm=3D0x7fafdc0ef860, = xmlin=3D(null), dconnuri=3Dqemu+tls://172.30.0.231/system, = uri=3Dtcp://172.30.0.231, flags=3D3, dname=3D(null), resource=3D0 2012-04-26 10:14:23.986+0000: 16970: debug : virConnectOpen:1226 : = name=3Dqemu+tls://172.30.0.231/system 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1030 : name = "qemu+tls://172.30.0.231/system" to URI components: 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1072 : trying = driver 0 (Test) ... 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1078 : driver 0 = Test returned DECLINED 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1072 : trying = driver 1 (Xen) ... 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1078 : driver 1 Xen = returned DECLINED 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1072 : trying = driver 2 (OPENVZ) ... 2012-04-26 10:14:23.986+0000: 16970: debug : do_open:1078 : driver 2 = OPENVZ returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 3 (VMWARE) ... 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1078 : driver 3 = VMWARE returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 4 (PHYP) ... 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1078 : driver 4 = PHYP returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 5 (VBOX) ... 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1078 : driver 5 = VBOX returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 6 (ESX) ... 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1078 : driver 6 ESX = returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 7 (Hyper-V) ... 
2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1078 : driver 7 = Hyper-V returned DECLINED 2012-04-26 10:14:23.987+0000: 16970: debug : do_open:1072 : trying = driver 8 (remote) ... 2012-04-26 10:14:23.996+0000: 16970: error : = virNetSocketNewConnectTCP:432 : unable to connect to server at = '172.30.0.231:16514': Connection refused 2012-04-26 10:14:23.996+0000: 16970: debug : do_open:1078 : driver 8 = remote returned ERROR 2012-04-26 10:14:23.996+0000: 16970: error : doPeer2PeerMigrate:2129 : = operation failed: Failed to connect to remote libvirt URI = qemu+tls://172.30.0.231/system 2012-04-26 10:14:23.996+0000: 16970: debug : virDomainFree:2169 : = dom=3D0x7fafc8015770, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:24.002+0000: 16971: debug : virDomainGetJobInfo:15164 : = dom=3D0x7fafdc0d7620, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), info=3D0x7fafeb4ebb40 2012-04-26 10:14:24.002+0000: 16971: debug : virDomainFree:2169 : = dom=3D0x7fafdc0d7620, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a) 2012-04-26 10:14:24.117+0000: 16969: debug : virDomainGetXMLDesc:3811 : = dom=3D0x7fafe00ab970, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a), flags=3D0 2012-04-26 10:14:24.118+0000: 16969: debug : virDomainFree:2169 : = dom=3D0x7fafe00ab970, (VM: name=3Dtest1, = uuid=3Df73f17a8-a418-4318-af0e-2cea18ab597a)= --Apple-Mail=_440FAF08-BE2F-4F2B-933A-11F683AFC866 Content-Transfer-Encoding: quoted-printable Content-Type: text/plain; charset=utf-8 On 26 Apr, 2012, at 3:07 PM, Itamar Heim wrote:

After looking into the log, the relevant errors are:

2012-04-26 10:14:23.996+0000: 16970: error : virNetSocketNewConnectTCP:432 : unable to connect to server at '172.30.0.231:16514': Connection refused
2012-04-26 10:14:23.996+0000: 16970: debug : do_open:1078 : driver 8 remote returned ERROR
2012-04-26 10:14:23.996+0000: 16970: error : doPeer2PeerMigrate:2129 : operation failed: Failed to connect to remote libvirt URI qemu+tls://172.30.0.231/system

In other words, the TLS listener was not working on the destination host. I resolved this by setting "listen_tls = 1" in the libvirt configuration on both node 1 and node 2; after that the migration went through. Thanks all the same.

On 26 Apr, 2012, at 6:28 PM, T-Sinjon wrote:
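For anyone hitting the same "Connection refused ... 172.30.0.231:16514" error: the destination's libvirtd is simply not listening on the TLS port (16514 by default). A minimal sketch of the fix, to be run on both nodes; the file paths and service names are the usual Fedora/oVirt-node defaults and may differ on your install:

```shell
#!/bin/sh
# Enable the TLS listener in libvirtd's config.
# Uncomments/sets listen_tls whether it is currently "#listen_tls = 0" or similar.
sed -i 's/^#\{0,1\}listen_tls *=.*/listen_tls = 1/' /etc/libvirt/libvirtd.conf

# listen_tls only takes effect if libvirtd is started with --listen;
# on sysvinit-era hosts that flag usually lives in /etc/sysconfig/libvirtd:
#   LIBVIRTD_ARGS="--listen"

# Restart the daemon and confirm it is now listening on 16514.
service libvirtd restart
netstat -tln | grep 16514
```

Note that listen_tls also requires valid CA/server/client certificates (under /etc/pki by default); on an oVirt node these are normally provisioned by vdsm when the host is added to the engine, so the config flag is usually the only missing piece.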
participants (3)
- Gmail
- Itamar Heim
- T-Sinjon