[ovirt-users] vdsm storage problem - maybe cache problem?

ml at ohnewald.net ml at ohnewald.net
Mon May 18 12:40:00 UTC 2015


Manual mounting works:

[root@ovirt-node01 ~]# mount ovirt.hq.example.net:/export2 /tmp/mnt/
[root@ovirt-node01 ~]# umount /tmp/mnt/

(But I removed the export2 domain from the engine web GUI. I still wonder
why the node wants to access it...)
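A likely reason, judging from the PersistentDict flush further down in the log: the master domain's on-disk pool metadata still lists the removed domain (036b5575-...) as Active in POOL_DOMAINS, so vdsm keeps trying to produce it and monitoring keeps failing. A minimal sketch of how to check this, using the metadata line and UUIDs exactly as they appear in the log below (the /tmp path here is just a stand-in for demonstration; on a real node you would grep the actual dom_md/metadata file under /rhev/data-center/mnt/...):

```shell
# Reproduce the POOL_DOMAINS line that vdsm flushed (copied from the log)
# into a scratch file, then check whether the removed domain's UUID is
# still listed. On the node itself, grep the real metadata file instead,
# e.g. .../abc51e26-7175-4b38-b3a8-95c6928fbc2b/dom_md/metadata
cat > /tmp/dom_md_metadata <<'EOF'
POOL_DOMAINS=6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1:Active,036b5575-51fa-4f14-8b05-890d7807894c:Active,abc51e26-7175-4b38-b3a8-95c6928fbc2b:Active,23602741-2967-4fad-a749-d58e1459d5c8:Attached
EOF

# Counts 1 if the stale domain is still recorded in the pool metadata.
grep -c '036b5575-51fa-4f14-8b05-890d7807894c' /tmp/dom_md_metadata
```

If the UUID is still present there, the removal never reached the pool metadata, which would explain why restarting the node or clearing caches does not help: the in-memory StorageDomainCache is rebuilt from this on-disk record.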

vdsm log:
================
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,099::sdc::137::Storage.StorageDomainCache::(_findDomain) 
looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,099::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain) 
looking for domain 036b5575-51fa-4f14-8b05-890d7807894c
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,100::lvm::372::OperationMutex::(_reloadvgs) Operation 'lvm 
reload operation' got the operation mutex
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,100::lvm::295::Storage.Misc.excCmd::(cmd) '/usr/bin/sudo -n 
/sbin/lvm vgs --config " devices { preferred_names = 
[\\"^/dev/mapper/\\"] ignore_suspended_devices=1 write_cache_state=0 
disable_after_error_count=3 obtain_device_list_from_udev=0 filter = [ 
\'r|.*|\' ] }  global {  locking_type=1  prioritise_write_locks=1 
wait_for_locks=1 }  backup {  retain_min = 50  retain_days = 0 } " 
--noheadings --units b --nosuffix --separator | -o 
uuid,name,attr,size,free,extent_size,extent_count,free_count,tags,vg_mda_size,vg_mda_free,lv_count,pv_count,pv_name 
036b5575-51fa-4f14-8b05-890d7807894c' (cwd None)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,121::lvm::295::Storage.Misc.excCmd::(cmd) FAILED: <err> = ' 
Volume group "036b5575-51fa-4f14-8b05-890d7807894c" not found\n 
Skipping volume group 036b5575-51fa-4f14-8b05-890d7807894c\n'; <rc> = 5
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::WARNING::2015-05-18 
14:35:35,122::lvm::377::Storage.LVM::(_reloadvgs) lvm vgs failed: 5 [] 
['  Volume group "036b5575-51fa-4f14-8b05-890d7807894c" not found', ' 
Skipping volume group 036b5575-51fa-4f14-8b05-890d7807894c']
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,122::lvm::414::OperationMutex::(_reloadvgs) Operation 'lvm 
reload operation' released the operation mutex
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,133::sdc::143::Storage.StorageDomainCache::(_findDomain) domain 
036b5575-51fa-4f14-8b05-890d7807894c not found
Traceback (most recent call last):
   File "/usr/share/vdsm/storage/sdc.py", line 141, in _findDomain
     dom = findMethod(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 171, in _findUnfetchedDomain
     raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,133::sp::329::Storage.StoragePool::(startSpm) Unexpected error
Traceback (most recent call last):
   File "/usr/share/vdsm/storage/sp.py", line 296, in startSpm
     self._updateDomainsRole()
   File "/usr/share/vdsm/storage/securable.py", line 75, in wrapper
     return method(self, *args, **kwargs)
   File "/usr/share/vdsm/storage/sp.py", line 205, in _updateDomainsRole
     domain = sdCache.produce(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 98, in produce
     domain.getRealDomain()
   File "/usr/share/vdsm/storage/sdc.py", line 52, in getRealDomain
     return self._cache._realProduce(self._sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 122, in _realProduce
     domain = self._findDomain(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 141, in _findDomain
     dom = findMethod(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 171, in _findUnfetchedDomain
     raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,134::sp::330::Storage.StoragePool::(startSpm) failed: Storage 
domain does not exist: ('036b5575-51fa-4f14-8b05-890d7807894c',)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,134::sp::336::Storage.StoragePool::(_shutDownUpgrade) Shutting 
down upgrade process
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,134::resourceManager::198::ResourceManager.Request::(__init__) 
ResName=`Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d`ReqID=`24d881e8-eba3-4b51-85e7-32301218b2e9`::Request 
was made in '/usr/share/vdsm/storage/sp.py' line '338' at '_shutDownUpgrade'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,135::resourceManager::542::ResourceManager::(registerResource) 
Trying to register resource 
'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d' for lock type 
'exclusive'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,135::resourceManager::601::ResourceManager::(registerResource) 
Resource 'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d' is free. 
Now locking as 'exclusive' (1 active user)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,135::resourceManager::238::ResourceManager.Request::(grant) 
ResName=`Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d`ReqID=`24d881e8-eba3-4b51-85e7-32301218b2e9`::Granted 
request
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,136::resourceManager::198::ResourceManager.Request::(__init__) 
ResName=`Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c`ReqID=`140bc21d-520f-4796-aab0-595f7aba6932`::Request 
was made in '/usr/share/vdsm/storage/sp.py' line '358' at '_shutDownUpgrade'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,136::resourceManager::542::ResourceManager::(registerResource) 
Trying to register resource 
'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c' for lock type 
'exclusive'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,136::resourceManager::601::ResourceManager::(registerResource) 
Resource 'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c' is free. 
Now locking as 'exclusive' (1 active user)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,136::resourceManager::238::ResourceManager.Request::(grant) 
ResName=`Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c`ReqID=`140bc21d-520f-4796-aab0-595f7aba6932`::Granted 
request
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,136::resourceManager::616::ResourceManager::(releaseResource) 
Trying to release resource 
'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,137::resourceManager::635::ResourceManager::(releaseResource) 
Released resource 'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c' 
(0 active users)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,137::resourceManager::641::ResourceManager::(releaseResource) 
Resource 'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c' is free, 
finding out if anyone is waiting for it.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,137::resourceManager::649::ResourceManager::(releaseResource) 
No one is waiting for resource 
'Storage.upgrade_036b5575-51fa-4f14-8b05-890d7807894c', Clearing records.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,137::resourceManager::616::ResourceManager::(releaseResource) 
Trying to release resource 
'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,137::resourceManager::635::ResourceManager::(releaseResource) 
Released resource 'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d' 
(0 active users)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,138::resourceManager::641::ResourceManager::(releaseResource) 
Resource 'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d' is free, 
finding out if anyone is waiting for it.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,138::resourceManager::649::ResourceManager::(releaseResource) 
No one is waiting for resource 
'Storage.upgrade_b384b3da-02a6-44f3-a3f6-56751ce8c26d', Clearing records.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,138::persistentDict::167::Storage.PersistentDict::(transaction) 
Starting transaction
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,139::persistentDict::173::Storage.PersistentDict::(transaction) 
Flushing changes
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,139::persistentDict::299::Storage.PersistentDict::(flush) about 
to write lines (FileMetadataRW)=['CLASS=Data', 
'DESCRIPTION=RaidVolBGluster', 'IOOPTIMEOUTSEC=10', 'LEASERETRIES=3', 
'LEASETIMESEC=60', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5', 
'MASTER_VERSION=1', 'POOL_DESCRIPTION=HP_Proliant_DL180G6', 
'POOL_DOMAINS=6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1:Active,036b5575-51fa-4f14-8b05-890d7807894c:Active,abc51e26-7175-4b38-b3a8-95c6928fbc2b:Active,23602741-2967-4fad-a749-d58e1459d5c8:Attached', 
'POOL_SPM_ID=-1', 'POOL_SPM_LVER=0', 
'POOL_UUID=b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
'REMOTE_PATH=127.0.0.1:/RaidVolB', 'ROLE=Master', 
'SDUUID=abc51e26-7175-4b38-b3a8-95c6928fbc2b', 'TYPE=GLUSTERFS', 
'VERSION=3', '_SHA_CKSUM=401970538872dfd4e17afcb84361851559b18e4c']
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,146::persistentDict::175::Storage.PersistentDict::(transaction) 
Finished transaction
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::INFO::2015-05-18 
14:35:35,146::clusterlock::279::SANLock::(release) Releasing cluster 
lock for domain abc51e26-7175-4b38-b3a8-95c6928fbc2b
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,151::clusterlock::289::SANLock::(release) Cluster lock for 
domain abc51e26-7175-4b38-b3a8-95c6928fbc2b successfully released
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::ERROR::2015-05-18 
14:35:35,151::task::866::TaskManager.Task::(_setError) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::Unexpected error
Traceback (most recent call last):
   File "/usr/share/vdsm/storage/task.py", line 873, in _run
     return fn(*args, **kargs)
   File "/usr/share/vdsm/storage/task.py", line 334, in run
     return self.cmd(*self.argslist, **self.argsdict)
   File "/usr/share/vdsm/storage/sp.py", line 296, in startSpm
     self._updateDomainsRole()
   File "/usr/share/vdsm/storage/securable.py", line 75, in wrapper
     return method(self, *args, **kwargs)
   File "/usr/share/vdsm/storage/sp.py", line 205, in _updateDomainsRole
     domain = sdCache.produce(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 98, in produce
     domain.getRealDomain()
   File "/usr/share/vdsm/storage/sdc.py", line 52, in getRealDomain
     return self._cache._realProduce(self._sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 122, in _realProduce
     domain = self._findDomain(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 141, in _findDomain
     dom = findMethod(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 171, in _findUnfetchedDomain
     raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,151::task::885::TaskManager.Task::(_run) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::Task._run: 
47d37321-3f0f-4a1b-a8d5-629065fc1c4c () {} failed - stopping task
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,152::task::1211::TaskManager.Task::(stop) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::stopping in state running 
(force False)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,152::task::990::TaskManager.Task::(_decref) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::ref 1 aborting True
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,152::task::916::TaskManager.Task::(_runJobs) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::aborting: Task is aborted: 
'Storage domain does not exist' - code 358
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,152::task::990::TaskManager.Task::(_decref) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::ref 0 aborting True
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,152::task::925::TaskManager.Task::(_doAbort) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::Task._doAbort: force False
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::task::595::TaskManager.Task::(_updateState) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::moving from state running 
-> state aborting
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::task::550::TaskManager.Task::(__state_aborting) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::_aborting: recover policy auto
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::task::595::TaskManager.Task::(_updateState) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::moving from state aborting 
-> state racquiring
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::task::595::TaskManager.Task::(_updateState) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::moving from state 
racquiring -> state recovering
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,153::task::798::TaskManager.Task::(_recover) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::_recover
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,154::task::805::TaskManager.Task::(_recover) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::running recovery None
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,154::task::786::TaskManager.Task::(_recoverDone) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::Recover Done: state recovering
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,154::task::595::TaskManager.Task::(_updateState) 
Task=`47d37321-3f0f-4a1b-a8d5-629065fc1c4c`::moving from state 
recovering -> state recovered
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,154::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources 
{'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d': < ResourceRef 
'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d', isValid: 'True' obj: 
'None'>}
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,154::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,155::resourceManager::616::ResourceManager::(releaseResource) 
Trying to release resource 'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d'
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,155::resourceManager::635::ResourceManager::(releaseResource) 
Released resource 'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d' (0 
active users)
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,155::resourceManager::641::ResourceManager::(releaseResource) 
Resource 'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d' is free, finding 
out if anyone is waiting for it.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,155::resourceManager::649::ResourceManager::(releaseResource) 
No one is waiting for resource 
'Storage.b384b3da-02a6-44f3-a3f6-56751ce8c26d', Clearing records.
47d37321-3f0f-4a1b-a8d5-629065fc1c4c::DEBUG::2015-05-18 
14:35:35,155::threadPool::57::Misc.ThreadPool::(setRunningTask) Number 
of running tasks: 0
Thread-40::DEBUG::2015-05-18 
14:35:35,284::fileSD::225::Storage.Misc.excCmd::(getReadDelay) '/bin/dd 
iflag=direct 
if=/rhev/data-center/mnt/glusterSD/127.0.0.1:_RaidVolB/abc51e26-7175-4b38-b3a8-95c6928fbc2b/dom_md/metadata 
bs=4096 count=1' (cwd None)
Thread-40::DEBUG::2015-05-18 
14:35:35,295::fileSD::225::Storage.Misc.excCmd::(getReadDelay) SUCCESS: 
<err> = '0+1 records in\n0+1 records out\n613 bytes (613 B) copied, 
0.000701751 s, 874 kB/s\n'; <rc> = 0
Thread-5036::DEBUG::2015-05-18 
14:35:35,501::task::595::TaskManager.Task::(_updateState) 
Task=`4ff82d7b-b4a7-48f0-ba5f-7381d2dcee67`::moving from state init -> 
state preparing
Thread-5036::INFO::2015-05-18 
14:35:35,501::logUtils::44::dispatcher::(wrapper) Run and protect: 
repoStats(options=None)
Thread-5036::INFO::2015-05-18 
14:35:35,502::logUtils::47::dispatcher::(wrapper) Run and protect: 
repoStats, Return response: {'6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1': 
{'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000757621', 
'lastCheck': '1.9', 'valid': True}, 
'036b5575-51fa-4f14-8b05-890d7807894c': {'code': 358, 'version': -1, 
'acquired': False, 'delay': '0', 'lastCheck': '9.1', 'valid': False}, 
'abc51e26-7175-4b38-b3a8-95c6928fbc2b': {'code': 0, 'version': 3, 
'acquired': True, 'delay': '0.000701751', 'lastCheck': '0.2', 'valid': 
True}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,502::task::1185::TaskManager.Task::(prepare) 
Task=`4ff82d7b-b4a7-48f0-ba5f-7381d2dcee67`::finished: 
{'6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1': {'code': 0, 'version': 0, 
'acquired': True, 'delay': '0.000757621', 'lastCheck': '1.9', 'valid': 
True}, '036b5575-51fa-4f14-8b05-890d7807894c': {'code': 358, 'version': 
-1, 'acquired': False, 'delay': '0', 'lastCheck': '9.1', 'valid': 
False}, 'abc51e26-7175-4b38-b3a8-95c6928fbc2b': {'code': 0, 'version': 
3, 'acquired': True, 'delay': '0.000701751', 'lastCheck': '0.2', 
'valid': True}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,502::task::595::TaskManager.Task::(_updateState) 
Task=`4ff82d7b-b4a7-48f0-ba5f-7381d2dcee67`::moving from state preparing 
-> state finished
Thread-5036::DEBUG::2015-05-18 
14:35:35,502::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,503::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,503::task::990::TaskManager.Task::(_decref) 
Task=`4ff82d7b-b4a7-48f0-ba5f-7381d2dcee67`::ref 0 aborting False
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,562::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,562::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,562::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,563::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,563::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:35,563::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-5036::DEBUG::2015-05-18 
14:35:35,938::BindingXMLRPC::251::vds::(wrapper) client [192.168.30.103] 
flowID [447665f7]
Thread-5036::DEBUG::2015-05-18 
14:35:35,939::task::595::TaskManager.Task::(_updateState) 
Task=`c7cd482f-ee37-4b8e-a879-16bd4f317e97`::moving from state init -> 
state preparing
Thread-5036::INFO::2015-05-18 
14:35:35,939::logUtils::44::dispatcher::(wrapper) Run and protect: 
getTaskStatus(taskID='47d37321-3f0f-4a1b-a8d5-629065fc1c4c', 
spUUID=None, options=None)
Thread-5036::DEBUG::2015-05-18 
14:35:35,939::taskManager::93::TaskManager::(getTaskStatus) Entry. 
taskID: 47d37321-3f0f-4a1b-a8d5-629065fc1c4c
Thread-5036::DEBUG::2015-05-18 
14:35:35,939::taskManager::96::TaskManager::(getTaskStatus) Return. 
Response: {'code': 358, 'message': 'Storage domain does not exist', 
'taskState': 'finished', 'taskResult': 'cleanSuccess', 'taskID': 
'47d37321-3f0f-4a1b-a8d5-629065fc1c4c'}
Thread-5036::INFO::2015-05-18 
14:35:35,939::logUtils::47::dispatcher::(wrapper) Run and protect: 
getTaskStatus, Return response: {'taskStatus': {'code': 358, 'message': 
'Storage domain does not exist', 'taskState': 'finished', 'taskResult': 
'cleanSuccess', 'taskID': '47d37321-3f0f-4a1b-a8d5-629065fc1c4c'}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,940::task::1185::TaskManager.Task::(prepare) 
Task=`c7cd482f-ee37-4b8e-a879-16bd4f317e97`::finished: {'taskStatus': 
{'code': 358, 'message': 'Storage domain does not exist', 'taskState': 
'finished', 'taskResult': 'cleanSuccess', 'taskID': 
'47d37321-3f0f-4a1b-a8d5-629065fc1c4c'}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,940::task::595::TaskManager.Task::(_updateState) 
Task=`c7cd482f-ee37-4b8e-a879-16bd4f317e97`::moving from state preparing 
-> state finished
Thread-5036::DEBUG::2015-05-18 
14:35:35,940::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,940::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,940::task::990::TaskManager.Task::(_decref) 
Task=`c7cd482f-ee37-4b8e-a879-16bd4f317e97`::ref 0 aborting False
Thread-5036::DEBUG::2015-05-18 
14:35:35,952::BindingXMLRPC::251::vds::(wrapper) client [192.168.30.103] 
flowID [447665f7]
Thread-5036::DEBUG::2015-05-18 
14:35:35,953::task::595::TaskManager.Task::(_updateState) 
Task=`e30dd05c-4520-4074-94ea-632be5ffa06d`::moving from state init -> 
state preparing
Thread-5036::INFO::2015-05-18 
14:35:35,953::logUtils::44::dispatcher::(wrapper) Run and protect: 
getSpmStatus(spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', options=None)
Thread-5036::INFO::2015-05-18 
14:35:35,957::logUtils::47::dispatcher::(wrapper) Run and protect: 
getSpmStatus, Return response: {'spm_st': {'spmId': -1, 'spmStatus': 
'Free', 'spmLver': -1}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,957::task::1185::TaskManager.Task::(prepare) 
Task=`e30dd05c-4520-4074-94ea-632be5ffa06d`::finished: {'spm_st': 
{'spmId': -1, 'spmStatus': 'Free', 'spmLver': -1}}
Thread-5036::DEBUG::2015-05-18 
14:35:35,958::task::595::TaskManager.Task::(_updateState) 
Task=`e30dd05c-4520-4074-94ea-632be5ffa06d`::moving from state preparing 
-> state finished
Thread-5036::DEBUG::2015-05-18 
14:35:35,958::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,958::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-5036::DEBUG::2015-05-18 
14:35:35,958::task::990::TaskManager.Task::(_decref) 
Task=`e30dd05c-4520-4074-94ea-632be5ffa06d`::ref 0 aborting False
Thread-5036::DEBUG::2015-05-18 
14:35:36,062::BindingXMLRPC::251::vds::(wrapper) client [192.168.30.103] 
flowID [447665f7]
Thread-5036::DEBUG::2015-05-18 
14:35:36,063::task::595::TaskManager.Task::(_updateState) 
Task=`a8cae188-5991-4513-a077-2f6d2bc7b3e6`::moving from state init -> 
state preparing
Thread-5036::INFO::2015-05-18 
14:35:36,063::logUtils::44::dispatcher::(wrapper) Run and protect: 
clearTask(taskID='47d37321-3f0f-4a1b-a8d5-629065fc1c4c', spUUID=None, 
options=None)
Thread-5036::DEBUG::2015-05-18 
14:35:36,063::taskManager::161::TaskManager::(clearTask) Entry. taskID: 
47d37321-3f0f-4a1b-a8d5-629065fc1c4c
Thread-5036::DEBUG::2015-05-18 
14:35:36,063::taskManager::166::TaskManager::(clearTask) Return.
Thread-5036::INFO::2015-05-18 
14:35:36,063::logUtils::47::dispatcher::(wrapper) Run and protect: 
clearTask, Return response: None
Thread-5036::DEBUG::2015-05-18 
14:35:36,063::task::1185::TaskManager.Task::(prepare) 
Task=`a8cae188-5991-4513-a077-2f6d2bc7b3e6`::finished: None
Thread-5036::DEBUG::2015-05-18 
14:35:36,064::task::595::TaskManager.Task::(_updateState) 
Task=`a8cae188-5991-4513-a077-2f6d2bc7b3e6`::moving from state preparing 
-> state finished
Thread-5036::DEBUG::2015-05-18 
14:35:36,064::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-5036::DEBUG::2015-05-18 
14:35:36,064::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-5036::DEBUG::2015-05-18 
14:35:36,064::task::990::TaskManager.Task::(_decref) 
Task=`a8cae188-5991-4513-a077-2f6d2bc7b3e6`::ref 0 aborting False
Thread-5036::DEBUG::2015-05-18 
14:35:36,300::BindingXMLRPC::251::vds::(wrapper) client [192.168.30.103] 
flowID [4f00b66b]
Thread-5036::DEBUG::2015-05-18 
14:35:36,301::task::595::TaskManager.Task::(_updateState) 
Task=`0b2423bc-dbf2-483e-85b9-fbf6f446b305`::moving from state init -> 
state preparing
Thread-5036::INFO::2015-05-18 
14:35:36,301::logUtils::44::dispatcher::(wrapper) Run and protect: 
getSpmStatus(spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', options=None)
Thread-5036::INFO::2015-05-18 
14:35:36,306::logUtils::47::dispatcher::(wrapper) Run and protect: 
getSpmStatus, Return response: {'spm_st': {'spmId': -1, 'spmStatus': 
'Free', 'spmLver': -1}}
Thread-5036::DEBUG::2015-05-18 
14:35:36,306::task::1185::TaskManager.Task::(prepare) 
Task=`0b2423bc-dbf2-483e-85b9-fbf6f446b305`::finished: {'spm_st': 
{'spmId': -1, 'spmStatus': 'Free', 'spmLver': -1}}
Thread-5036::DEBUG::2015-05-18 
14:35:36,307::task::595::TaskManager.Task::(_updateState) 
Task=`0b2423bc-dbf2-483e-85b9-fbf6f446b305`::moving from state preparing 
-> state finished
Thread-5036::DEBUG::2015-05-18 
14:35:36,307::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-5036::DEBUG::2015-05-18 
14:35:36,307::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-5036::DEBUG::2015-05-18 
14:35:36,307::task::990::TaskManager.Task::(_decref) 
Task=`0b2423bc-dbf2-483e-85b9-fbf6f446b305`::ref 0 aborting False
Thread-36::ERROR::2015-05-18 
14:35:36,419::sdc::137::Storage.StorageDomainCache::(_findDomain) 
looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
Thread-36::ERROR::2015-05-18 
14:35:36,419::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain) 
looking for domain 036b5575-51fa-4f14-8b05-890d7807894c
Thread-36::DEBUG::2015-05-18 
14:35:36,419::lvm::372::OperationMutex::(_reloadvgs) Operation 'lvm 
reload operation' got the operation mutex
Thread-36::DEBUG::2015-05-18 
14:35:36,420::lvm::295::Storage.Misc.excCmd::(cmd) '/usr/bin/sudo -n 
/sbin/lvm vgs --config " devices { preferred_names = 
[\\"^/dev/mapper/\\"] ignore_suspended_devices=1 write_cache_state=0 
disable_after_error_count=3 obtain_device_list_from_udev=0 filter = [ 
\'r|.*|\' ] }  global {  locking_type=1  prioritise_write_locks=1 
wait_for_locks=1 }  backup {  retain_min = 50  retain_days = 0 } " 
--noheadings --units b --nosuffix --separator | -o 
uuid,name,attr,size,free,extent_size,extent_count,free_count,tags,vg_mda_size,vg_mda_free,lv_count,pv_count,pv_name 
036b5575-51fa-4f14-8b05-890d7807894c' (cwd None)
Thread-36::DEBUG::2015-05-18 
14:35:36,443::lvm::295::Storage.Misc.excCmd::(cmd) FAILED: <err> = ' 
Volume group "036b5575-51fa-4f14-8b05-890d7807894c" not found\n 
Skipping volume group 036b5575-51fa-4f14-8b05-890d7807894c\n'; <rc> = 5
Thread-36::WARNING::2015-05-18 
14:35:36,444::lvm::377::Storage.LVM::(_reloadvgs) lvm vgs failed: 5 [] 
['  Volume group "036b5575-51fa-4f14-8b05-890d7807894c" not found', ' 
Skipping volume group 036b5575-51fa-4f14-8b05-890d7807894c']
Thread-36::DEBUG::2015-05-18 
14:35:36,444::lvm::414::OperationMutex::(_reloadvgs) Operation 'lvm 
reload operation' released the operation mutex
Thread-36::ERROR::2015-05-18 
14:35:36,460::sdc::143::Storage.StorageDomainCache::(_findDomain) domain 
036b5575-51fa-4f14-8b05-890d7807894c not found
Traceback (most recent call last):
   File "/usr/share/vdsm/storage/sdc.py", line 141, in _findDomain
     dom = findMethod(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 171, in _findUnfetchedDomain
     raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
Thread-36::ERROR::2015-05-18 
14:35:36,463::domainMonitor::239::Storage.DomainMonitorThread::(_monitorDomain) 
Error while collecting domain 036b5575-51fa-4f14-8b05-890d7807894c 
monitoring information
Traceback (most recent call last):
   File "/usr/share/vdsm/storage/domainMonitor.py", line 204, in 
_monitorDomain
     self.domain = sdCache.produce(self.sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 98, in produce
     domain.getRealDomain()
   File "/usr/share/vdsm/storage/sdc.py", line 52, in getRealDomain
     return self._cache._realProduce(self._sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 122, in _realProduce
     domain = self._findDomain(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 141, in _findDomain
     dom = findMethod(sdUUID)
   File "/usr/share/vdsm/storage/sdc.py", line 171, in _findUnfetchedDomain
     raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist: 
('036b5575-51fa-4f14-8b05-890d7807894c',)
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,564::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,564::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,565::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,565::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,565::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:36,565::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:37,566::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-27::DEBUG::2015-05-18 
14:35:38,253::task::595::TaskManager.Task::(_updateState) 
Task=`7674d743-bdef-497b-8dd3-98716e8c8f94`::moving from state init -> 
state preparing
Thread-27::INFO::2015-05-18 
14:35:38,253::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='9faa9678-d0c3-4886-bb7d-201032c60901', 
volUUID='3fd534be-fdd3-4f3a-96f1-e8d73f2660a7', options=None)
Thread-27::INFO::2015-05-18 
14:35:38,257::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '42949672960', 
'apparentsize': '42949672960'}
Thread-27::DEBUG::2015-05-18 
14:35:38,258::task::1185::TaskManager.Task::(prepare) 
Task=`7674d743-bdef-497b-8dd3-98716e8c8f94`::finished: {'truesize': 
'42949672960', 'apparentsize': '42949672960'}
Thread-27::DEBUG::2015-05-18 
14:35:38,258::task::595::TaskManager.Task::(_updateState) 
Task=`7674d743-bdef-497b-8dd3-98716e8c8f94`::moving from state preparing 
-> state finished
Thread-27::DEBUG::2015-05-18 
14:35:38,258::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-27::DEBUG::2015-05-18 
14:35:38,258::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-27::DEBUG::2015-05-18 
14:35:38,258::task::990::TaskManager.Task::(_decref) 
Task=`7674d743-bdef-497b-8dd3-98716e8c8f94`::ref 0 aborting False
Thread-5036::DEBUG::2015-05-18 
14:35:38,480::BindingXMLRPC::1067::vds::(wrapper) client 
[192.168.30.103]::call volumesList with () {} flowID [e98fff5]
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,567::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,568::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,568::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,568::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,568::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:38,569::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-5036::DEBUG::2015-05-18 
14:35:38,592::BindingXMLRPC::1074::vds::(wrapper) return volumesList 
with {'status': {'message': 'Done', 'code': 0}, 'volumes': {'RaidVolB': 
{'transportType': ['TCP'], 'uuid': 
'd46619e9-9368-4e82-bf3a-a2377b6e85e4', 'bricks': 
['ovirt-node01.stuttgart.example.net:/raidvol/volb', 
'ovirt-node02.stuttgart.example.net:/raidvol/volb'], 'volumeName': 
'RaidVolB', 'volumeType': 'REPLICATE', 'replicaCount': '2', 
'brickCount': '2', 'distCount': '2', 'volumeStatus': 'ONLINE', 
'stripeCount': '1', 'bricksInfo': [{'name': 
'ovirt-node01.stuttgart.example.net:/raidvol/volb', 'hostUuid': 
'de2a515f-c728-449d-b91c-d80cabe0539f'}, {'name': 
'ovirt-node02.stuttgart.example.net:/raidvol/volb', 'hostUuid': 
'7540f5c0-c4ba-4520-bdf1-3115c10d0eea'}], 'options': {'user.cifs': 
'disable', 'storage.owner-gid': '36', 'storage.owner-uid': '36', 
'nfs.disable': 'on', 'auth.allow': '*'}}}}
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,570::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,570::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,570::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,570::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,571::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:39,571::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,572::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,572::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,572::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,573::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,573::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:40,573::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-29::DEBUG::2015-05-18 
14:35:41,311::task::595::TaskManager.Task::(_updateState) 
Task=`23f5dbfe-bfbb-47ef-b456-827361104976`::moving from state init -> 
state preparing
Thread-29::INFO::2015-05-18 
14:35:41,312::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='09e9fba1-19e4-4b51-855f-c7d80c437690', 
volUUID='cc0219bc-f9ae-487f-bca2-a4aa79bea718', options=None)
Thread-28::DEBUG::2015-05-18 
14:35:41,315::task::595::TaskManager.Task::(_updateState) 
Task=`e9fac93e-e64a-487a-bea7-754e164b534f`::moving from state init -> 
state preparing
Thread-28::INFO::2015-05-18 
14:35:41,315::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='12f8ef8a-8bb8-4458-b1a0-eaf9137bfec6', 
volUUID='29aa0417-953e-4df7-aee1-46ac96966a62', options=None)
Thread-29::INFO::2015-05-18 
14:35:41,317::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '38558031872', 
'apparentsize': '64424509440'}
Thread-29::DEBUG::2015-05-18 
14:35:41,317::task::1185::TaskManager.Task::(prepare) 
Task=`23f5dbfe-bfbb-47ef-b456-827361104976`::finished: {'truesize': 
'38558031872', 'apparentsize': '64424509440'}
Thread-29::DEBUG::2015-05-18 
14:35:41,318::task::595::TaskManager.Task::(_updateState) 
Task=`23f5dbfe-bfbb-47ef-b456-827361104976`::moving from state preparing 
-> state finished
Thread-29::DEBUG::2015-05-18 
14:35:41,318::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-29::DEBUG::2015-05-18 
14:35:41,319::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-29::DEBUG::2015-05-18 
14:35:41,320::task::990::TaskManager.Task::(_decref) 
Task=`23f5dbfe-bfbb-47ef-b456-827361104976`::ref 0 aborting False
Thread-28::INFO::2015-05-18 
14:35:41,321::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '56300916736', 
'apparentsize': '85899345920'}
Thread-28::DEBUG::2015-05-18 
14:35:41,321::task::1185::TaskManager.Task::(prepare) 
Task=`e9fac93e-e64a-487a-bea7-754e164b534f`::finished: {'truesize': 
'56300916736', 'apparentsize': '85899345920'}
Thread-28::DEBUG::2015-05-18 
14:35:41,321::task::595::TaskManager.Task::(_updateState) 
Task=`e9fac93e-e64a-487a-bea7-754e164b534f`::moving from state preparing 
-> state finished
Thread-28::DEBUG::2015-05-18 
14:35:41,321::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-28::DEBUG::2015-05-18 
14:35:41,321::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-28::DEBUG::2015-05-18 
14:35:41,322::task::990::TaskManager.Task::(_decref) 
Task=`e9fac93e-e64a-487a-bea7-754e164b534f`::ref 0 aborting False
Thread-32::DEBUG::2015-05-18 
14:35:41,375::task::595::TaskManager.Task::(_updateState) 
Task=`51bc3b9a-a892-404d-9b4a-2bf55b5a0d15`::moving from state init -> 
state preparing
Thread-32::INFO::2015-05-18 
14:35:41,375::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='0963f2b7-d352-4d92-91bf-4bac9ec6b660', 
volUUID='6c7c2109-353a-4324-8cb0-8b719876b2dc', options=None)
Thread-32::INFO::2015-05-18 
14:35:41,378::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '5368709120', 
'apparentsize': '5368709120'}
Thread-32::DEBUG::2015-05-18 
14:35:41,378::task::1185::TaskManager.Task::(prepare) 
Task=`51bc3b9a-a892-404d-9b4a-2bf55b5a0d15`::finished: {'truesize': 
'5368709120', 'apparentsize': '5368709120'}
Thread-32::DEBUG::2015-05-18 
14:35:41,378::task::595::TaskManager.Task::(_updateState) 
Task=`51bc3b9a-a892-404d-9b4a-2bf55b5a0d15`::moving from state preparing 
-> state finished
Thread-32::DEBUG::2015-05-18 
14:35:41,378::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-32::DEBUG::2015-05-18 
14:35:41,379::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-32::DEBUG::2015-05-18 
14:35:41,379::task::990::TaskManager.Task::(_decref) 
Task=`51bc3b9a-a892-404d-9b4a-2bf55b5a0d15`::ref 0 aborting False
Thread-31::DEBUG::2015-05-18 
14:35:41,386::task::595::TaskManager.Task::(_updateState) 
Task=`66eaac56-77c7-4329-87a6-57a6d9763b6f`::moving from state init -> 
state preparing
Thread-31::INFO::2015-05-18 
14:35:41,387::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='c7b196de-b8a4-46c9-be72-193d765e3a1b', 
volUUID='286fb5c6-095c-4872-936b-343dbe4b8106', options=None)
Thread-31::INFO::2015-05-18 
14:35:41,390::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '3243724800', 
'apparentsize': '21474836480'}
Thread-31::DEBUG::2015-05-18 
14:35:41,390::task::1185::TaskManager.Task::(prepare) 
Task=`66eaac56-77c7-4329-87a6-57a6d9763b6f`::finished: {'truesize': 
'3243724800', 'apparentsize': '21474836480'}
Thread-31::DEBUG::2015-05-18 
14:35:41,390::task::595::TaskManager.Task::(_updateState) 
Task=`66eaac56-77c7-4329-87a6-57a6d9763b6f`::moving from state preparing 
-> state finished
Thread-31::DEBUG::2015-05-18 
14:35:41,390::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-31::DEBUG::2015-05-18 
14:35:41,391::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-31::DEBUG::2015-05-18 
14:35:41,391::task::990::TaskManager.Task::(_decref) 
Task=`66eaac56-77c7-4329-87a6-57a6d9763b6f`::ref 0 aborting False
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,573::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,574::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,574::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,574::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,574::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:41,575::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,576::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,576::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,576::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,576::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,577::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:42,577::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-35::DEBUG::2015-05-18 
14:35:43,570::fileSD::225::Storage.Misc.excCmd::(getReadDelay) '/bin/dd 
iflag=direct 
if=/rhev/data-center/mnt/ovirt.hq.example.net:_var_lib_exports_iso-20140618114713/6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1/dom_md/metadata 
bs=4096 count=1' (cwd None)
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,579::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 38.
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,580::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 72.
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,581::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 49.
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,583::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 50.
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,584::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 54.
VM Channels Listener::DEBUG::2015-05-18 
14:35:43,585::vmChannels::128::vds::(_handle_unconnected) Trying to 
connect fileno 62.
Thread-35::DEBUG::2015-05-18 
14:35:43,587::fileSD::225::Storage.Misc.excCmd::(getReadDelay) SUCCESS: 
<err> = '0+1 records in\n0+1 records out\n560 bytes (560 B) copied, 
0.000340492 s, 1.6 MB/s\n'; <rc> = 0
Thread-30::DEBUG::2015-05-18 
14:35:43,695::task::595::TaskManager.Task::(_updateState) 
Task=`df8a42f5-b6d4-4059-8586-ad3d2f60f040`::moving from state init -> 
state preparing
Thread-30::INFO::2015-05-18 
14:35:43,696::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='b68d3a83-d422-4882-a3bf-14809d7ef3a1', 
volUUID='0a17d461-4fd6-483c-91a9-86796d4aa6bf', options=None)
Thread-30::INFO::2015-05-18 
14:35:43,701::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '20731297792', 
'apparentsize': '21474836480'}
Thread-30::DEBUG::2015-05-18 
14:35:43,701::task::1185::TaskManager.Task::(prepare) 
Task=`df8a42f5-b6d4-4059-8586-ad3d2f60f040`::finished: {'truesize': 
'20731297792', 'apparentsize': '21474836480'}
Thread-30::DEBUG::2015-05-18 
14:35:43,701::task::595::TaskManager.Task::(_updateState) 
Task=`df8a42f5-b6d4-4059-8586-ad3d2f60f040`::moving from state preparing 
-> state finished
Thread-30::DEBUG::2015-05-18 
14:35:43,701::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-30::DEBUG::2015-05-18 
14:35:43,701::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-30::DEBUG::2015-05-18 
14:35:43,702::task::990::TaskManager.Task::(_decref) 
Task=`df8a42f5-b6d4-4059-8586-ad3d2f60f040`::ref 0 aborting False
Thread-30::DEBUG::2015-05-18 
14:35:43,702::task::595::TaskManager.Task::(_updateState) 
Task=`7f25c7ef-49c2-4dbf-b785-902efecf0cc3`::moving from state init -> 
state preparing
Thread-30::INFO::2015-05-18 
14:35:43,702::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='174aa201-cfea-4ad4-8b54-f04205b0830c', 
volUUID='9a39b0ff-e3bc-42ae-bedb-0c809e60665d', options=None)
Thread-30::INFO::2015-05-18 
14:35:43,705::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '18952265728', 
'apparentsize': '21474836480'}
Thread-30::DEBUG::2015-05-18 
14:35:43,705::task::1185::TaskManager.Task::(prepare) 
Task=`7f25c7ef-49c2-4dbf-b785-902efecf0cc3`::finished: {'truesize': 
'18952265728', 'apparentsize': '21474836480'}
Thread-30::DEBUG::2015-05-18 
14:35:43,705::task::595::TaskManager.Task::(_updateState) 
Task=`7f25c7ef-49c2-4dbf-b785-902efecf0cc3`::moving from state preparing 
-> state finished
Thread-30::DEBUG::2015-05-18 
14:35:43,706::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-30::DEBUG::2015-05-18 
14:35:43,706::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-30::DEBUG::2015-05-18 
14:35:43,706::task::990::TaskManager.Task::(_decref) 
Task=`7f25c7ef-49c2-4dbf-b785-902efecf0cc3`::ref 0 aborting False
Thread-30::DEBUG::2015-05-18 
14:35:43,706::task::595::TaskManager.Task::(_updateState) 
Task=`87a5d5d2-ad55-4dd4-9a53-584831e91fe1`::moving from state init -> 
state preparing
Thread-30::INFO::2015-05-18 
14:35:43,706::logUtils::44::dispatcher::(wrapper) Run and protect: 
getVolumeSize(sdUUID='abc51e26-7175-4b38-b3a8-95c6928fbc2b', 
spUUID='b384b3da-02a6-44f3-a3f6-56751ce8c26d', 
imgUUID='423535cb-3a9c-41b7-a1a0-19da94b72e21', 
volUUID='747dfb7c-f2e6-4cdc-a76c-2c37c243ab4d', options=None)
Thread-30::INFO::2015-05-18 
14:35:43,709::logUtils::47::dispatcher::(wrapper) Run and protect: 
getVolumeSize, Return response: {'truesize': '25580142592', 
'apparentsize': '26843545600'}
Thread-30::DEBUG::2015-05-18 
14:35:43,709::task::1185::TaskManager.Task::(prepare) 
Task=`87a5d5d2-ad55-4dd4-9a53-584831e91fe1`::finished: {'truesize': 
'25580142592', 'apparentsize': '26843545600'}
Thread-30::DEBUG::2015-05-18 
14:35:43,709::task::595::TaskManager.Task::(_updateState) 
Task=`87a5d5d2-ad55-4dd4-9a53-584831e91fe1`::moving from state preparing 
-> state finished
Thread-30::DEBUG::2015-05-18 
14:35:43,710::resourceManager::940::ResourceManager.Owner::(releaseAll) 
Owner.releaseAll requests {} resources {}
Thread-30::DEBUG::2015-05-18 
14:35:43,710::resourceManager::977::ResourceManager.Owner::(cancelAll) 
Owner.cancelAll requests {}
Thread-30::DEBUG::2015-05-18 
14:35:43,710::task::990::TaskManager.Task::(_decref) 
Task=`87a5d5d2-ad55-4dd4-9a53-584831e91fe1`::ref 0 aborting False


engine.log:
================
2015-05-18 14:35:48,069 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-46) HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:35:48,070 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-46) Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:35:48,172 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-46) hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:35:48,176 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-46) starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:35:48,209 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) START, SpmStartVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: e0b2b6e
2015-05-18 14:35:48,220 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) spmStart polling started: taskId = 
5a51fe50-7803-4518-a4d8-f2a369c2490a
2015-05-18 14:35:49,227 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-46) Failed in HSMGetTaskStatusVDS method
2015-05-18 14:35:49,228 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) spmStart polling ended: taskId = 
5a51fe50-7803-4518-a4d8-f2a369c2490a task status = finished
2015-05-18 14:35:49,229 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) Start SPM Task failed - result: 
cleanSuccess, message: VDSGenericException: VDSErrorException: Failed to 
HSMGetTaskStatusVDS, error = Storage domain does not exist, code = 358
2015-05-18 14:35:49,281 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) spmStart polling ended, spm status: Free
2015-05-18 14:35:49,319 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-46) START, 
HSMClearTaskVDSCommand(HostName = ovirt-node01.stuttgart.example.net, 
HostId = 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=5a51fe50-7803-4518-a4d8-f2a369c2490a), log id: 621a07b5
2015-05-18 14:35:49,327 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-46) FINISH, HSMClearTaskVDSCommand, log 
id: 621a07b5
2015-05-18 14:35:49,328 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) FINISH, SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@3addfc90, 
log id: e0b2b6e
2015-05-18 14:35:49,343 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:35:49,395 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:35:49,470 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-46) [43f07021] Irs placed on server 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c failed. Proceed Failover
2015-05-18 14:35:49,546 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-46) [43f07021] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:35:49,549 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-46) [43f07021] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:35:49,581 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 63b55478
2015-05-18 14:35:49,591 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] spmStart polling started: 
taskId = 85b96d7e-429c-45a5-acc3-2359d1ad4c53
2015-05-18 14:35:50,155 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 5203a336
2015-05-18 14:35:50,345 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@a584232}, 
log id: 5203a336
2015-05-18 14:35:50,513 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 6316c75
2015-05-18 14:35:50,598 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@383d74c4}, 
log id: 6316c75
2015-05-18 14:35:50,599 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:35:50,600 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] spmStart polling ended: 
taskId = 85b96d7e-429c-45a5-acc3-2359d1ad4c53 task status = finished
2015-05-18 14:35:50,602 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:35:50,617 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 2d300f8a
2015-05-18 14:35:50,653 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] spmStart polling ended, 
spm status: Free
2015-05-18 14:35:50,658 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-43) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@bf0d67a0}, 
log id: 2d300f8a
2015-05-18 14:35:50,685 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=85b96d7e-429c-45a5-acc3-2359d1ad4c53), log id: 541fc1
2015-05-18 14:35:50,731 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] FINISH, 
HSMClearTaskVDSCommand, log id: 541fc1
2015-05-18 14:35:50,732 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-46) [43f07021] FINISH, 
SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@605bfeda, 
log id: 63b55478
2015-05-18 14:35:50,736 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-46) [37b8cba2] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:35:50,775 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-46) [37b8cba2] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
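The engine log above shows a repeating cycle: select a host with `spmStatus Free`, issue `SpmStartVDSCommand`, watch the task fail with "Storage domain does not exist" (the stale `036b5575-...` domain from the vdsm log), clear the task, and fail over to the next candidate. With both nodes referencing the same stale domain, every candidate fails and the pool never gets an SPM. A minimal sketch of that election-with-failover loop (names are illustrative, not engine API):

```python
def elect_spm(hosts, try_start):
    """Try each candidate host in turn until one successfully starts SPM.
    try_start(host) returns True on success, False on a cleanSuccess failure."""
    for host in hosts:
        if try_start(host):
            return host
    return None  # every candidate failed; the pool stays non-operational

# Both nodes fail while the stale storage domain is still referenced,
# mirroring the SpmStart failures in the log above.
hosts = ["ovirt-node01", "ovirt-node02"]
print(elect_spm(hosts, lambda host: False))  # None
```

This is why the same SpmStart/HSMClearTask sequence keeps repeating in the log: the engine retries the whole election on every scheduler tick until the stale domain reference is cleared.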
2015-05-18 14:35:55,931 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 758f88b3
2015-05-18 14:35:56,130 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@ef0011fe}, 
log id: 758f88b3
2015-05-18 14:35:56,297 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 2d1069c1
2015-05-18 14:35:56,381 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@971a4c5a}, 
log id: 2d1069c1
2015-05-18 14:35:56,397 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 2adb5e6b
2015-05-18 14:35:56,437 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-75) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@319748ff}, 
log id: 2adb5e6b
2015-05-18 14:36:01,017 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-3) Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:36:01,077 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-3) HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:36:01,079 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-3) Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:36:01,234 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-3) hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:01,239 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-3) starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:01,271 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) START, SpmStartVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 73b21fc3
2015-05-18 14:36:01,287 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) spmStart polling started: taskId = 
cbf7b7bd-2beb-4479-a1f1-a65bcff6ca81
2015-05-18 14:36:01,714 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 6973c795
2015-05-18 14:36:01,917 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@93671002}, 
log id: 6973c795
2015-05-18 14:36:02,160 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 673a67af
2015-05-18 14:36:02,246 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@7cc34691}, 
log id: 673a67af
2015-05-18 14:36:02,260 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 4839bf55
2015-05-18 14:36:02,295 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-3) Failed in HSMGetTaskStatusVDS method
2015-05-18 14:36:02,296 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) spmStart polling ended: taskId = 
cbf7b7bd-2beb-4479-a1f1-a65bcff6ca81 task status = finished
2015-05-18 14:36:02,297 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) Start SPM Task failed - result: 
cleanSuccess, message: VDSGenericException: VDSErrorException: Failed to 
HSMGetTaskStatusVDS, error = Storage domain does not exist, code = 358
2015-05-18 14:36:02,303 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@d92bb04f}, 
log id: 4839bf55
2015-05-18 14:36:02,348 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) spmStart polling ended, spm status: Free
2015-05-18 14:36:02,380 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-3) START, HSMClearTaskVDSCommand(HostName 
= ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=cbf7b7bd-2beb-4479-a1f1-a65bcff6ca81), log id: c94020
2015-05-18 14:36:02,428 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-3) FINISH, HSMClearTaskVDSCommand, log 
id: c94020
2015-05-18 14:36:02,429 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) FINISH, SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@270252f6, 
log id: 73b21fc3
2015-05-18 14:36:02,436 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:02,506 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:02,659 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-3) [994a87c] Irs placed on server 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c failed. Proceed Failover
2015-05-18 14:36:02,789 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-3) [994a87c] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:02,795 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-3) [994a87c] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:02,864 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 51eb1e47
2015-05-18 14:36:02,879 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] spmStart polling started: 
taskId = a23e67b5-28e4-4572-970e-2cb1c8c589be
2015-05-18 14:36:03,888 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:03,890 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] spmStart polling ended: 
taskId = a23e67b5-28e4-4572-970e-2cb1c8c589be task status = finished
2015-05-18 14:36:03,892 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:03,945 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] spmStart polling ended, spm 
status: Free
2015-05-18 14:36:04,003 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=a23e67b5-28e4-4572-970e-2cb1c8c589be), log id: 79674a9b
2015-05-18 14:36:04,012 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] FINISH, 
HSMClearTaskVDSCommand, log id: 79674a9b
2015-05-18 14:36:04,014 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-3) [994a87c] FINISH, SpmStartVDSCommand, 
return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@57b706d5, 
log id: 51eb1e47
2015-05-18 14:36:04,030 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-3) [3121b6f0] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:04,078 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-3) [3121b6f0] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:04,777 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 68a5fa4c
2015-05-18 14:36:04,899 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: 68a5fa4c
2015-05-18 14:36:05,046 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node03.stuttgart.example.net, HostId = 
58c3a03d-d289-45ad-a5e5-f3d37d66cca7), log id: 3dde7be5
2015-05-18 14:36:05,074 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: 3dde7be5
2015-05-18 14:36:05,078 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node06-stgt.stuttgart.example.net, HostId = 
e15b95d4-96a5-40f7-97d0-9ec2fccd39f0), log id: 68d3acc2
2015-05-18 14:36:05,157 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-35) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: 68d3acc2
2015-05-18 14:36:07,647 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 7192b532
2015-05-18 14:36:07,830 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@d1d19f6}, 
log id: 7192b532
2015-05-18 14:36:08,002 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 40023b20
2015-05-18 14:36:08,090 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@81f414aa}, 
log id: 40023b20
2015-05-18 14:36:08,106 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 3901fba
2015-05-18 14:36:08,144 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-38) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@8043d5d4}, 
log id: 3901fba
2015-05-18 14:36:13,378 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 1c08397d
2015-05-18 14:36:13,610 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@3f192277}, 
log id: 1c08397d
2015-05-18 14:36:13,892 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: e80f14a
2015-05-18 14:36:13,975 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@b4518810}, 
log id: e80f14a
2015-05-18 14:36:13,995 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 3f6f3ea0
2015-05-18 14:36:14,392 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-86) Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:36:14,425 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-86) HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:36:14,427 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-86) Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:36:14,429 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-74) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@7da28ce6}, 
log id: 3f6f3ea0
2015-05-18 14:36:14,619 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-86) hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:14,622 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-86) starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:14,655 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) START, SpmStartVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 26ddbe34
2015-05-18 14:36:14,669 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) spmStart polling started: taskId = 
f23ede06-d1ff-4398-aa8d-fec60a9b169a
2015-05-18 14:36:15,678 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-86) Failed in HSMGetTaskStatusVDS method
2015-05-18 14:36:15,680 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) spmStart polling ended: taskId = 
f23ede06-d1ff-4398-aa8d-fec60a9b169a task status = finished
2015-05-18 14:36:15,682 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) Start SPM Task failed - result: 
cleanSuccess, message: VDSGenericException: VDSErrorException: Failed to 
HSMGetTaskStatusVDS, error = Storage domain does not exist, code = 358
2015-05-18 14:36:15,737 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) spmStart polling ended, spm status: Free
2015-05-18 14:36:15,796 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-86) START, 
HSMClearTaskVDSCommand(HostName = ovirt-node01.stuttgart.example.net, 
HostId = 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=f23ede06-d1ff-4398-aa8d-fec60a9b169a), log id: 1120709c
2015-05-18 14:36:15,810 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-86) FINISH, HSMClearTaskVDSCommand, log 
id: 1120709c
2015-05-18 14:36:15,811 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) FINISH, SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@52c0b61e, 
log id: 26ddbe34
2015-05-18 14:36:15,818 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:15,858 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:15,934 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] Irs placed on server 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c failed. Proceed Failover
2015-05-18 14:36:16,019 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:16,023 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:16,055 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 3a33371d
2015-05-18 14:36:16,065 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] spmStart polling started: 
taskId = c730e46a-57b3-4996-9916-a428c9a9ac3b
2015-05-18 14:36:17,076 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:17,078 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] spmStart polling ended: 
taskId = c730e46a-57b3-4996-9916-a428c9a9ac3b task status = finished
2015-05-18 14:36:17,080 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:17,169 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] spmStart polling ended, 
spm status: Free
2015-05-18 14:36:17,236 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=c730e46a-57b3-4996-9916-a428c9a9ac3b), log id: 77422494
2015-05-18 14:36:17,245 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] FINISH, 
HSMClearTaskVDSCommand, log id: 77422494
2015-05-18 14:36:17,246 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-86) [169f4d5c] FINISH, 
SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@6db1c38e, 
log id: 3a33371d
2015-05-18 14:36:17,260 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-86) [319dd7e5] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:17,366 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-86) [319dd7e5] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:19,702 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 39b26495
2015-05-18 14:36:19,941 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@9c2a93b7}, 
log id: 39b26495
2015-05-18 14:36:20,111 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 67d54c4a
2015-05-18 14:36:20,194 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@8a6b9152}, 
log id: 67d54c4a
2015-05-18 14:36:20,217 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 2e5e3f07
2015-05-18 14:36:20,255 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-14) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@409317cd}, 
log id: 2e5e3f07
2015-05-18 14:36:25,485 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 16543222
2015-05-18 14:36:25,670 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GlusterVolumesListVDSCommand, return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@fd879447}, 
log id: 16543222
2015-05-18 14:36:25,836 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 7ac850fa
2015-05-18 14:36:25,919 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GlusterVolumesListVDSCommand, return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@f3e7b162}, 
log id: 7ac850fa
2015-05-18 14:36:25,933 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 1857ade7
2015-05-18 14:36:25,970 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GlusterVolumesListVDSCommand, return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@11bc654c}, 
log id: 1857ade7
2015-05-18 14:36:27,612 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:36:27,684 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:36:27,686 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:36:27,826 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:27,832 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:27,867 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 40dee17b
2015-05-18 14:36:27,880 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] spmStart polling started: 
taskId = a80f5b5f-6507-44f7-ae03-798fb2bd7d28
2015-05-18 14:36:28,888 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:28,890 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] spmStart polling ended: 
taskId = a80f5b5f-6507-44f7-ae03-798fb2bd7d28 task status = finished
2015-05-18 14:36:28,893 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:28,945 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] spmStart polling ended, 
spm status: Free
2015-05-18 14:36:29,003 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=a80f5b5f-6507-44f7-ae03-798fb2bd7d28), log id: 65247d54
2015-05-18 14:36:29,012 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] FINISH, 
HSMClearTaskVDSCommand, log id: 65247d54
2015-05-18 14:36:29,015 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [3f3a55ac] FINISH, 
SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@20a417c8, 
log id: 40dee17b
2015-05-18 14:36:29,022 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:29,108 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:29,241 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-24) [75155c35] Irs placed on server 
6948da12-0b8a-4b6d-a9af-162e6c25dad3 failed. Proceed Failover
2015-05-18 14:36:29,347 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-24) [75155c35] hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:29,355 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-24) [75155c35] starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:29,388 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] START, 
SpmStartVDSCommand(HostName = ovirt-node01.stuttgart.example.net, HostId 
= 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 1ddc1476
2015-05-18 14:36:29,401 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] spmStart polling started: 
taskId = e30a8353-fc32-4300-adba-f51a876246b4
2015-05-18 14:36:30,410 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:30,412 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] spmStart polling ended: 
taskId = e30a8353-fc32-4300-adba-f51a876246b4 task status = finished
2015-05-18 14:36:30,414 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:30,467 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] spmStart polling ended, 
spm status: Free
2015-05-18 14:36:30,528 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node01.stuttgart.example.net, 
HostId = 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=e30a8353-fc32-4300-adba-f51a876246b4), log id: 7dd12ea9
2015-05-18 14:36:30,538 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] FINISH, 
HSMClearTaskVDSCommand, log id: 7dd12ea9
2015-05-18 14:36:30,540 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-24) [75155c35] FINISH, 
SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@6d54458d, 
log id: 1ddc1476
2015-05-18 14:36:30,547 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-24) [37a2640e] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:30,617 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-24) [37a2640e] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:31,176 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 7845333d
2015-05-18 14:36:31,413 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@621cee87}, 
log id: 7845333d
2015-05-18 14:36:31,667 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 54f77549
2015-05-18 14:36:31,751 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@c710ae5}, 
log id: 54f77549
2015-05-18 14:36:31,766 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 4c70d07
2015-05-18 14:36:31,799 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-71) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@2136a7e7}, 
log id: 4c70d07
2015-05-18 14:36:37,034 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 613a7dbb
2015-05-18 14:36:37,261 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@65925efa}, 
log id: 613a7dbb
2015-05-18 14:36:37,512 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: b285af5
2015-05-18 14:36:37,594 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@4ed42f1a}, 
log id: b285af5
2015-05-18 14:36:37,609 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 40264885
2015-05-18 14:36:37,643 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-1) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@baf0c0d5}, 
log id: 40264885
2015-05-18 14:36:40,911 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:36:40,972 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] HostName = 
ovirt-node01.stuttgart.example.net
2015-05-18 14:36:40,974 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:36:41,091 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-9) [59e67616] hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:41,095 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-9) [59e67616] starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:41,128 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] START, 
SpmStartVDSCommand(HostName = ovirt-node01.stuttgart.example.net, HostId 
= 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 225ad910
2015-05-18 14:36:41,139 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] spmStart polling started: 
taskId = 64e3d616-9a75-411f-a4d5-f7bd770a4cb8
2015-05-18 14:36:42,148 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:42,150 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] spmStart polling ended: 
taskId = 64e3d616-9a75-411f-a4d5-f7bd770a4cb8 task status = finished
2015-05-18 14:36:42,152 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:42,204 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] spmStart polling ended, spm 
status: Free
2015-05-18 14:36:42,263 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node01.stuttgart.example.net, 
HostId = 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=64e3d616-9a75-411f-a4d5-f7bd770a4cb8), log id: 6918f975
2015-05-18 14:36:42,272 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] FINISH, 
HSMClearTaskVDSCommand, log id: 6918f975
2015-05-18 14:36:42,275 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [59e67616] FINISH, SpmStartVDSCommand, 
return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@29ac1cac, 
log id: 225ad910
2015-05-18 14:36:42,286 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:42,383 WARN 
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] 
(DefaultQuartzScheduler_Worker-9) [76c93871] Correlation ID: 76c93871, 
Call Stack: null, Custom Event ID: -1, Message: Invalid status on Data 
Center HP_Proliant_DL180G6. Setting status to Non Responsive.
2015-05-18 14:36:42,390 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
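That is one complete cycle: the host answers 654/Not SPM, the engine issues SpmStart, and the task dies with code 358 because vdsm still references a storage domain the engine no longer knows. To confirm that every attempt trips over the same UUID, the vdsm log can be reduced with a quick grep. A minimal sketch, fed here from a two-line sample so it runs standalone (on a node you would read /var/log/vdsm/vdsm.log instead):

```shell
# Two representative vdsm.log lines from the failing node (sample data only;
# on the node itself, feed /var/log/vdsm/vdsm.log into the pipeline instead).
sample='ERROR::sdc::137::Storage.StorageDomainCache::(_findDomain) looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
ERROR::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain) looking for domain 036b5575-51fa-4f14-8b05-890d7807894c'

# Pull out every UUID vdsm is hunting for; a single distinct value means
# every SpmStart attempt is tripping over the same missing domain.
printf '%s\n' "$sample" | grep -o '[0-9a-f-]\{36\}' | sort -u
```

On this sample it prints only 036b5575-51fa-4f14-8b05-890d7807894c, the removed export2 domain from the vdsm log at the top of this mail.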
2015-05-18 14:36:42,521 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-9) [76c93871] Irs placed on server 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c failed. Proceed Failover
2015-05-18 14:36:42,602 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-9) [76c93871] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:42,612 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-9) [76c93871] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:42,657 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 39ed2e0a
2015-05-18 14:36:42,676 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] spmStart polling started: 
taskId = caa54bef-a0c9-40e7-9a1c-ead9b7b10dd3
2015-05-18 14:36:42,839 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 5a5cc3b2
2015-05-18 14:36:43,075 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@b94fa000}, 
log id: 5a5cc3b2
2015-05-18 14:36:43,341 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 74a38188
2015-05-18 14:36:43,429 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@b7eeb15b}, 
log id: 74a38188
2015-05-18 14:36:43,444 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 71a18ff2
2015-05-18 14:36:43,486 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-31) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@694b34c3}, 
log id: 71a18ff2
2015-05-18 14:36:43,685 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:43,687 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] spmStart polling ended: 
taskId = caa54bef-a0c9-40e7-9a1c-ead9b7b10dd3 task status = finished
2015-05-18 14:36:43,689 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:43,744 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] spmStart polling ended, spm 
status: Free
2015-05-18 14:36:43,801 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=caa54bef-a0c9-40e7-9a1c-ead9b7b10dd3), log id: 137e2e59
2015-05-18 14:36:43,811 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] FINISH, 
HSMClearTaskVDSCommand, log id: 137e2e59
2015-05-18 14:36:43,812 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-9) [76c93871] FINISH, SpmStartVDSCommand, 
return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@14bd85e2, 
log id: 39ed2e0a
2015-05-18 14:36:43,819 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-9) [26a0c311] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:43,889 WARN 
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] 
(DefaultQuartzScheduler_Worker-9) [26a0c311] Correlation ID: 26a0c311, 
Call Stack: null, Custom Event ID: -1, Message: Invalid status on Data 
Center HP_Proliant_DL180G6. Setting status to Non Responsive.
2015-05-18 14:36:43,895 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-9) [26a0c311] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
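At this point both candidate SPM hosts (node01 and node02) have failed the same way, which fits the stale-cache suspicion from the subject line: the engine keeps electing an SPM, and each node's vdsm aborts because it still references the removed domain. A way to reason about it is to compare the UUID from the failure loop with the domains the node actually reports. The sketch below uses placeholder sample lists so it runs standalone; on a real 2015-era node the second list would come from a vdsm query such as `vdsClient -s 0 getStorageDomainsList` (verify that verb against your vdsm version):

```shell
# UUID the engine/vdsm keep failing on (from the code-358 loop above).
wanted='036b5575-51fa-4f14-8b05-890d7807894c'

# Placeholder stand-ins for the node's real domain list; on the node this
# would be the output of a vdsm query, not a hard-coded variable.
seen='6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1
b384b3da-02a6-44f3-a3f6-56751ce8c26d'

if printf '%s\n' "$seen" | grep -qx "$wanted"; then
  echo "domain known to vdsm - SpmStart failure lies elsewhere"
else
  echo "domain missing on node - stale reference/cache is plausible"
fi
```

If the domain really is gone from the node, restarting vdsm on that host (e.g. `service vdsmd restart` on an EL6 node; adjust for your platform) is a common way to drop its StorageDomainCache and force a fresh scan, though whether that is safe mid-failover is something I would want a developer to confirm.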
2015-05-18 14:36:48,689 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 5cf2af7b
2015-05-18 14:36:48,958 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@7b61fce}, 
log id: 5cf2af7b
2015-05-18 14:36:49,136 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 58b3a1f1
2015-05-18 14:36:49,241 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@a2b43ca1}, 
log id: 58b3a1f1
2015-05-18 14:36:49,255 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 231ed250
2015-05-18 14:36:49,293 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-62) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@878a1db8}, 
log id: 231ed250
2015-05-18 14:36:54,186 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-84) Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:36:54,247 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-84) HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:36:54,249 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-84) Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:36:54,411 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-84) hostFromVds::selectedVds - 
ovirt-node01.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:54,415 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-84) starting spm on vds 
ovirt-node01.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:54,448 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) START, SpmStartVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 6b7e937c
2015-05-18 14:36:54,458 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) spmStart polling started: taskId = 
1df26d40-a863-4be2-8e4d-cc63e144b0e1
2015-05-18 14:36:54,500 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 21a76860
2015-05-18 14:36:54,732 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@236c3904}, 
log id: 21a76860
2015-05-18 14:36:54,898 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 697b317e
2015-05-18 14:36:54,981 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@17d1446c}, 
log id: 697b317e
2015-05-18 14:36:54,995 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 7d8aa4d
2015-05-18 14:36:55,030 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-81) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@7c985d32}, 
log id: 7d8aa4d
2015-05-18 14:36:55,467 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-84) Failed in HSMGetTaskStatusVDS method
2015-05-18 14:36:55,469 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) spmStart polling ended: taskId = 
1df26d40-a863-4be2-8e4d-cc63e144b0e1 task status = finished
2015-05-18 14:36:55,471 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) Start SPM Task failed - result: 
cleanSuccess, message: VDSGenericException: VDSErrorException: Failed to 
HSMGetTaskStatusVDS, error = Storage domain does not exist, code = 358
2015-05-18 14:36:55,524 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) spmStart polling ended, spm status: Free
2015-05-18 14:36:55,582 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-84) START, 
HSMClearTaskVDSCommand(HostName = ovirt-node01.stuttgart.example.net, 
HostId = 2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c, 
taskId=1df26d40-a863-4be2-8e4d-cc63e144b0e1), log id: 1e686f88
2015-05-18 14:36:55,593 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-84) FINISH, HSMClearTaskVDSCommand, log 
id: 1e686f88
2015-05-18 14:36:55,595 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) FINISH, SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@60d70516, 
log id: 6b7e937c
2015-05-18 14:36:55,605 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:55,686 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:36:55,814 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-84) [448f1905] Irs placed on server 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c failed. Proceed Failover
2015-05-18 14:36:55,892 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-84) [448f1905] hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:36:55,898 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-84) [448f1905] starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:36:55,958 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] START, 
SpmStartVDSCommand(HostName = ovirt-node02.stuttgart.example.net, HostId 
= 6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 3bcd36d4
2015-05-18 14:36:55,969 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] spmStart polling started: 
taskId = 8ebae0ff-b0cd-4e78-ac98-1ffa2de94c2c
2015-05-18 14:36:56,978 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetTaskStatusVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] Failed in 
HSMGetTaskStatusVDS method
2015-05-18 14:36:56,980 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] spmStart polling ended: 
taskId = 8ebae0ff-b0cd-4e78-ac98-1ffa2de94c2c task status = finished
2015-05-18 14:36:56,982 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] Start SPM Task failed - 
result: cleanSuccess, message: VDSGenericException: VDSErrorException: 
Failed to HSMGetTaskStatusVDS, error = Storage domain does not exist, 
code = 358
2015-05-18 14:36:57,034 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] spmStart polling ended, 
spm status: Free
2015-05-18 14:36:57,092 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] START, 
HSMClearTaskVDSCommand(HostName = ovirt-node02.stuttgart.example.net, 
HostId = 6948da12-0b8a-4b6d-a9af-162e6c25dad3, 
taskId=8ebae0ff-b0cd-4e78-ac98-1ffa2de94c2c), log id: aa83196
2015-05-18 14:36:57,103 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] FINISH, 
HSMClearTaskVDSCommand, log id: aa83196
2015-05-18 14:36:57,104 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-84) [448f1905] FINISH, 
SpmStartVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.SpmStatusResult@31dc73ed, 
log id: 3bcd36d4
2015-05-18 14:36:57,111 INFO 
[org.ovirt.engine.core.bll.storage.SetStoragePoolStatusCommand] 
(DefaultQuartzScheduler_Worker-84) [3ee62e] Running command: 
SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 
b384b3da-02a6-44f3-a3f6-56751ce8c26d Type: StoragePool
2015-05-18 14:36:57,180 ERROR 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] 
(DefaultQuartzScheduler_Worker-84) [3ee62e] 
IrsBroker::Failed::GetStoragePoolInfoVDS due to: 
IrsSpmStartFailedException: IRSGenericException: IRSErrorException: 
SpmStart failed
2015-05-18 14:37:00,260 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 5f3899c0
2015-05-18 14:37:00,453 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetFileStatsVDSCommand] 
(org.ovirt.thread.pool-8-thread-46) START, GetFileStatsVDSCommand( 
storagePoolId = cf165524-0971-434c-8a5e-f2d5cb877506, 
ignoreFailoverLimit = false), log id: 672df19
2015-05-18 14:37:00,478 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@c4308c6f}, 
log id: 5f3899c0
2015-05-18 14:37:00,544 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetFileStatsVDSCommand] 
(org.ovirt.thread.pool-8-thread-46) FINISH, GetFileStatsVDSCommand, 
return: {KNOPPIX_V5.0.1CD-2006-06-01-DE.iso={status=0, 
ctime=1424080000.0, size=730036224}, 
5.4-RELEASE-i386-disc1.iso={status=0, ctime=1431100000.0, 
size=563701760}, debian-31r8-i386-netinst.iso={status=0, 
ctime=1424080000.0, size=110628864}, grml64-small_2013.09.iso={status=0, 
ctime=1424080000.0, size=188743680}, 
debian-7.3.0-amd64-netinst.iso={status=0, ctime=1424080000.0, 
size=231735296}, debian-7.8.0-amd64-netinst.iso={status=0, 
ctime=1424080000.0, size=232783872}, 
debian-7.7.0-amd64-netinst.iso={status=0, ctime=1424080000.0, 
size=232783872}, grml32-small_2014.03.iso={status=0, ctime=1424080000.0, 
size=207618048}}, log id: 672df19
2015-05-18 14:37:00,593 INFO 
[org.ovirt.engine.core.bll.IsoDomainListSyncronizer] 
(org.ovirt.thread.pool-8-thread-46) Finished automatic refresh process 
for ISO file type with success, for storage domain id 
6d882c77-cdbc-48ef-ae21-1a6d45e7f8a1.
2015-05-18 14:37:00,756 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 72f57b74
2015-05-18 14:37:00,841 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@999d3275}, 
log id: 72f57b74
2015-05-18 14:37:00,860 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 23b84657
2015-05-18 14:37:00,900 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-16) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@106e5ec1}, 
log id: 23b84657
2015-05-18 14:37:02,992 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GetGlusterVolumeAdvancedDetailsVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3), log id: 1bad432c
2015-05-18 14:37:03,005 INFO 
[org.ovirt.engine.core.bll.scheduling.HaReservationHandling] 
(DefaultQuartzScheduler_Worker-31) HA reservation status for cluster 
HP_Proliant_DL180G6_Cluster_II is OK
2015-05-18 14:37:03,322 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GetGlusterVolumeAdvancedDetailsVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeAdvancedDetails@68b72574, 
log id: 1bad432c
2015-05-18 14:37:03,485 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GetGlusterVolumeAdvancedDetailsVDSCommand(HostName = 
ovirt-node03.stuttgart.example.net, HostId = 
58c3a03d-d289-45ad-a5e5-f3d37d66cca7), log id: 2b6cf052
2015-05-18 14:37:03,611 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GetGlusterVolumeAdvancedDetailsVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeAdvancedDetails@cf009849, 
log id: 2b6cf052
2015-05-18 14:37:03,629 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] START, 
GetGlusterVolumeAdvancedDetailsVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 18cc227c
2015-05-18 14:37:04,024 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GetGlusterVolumeAdvancedDetailsVDSCommand] 
(DefaultQuartzScheduler_Worker-8) [1f6e11f1] FINISH, 
GetGlusterVolumeAdvancedDetailsVDSCommand, return: 
org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeAdvancedDetails@e7361042, 
log id: 18cc227c
2015-05-18 14:37:05,479 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3), log id: f8356a4
2015-05-18 14:37:05,593 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: f8356a4
2015-05-18 14:37:05,789 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 1707c285
2015-05-18 14:37:05,816 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: 1707c285
2015-05-18 14:37:05,821 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) START, 
GlusterTasksListVDSCommand(HostName = 
ovirt-node05-stgt.stuttgart.example.net, HostId = 
4ca44033-4f16-413d-828a-cbfe4dacf891), log id: 29e4ebc3
2015-05-18 14:37:05,885 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterTasksListVDSCommand] 
(DefaultQuartzScheduler_Worker-49) FINISH, GlusterTasksListVDSCommand, 
return: [], log id: 29e4ebc3
2015-05-18 14:37:06,071 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node01.stuttgart.example.net, HostId = 
2e8cec66-23d7-4a5c-b6f3-9758d1d87f5c), log id: 377c4636
2015-05-18 14:37:06,250 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) FINISH, GlusterVolumesListVDSCommand, 
return: 
{d46619e9-9368-4e82-bf3a-a2377b6e85e4=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity@d9e03a61}, 
log id: 377c4636
2015-05-18 14:37:06,429 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node04.stuttgart.example.net, HostId = 
073c24e1-003f-412a-be56-0c41a435829a), log id: 2c323dc0
2015-05-18 14:37:06,577 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) FINISH, GlusterVolumesListVDSCommand, 
return: 
{e952fd41-45bf-42d9-b494-8e0195cb9756=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity at bd6050c7}, 
log id: 2c323dc0
2015-05-18 14:37:06,603 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) START, 
GlusterVolumesListVDSCommand(HostName = 
ovirt-node07-stgt.stuttgart.example.net, HostId = 
32dfc1f1-c474-4410-94c0-38776c121289), log id: 3159e468
2015-05-18 14:37:06,637 INFO 
[org.ovirt.engine.core.vdsbroker.gluster.GlusterVolumesListVDSCommand] 
(DefaultQuartzScheduler_Worker-61) FINISH, GlusterVolumesListVDSCommand, 
return: 
{8786357b-9d11-4c01-a34b-aee949c116e9=org.ovirt.engine.core.common.businessentities.gluster.GlusterVolumeEntity at 9d150a0a}, 
log id: 3159e468
2015-05-18 14:37:07,468 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-70) Command 
org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand 
return value

TaskStatusListReturnForXmlRpc [mStatus=StatusForXmlRpc [mCode=654, 
mMessage=Not SPM]]

2015-05-18 14:37:07,527 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-70) HostName = 
ovirt-node02.stuttgart.example.net
2015-05-18 14:37:07,529 ERROR 
[org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] 
(DefaultQuartzScheduler_Worker-70) Command 
HSMGetAllTasksStatusesVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3) execution failed. Exception: 
IRSNonOperationalException: IRSGenericException: IRSErrorException: 
IRSNonOperationalException: Not SPM
2015-05-18 14:37:07,734 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-70) hostFromVds::selectedVds - 
ovirt-node02.stuttgart.example.net, spmStatus Free, storage pool 
HP_Proliant_DL180G6
2015-05-18 14:37:07,740 INFO 
[org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] 
(DefaultQuartzScheduler_Worker-70) starting spm on vds 
ovirt-node02.stuttgart.example.net, storage pool HP_Proliant_DL180G6, 
prevId -1, LVER -1
2015-05-18 14:37:07,780 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-70) START, SpmStartVDSCommand(HostName = 
ovirt-node02.stuttgart.example.net, HostId = 
6948da12-0b8a-4b6d-a9af-162e6c25dad3, storagePoolId = 
b384b3da-02a6-44f3-a3f6-56751ce8c26d, prevId=-1, prevLVER=-1, 
storagePoolFormatType=V3, recoveryMode=Manual, SCSIFencing=false), log 
id: 2cf7641
2015-05-18 14:37:07,792 INFO 
[org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStartVDSCommand] 
(DefaultQuartzScheduler_Worker-70) spmStart polling started: taskId = 
9370fc1f-e128-469f-afd1-67dfa4815089



Thanks,
Mario

On 18.05.15 at 14:21, Maor Lipchuk wrote:
> Hi Mario,
>
> Can you try to mount this directly from the host?
> Can you please attach the VDSM and engine logs?
>
> Thanks,
> Maor
>
>
> ----- Original Message -----
>> From: ml at ohnewald.net
>> To: "Maor Lipchuk" <mlipchuk at redhat.com>
>> Cc: users at ovirt.org
>> Sent: Monday, May 18, 2015 2:36:38 PM
>> Subject: Re: [ovirt-users] vdsm storage problem - maybe cache problem?
>>
>> Hi Maor,
>>
>> thanks for the quick reply.
>>
>> Am 18.05.15 um 13:25 schrieb Maor Lipchuk:
>>
>>>> Now my question: why does the VDSM node not know that I deleted the
>>>> storage? Has VDSM cached this mount information? Why does it still
>>>> try to access 036b5575-51fa-4f14-8b05-890d7807894c?
>>>
>>>
>>> Yes, VDSM uses a cache for storage domains. You can try restarting the
>>> vdsmd service instead of rebooting the host.
>>>
>>
>> I am still getting the same error.
>>
>>
>> [root at ovirt-node01 ~]# /etc/init.d/vdsmd stop
>> Shutting down vdsm daemon:
>> vdsm watchdog stop                                         [  OK  ]
>> vdsm: Running run_final_hooks                              [  OK  ]
>> vdsm stop                                                  [  OK  ]
>> [root at ovirt-node01 ~]#
>> [root at ovirt-node01 ~]#
>> [root at ovirt-node01 ~]#
>> [root at ovirt-node01 ~]# ps aux | grep vdsmd
>> root      3198  0.0  0.0  11304   740 ?        S<   May07   0:00
>> /bin/bash -e /usr/share/vdsm/respawn --minlifetime 10 --daemon
>> --masterpid /var/run/vdsm/supervdsm_respawn.pid
>> /usr/share/vdsm/supervdsmServer --sockfile /var/run/vdsm/svdsm.sock
>> --pidfile /var/run/vdsm/supervdsmd.pid
>> root      3205  0.0  0.0 922368 26724 ?        S<l  May07  12:10
>> /usr/bin/python /usr/share/vdsm/supervdsmServer --sockfile
>> /var/run/vdsm/svdsm.sock --pidfile /var/run/vdsm/supervdsmd.pid
>> root     15842  0.0  0.0 103248   900 pts/0    S+   13:35   0:00 grep vdsmd
>>
>>
>> [root at ovirt-node01 ~]# /etc/init.d/vdsmd start
>> initctl: Job is already running: libvirtd
>> vdsm: Running mkdirs
>> vdsm: Running configure_coredump
>> vdsm: Running configure_vdsm_logs
>> vdsm: Running run_init_hooks
>> vdsm: Running gencerts
>> vdsm: Running check_is_configured
>> libvirt is already configured for vdsm
>> sanlock service is already configured
>> vdsm: Running validate_configuration
>> SUCCESS: ssl configured to true. No conflicts
>> vdsm: Running prepare_transient_repository
>> vdsm: Running syslog_available
>> vdsm: Running nwfilter
>> vdsm: Running dummybr
>> vdsm: Running load_needed_modules
>> vdsm: Running tune_system
>> vdsm: Running test_space
>> vdsm: Running test_lo
>> vdsm: Running restore_nets
>> vdsm: Running unified_network_persistence_upgrade
>> vdsm: Running upgrade_300_nets
>> Starting up vdsm daemon:
>> vdsm start                                                 [  OK  ]
>> [root at ovirt-node01 ~]#
>>
>> [root at ovirt-node01 ~]# grep ERROR /var/log/vdsm/vdsm.log | tail -n 20
>> Thread-13::ERROR::2015-05-18
>> 13:35:03,631::sdc::137::Storage.StorageDomainCache::(_findDomain)
>> looking for unfetched domain abc51e26-7175-4b38-b3a8-95c6928fbc2b
>> Thread-13::ERROR::2015-05-18
>> 13:35:03,632::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain)
>> looking for domain abc51e26-7175-4b38-b3a8-95c6928fbc2b
>> Thread-36::ERROR::2015-05-18
>> 13:35:11,607::sdc::137::Storage.StorageDomainCache::(_findDomain)
>> looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:11,621::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain)
>> looking for domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:11,960::sdc::143::Storage.StorageDomainCache::(_findDomain) domain
>> 036b5575-51fa-4f14-8b05-890d7807894c not found
>> Thread-36::ERROR::2015-05-18
>> 13:35:11,960::domainMonitor::239::Storage.DomainMonitorThread::(_monitorDomain)
>> Error while collecting domain 036b5575-51fa-4f14-8b05-890d7807894c
>> monitoring information
>> Thread-36::ERROR::2015-05-18
>> 13:35:21,962::sdc::137::Storage.StorageDomainCache::(_findDomain)
>> looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:21,965::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain)
>> looking for domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:22,068::sdc::143::Storage.StorageDomainCache::(_findDomain) domain
>> 036b5575-51fa-4f14-8b05-890d7807894c not found
>> Thread-36::ERROR::2015-05-18
>> 13:35:22,072::domainMonitor::239::Storage.DomainMonitorThread::(_monitorDomain)
>> Error while collecting domain 036b5575-51fa-4f14-8b05-890d7807894c
>> monitoring information
>> Thread-15::ERROR::2015-05-18
>> 13:35:33,821::task::866::TaskManager.Task::(_setError)
>> Task=`54bdfc77-f63a-493b-b24e-e5a3bc4977bb`::Unexpected error
>> Thread-15::ERROR::2015-05-18
>> 13:35:33,864::dispatcher::65::Storage.Dispatcher.Protect::(run)
>> {'status': {'message': "Unknown pool id, pool not connected:
>> ('b384b3da-02a6-44f3-a3f6-56751ce8c26d',)", 'code': 309}}
>> Thread-13::ERROR::2015-05-18
>> 13:35:33,930::sdc::137::Storage.StorageDomainCache::(_findDomain)
>> looking for unfetched domain abc51e26-7175-4b38-b3a8-95c6928fbc2b
>> Thread-15::ERROR::2015-05-18
>> 13:35:33,928::task::866::TaskManager.Task::(_setError)
>> Task=`fe9bb0fa-cf1e-4b21-af00-0698c6d1718f`::Unexpected error
>> Thread-13::ERROR::2015-05-18
>> 13:35:33,932::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain)
>> looking for domain abc51e26-7175-4b38-b3a8-95c6928fbc2b
>> Thread-15::ERROR::2015-05-18
>> 13:35:33,978::dispatcher::65::Storage.Dispatcher.Protect::(run)
>> {'status': {'message': 'Not SPM: ()', 'code': 654}}
>> Thread-36::ERROR::2015-05-18
>> 13:35:41,117::sdc::137::Storage.StorageDomainCache::(_findDomain)
>> looking for unfetched domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:41,131::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain)
>> looking for domain 036b5575-51fa-4f14-8b05-890d7807894c
>> Thread-36::ERROR::2015-05-18
>> 13:35:41,452::sdc::143::Storage.StorageDomainCache::(_findDomain) domain
>> 036b5575-51fa-4f14-8b05-890d7807894c not found
>> Thread-36::ERROR::2015-05-18
>> 13:35:41,453::domainMonitor::239::Storage.DomainMonitorThread::(_monitorDomain)
>> Error while collecting domain 036b5575-51fa-4f14-8b05-890d7807894c
>> monitoring information
>>
>>
>> Thanks,
>> Mario
>>
>>
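
[Editorial note: the caching behaviour Maor describes above can be pictured as a
simple lookup cache that keeps serving a stale entry until it is explicitly
invalidated or the daemon restarts. The sketch below is purely illustrative --
class and method names are hypothetical, not VDSM's real StorageDomainCache
(sdc.py) API.]

```python
class DomainCache:
    """Illustrative sketch of a storage-domain lookup cache.

    Once a domain UUID is cached, it is served from memory; a domain
    removed on the storage side can therefore linger until invalidate()
    is called or the owning process is restarted.
    """

    def __init__(self, backend):
        self._backend = backend  # callable: uuid -> domain dict, or None
        self._cache = {}

    def get(self, sd_uuid):
        # Cached entries are returned without re-checking the backend,
        # so they may be stale.
        if sd_uuid in self._cache:
            return self._cache[sd_uuid]
        domain = self._backend(sd_uuid)
        if domain is None:
            raise LookupError(
                "Storage domain does not exist: %s" % sd_uuid)
        self._cache[sd_uuid] = domain
        return domain

    def invalidate(self, sd_uuid=None):
        # Drop one entry, or flush everything (what a restart achieves).
        if sd_uuid is None:
            self._cache.clear()
        else:
            self._cache.pop(sd_uuid, None)
```

A restart of vdsmd flushes the in-memory cache, but as the log above shows, a
monitoring thread that still holds the removed domain's UUID will keep querying
it and logging "domain ... not found" until the domain is also forgotten by the
engine side.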



