<html dir="ltr">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<style id="owaParaStyle" type="text/css">P {margin-top:0;margin-bottom:0;}</style>
</head>
<body ocsi="0" fpstyle="1">
<div style="direction: ltr;font-family: Tahoma;color: #000000;font-size: 10pt;">Hello,<br>
<br>
Me again: adding an NFS domain to the cluster fails. It looks like the error<br>
from http://lists.ovirt.org/pipermail/users/2013-April/014080.html, but I<br>
have a Fedora 19 host and sanlock seems to be a newer version. I wanted<br>
to restart the setup process, but ovirt-engine tells me that the storage<br>
domain already exists on the host. How can I recover from this failed<br>
action?<br>
<br>
Just in case you are interested in the current situation, you will find the<br>
logs attached.<br>
<br>
Markus<br>
<br>
[root@colovn1 dom_md]# getsebool -a | grep virt_use_nfs<br>
virt_use_nfs --&gt; on<br>
<br>
[root@colovn1 dom_md]# df<br>
Filesystem<br>
...<br>
10.10.30.251:/var/nas5/ovirt 7810410496 7094904832 715505664&nbsp;&nbsp; 91% /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt<br>
<br>
Content of /var/log/sanlock:<br>
2013-09-04 15:47:04&#43;0200 14844 [2655]: sanlock daemon started 2.8 host e35ca5ca-dfab-4521-b267-4d9b1f48ded4.colovn1.co<br>
2013-09-04 15:58:04&#43;0200 23 [1024]: sanlock daemon started 2.8 host b3a8ef4a-0966-4ffa-9086-e5a8c6ad7363.colovn1.co<br>
2013-09-04 19:25:53&#43;0200 23 [919]: sanlock daemon started 2.8 host 9654f1d0-6d87-4dd1-abdc-68c00fc4fe64.colovn1.co<br>
2013-09-05 06:39:47&#43;0200 25 [1173]: sanlock daemon started 2.8 host e2829012-4b8f-417f-809e-532c64108955.colovn1.co<br>
2013-09-05 06:48:01&#43;0200 518 [1178]: s1 lockspace 8dd2a4a5-b49f-4fef-a427-e62627fc09f7:250:/rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids:0<br>
2013-09-05 06:48:01&#43;0200 518 [1786]: open error -13 /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids<br>
2013-09-05 06:48:01&#43;0200 518 [1786]: s1 open_disk /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids error -13<br>
2013-09-05 06:48:02&#43;0200 519 [1178]: s1 add_lockspace fail result -19<br>
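For reference, the negative return codes in the sanlock log look like plain errno values: -13 would be EACCES (permission denied) and -19 ENODEV (no such device). A quick way to decode them in Python (just my illustration, not from the logs):

```python
import errno
import os

# Decode the negative return codes seen in the sanlock log.
for code in (13, 19):
    print(f"-{code}: {errno.errorcode[code]} ({os.strerror(code)})")
```

So the open of the ids file fails with a permission error, and the lockspace add then reports the device as missing.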
<br>
Permissions of the ids file:<br>
[root@colovn1 dom_md]# ls -al /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids<br>
-rw-rw----. 1 vdsm kvm 1048576&nbsp; 5. Sep 06:47 /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids<br>
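If I understand sanlock correctly, the daemon runs as the unprivileged user sanlock by default, so with mode 0660 and owner vdsm:kvm it can only open the ids file when the sanlock user is a member of the kvm (or vdsm) group; "other" gets no access at all. A small sketch of what that mode grants (the group-membership detail is my assumption):

```python
import stat

# The ids file is mode 0660: rw for owner vdsm and group kvm, nothing for others.
mode = stat.S_IFREG | 0o660
print(stat.filemode(mode))        # -rw-rw----
print(bool(mode & stat.S_IROTH))  # False: no read access for "other" users
```

Could the EACCES simply mean the sanlock user lost its kvm group membership on this host?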
<br>
Excerpt from vdsm.log:<br>
Thread-186::DEBUG::2013-09-05 06:48:01,872::task::974::TaskManager.Task::(_decref) Task=`22e150b3-2fae-42e2-ac85-52cee9eadfb9`::ref 1 aborting False<br>
Thread-186::INFO::2013-09-05 06:48:01,872::sp::592::Storage.StoragePool::(create) spUUID=b054727d-fe4a-41ed-8393-a81e36b8a1af poolName=Collogia master_sd=8dd2a4a5-b49f-4fef-a427-e62627fc09f7 domList=['8dd2a4a5-b49f-4fef-a427-e62627fc09f7'] masterVersion=1 {'LEASETIMESEC':
 60, 'IOOPTIMEOUTSEC': 10, 'LEASERETRIES': 3, 'LOCKRENEWALINTERVALSEC': 5}<br>
Thread-186::INFO::2013-09-05 06:48:01,872::fileSD::315::Storage.StorageDomain::(validate) sdUUID=8dd2a4a5-b49f-4fef-a427-e62627fc09f7<br>
Thread-186::DEBUG::2013-09-05 06:48:01,888::persistentDict::234::Storage.PersistentDict::(refresh) read lines (FileMetadataRW)=['CLASS=Data', 'DESCRIPTION=NAS3_IB', 'IOOPTIMEOUTSEC=1', 'LEASERETRIES=3', 'LEASETIMESEC=5', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5',
 'POOL_UUID=', 'REMOTE_PATH=10.10.30.251:/var/nas5/ovirt', 'ROLE=Regular', 'SDUUID=8dd2a4a5-b49f-4fef-a427-e62627fc09f7', 'TYPE=NFS', 'VERSION=3', '_SHA_CKSUM=b0b0af59d3c7c6ec83dd18ca11a6c1653de9a3b6']<br>
Thread-186::DEBUG::2013-09-05 06:48:01,898::persistentDict::234::Storage.PersistentDict::(refresh) read lines (FileMetadataRW)=['CLASS=Data', 'DESCRIPTION=NAS3_IB', 'IOOPTIMEOUTSEC=1', 'LEASERETRIES=3', 'LEASETIMESEC=5', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5',
 'POOL_UUID=', 'REMOTE_PATH=10.10.30.251:/var/nas5/ovirt', 'ROLE=Regular', 'SDUUID=8dd2a4a5-b49f-4fef-a427-e62627fc09f7', 'TYPE=NFS', 'VERSION=3', '_SHA_CKSUM=b0b0af59d3c7c6ec83dd18ca11a6c1653de9a3b6']<br>
Thread-186::DEBUG::2013-09-05 06:48:01,899::persistentDict::167::Storage.PersistentDict::(transaction) Starting transaction<br>
Thread-186::DEBUG::2013-09-05 06:48:01,900::persistentDict::173::Storage.PersistentDict::(transaction) Flushing changes<br>
Thread-186::DEBUG::2013-09-05 06:48:01,900::persistentDict::299::Storage.PersistentDict::(flush) about to write lines (FileMetadataRW)=['CLASS=Data', 'DESCRIPTION=NAS3_IB', 'IOOPTIMEOUTSEC=10', 'LEASERETRIES=3', 'LEASETIMESEC=60', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5',
 'POOL_UUID=', 'REMOTE_PATH=10.10.30.251:/var/nas5/ovirt', 'ROLE=Regular', 'SDUUID=8dd2a4a5-b49f-4fef-a427-e62627fc09f7', 'TYPE=NFS', 'VERSION=3', '_SHA_CKSUM=e2dfa292d09c0cb420dc4d30ab5eed11c84a399e']<br>
Thread-186::DEBUG::2013-09-05 06:48:01,903::persistentDict::175::Storage.PersistentDict::(transaction) Finished transaction<br>
Thread-186::INFO::2013-09-05 06:48:01,903::clusterlock::174::SANLock::(acquireHostId) Acquiring host id for domain 8dd2a4a5-b49f-4fef-a427-e62627fc09f7 (id: 250)<br>
Thread-186::ERROR::2013-09-05 06:48:02,905::task::850::TaskManager.Task::(_setError) Task=`22e150b3-2fae-42e2-ac85-52cee9eadfb9`::Unexpected error<br>
Traceback (most recent call last):<br>
&nbsp; File &quot;/usr/share/vdsm/storage/task.py&quot;, line 857, in _run<br>
&nbsp;&nbsp;&nbsp; return fn(*args, **kargs)<br>
&nbsp; File &quot;/usr/share/vdsm/logUtils.py&quot;, line 45, in wrapper<br>
&nbsp;&nbsp;&nbsp; res = f(*args, **kwargs)<br>
&nbsp; File &quot;/usr/share/vdsm/storage/hsm.py&quot;, line 960, in createStoragePool<br>
&nbsp;&nbsp;&nbsp; masterVersion, leaseParams)<br>
&nbsp; File &quot;/usr/share/vdsm/storage/sp.py&quot;, line 617, in create<br>
&nbsp;&nbsp;&nbsp; self._acquireTemporaryClusterLock(msdUUID, leaseParams)<br>
&nbsp; File &quot;/usr/share/vdsm/storage/sp.py&quot;, line 559, in _acquireTemporaryClusterLock<br>
&nbsp;&nbsp;&nbsp; msd.acquireHostId(self.id)<br>
&nbsp; File &quot;/usr/share/vdsm/storage/sd.py&quot;, line 458, in acquireHostId<br>
&nbsp;&nbsp;&nbsp; self._clusterLock.acquireHostId(hostId, async)<br>
&nbsp; File &quot;/usr/share/vdsm/storage/clusterlock.py&quot;, line 189, in acquireHostId<br>
&nbsp;&nbsp;&nbsp; raise se.AcquireHostIdFailure(self._sdUUID, e)<br>
AcquireHostIdFailure: Cannot acquire host id: ('8dd2a4a5-b49f-4fef-a427-e62627fc09f7', SanlockException(19, 'Sanlock lockspace add failure', 'No such device'))<br>
<br>
Content of /var/log/messages:<br>
Sep&nbsp; 5 06:48:01 colovn1 multipathd: dm-3: remove map (uevent)<br>
Sep&nbsp; 5 06:48:01 colovn1 multipathd: dm-3: remove map (uevent)<br>
Sep&nbsp; 5 06:48:01 colovn1 multipathd: dm-3: remove map (uevent)<br>
Sep&nbsp; 5 06:48:01 colovn1 multipathd: dm-3: remove map (uevent)<br>
Sep&nbsp; 5 06:48:01 colovn1 sanlock[1173]: 2013-09-05 06:48:01&#43;0200 518 [1786]: open error -13 /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids<br>
Sep&nbsp; 5 06:48:01 colovn1 sanlock[1173]: 2013-09-05 06:48:01&#43;0200 518 [1786]: s1 open_disk /rhev/data-center/mnt/10.10.30.251:_var_nas5_ovirt/8dd2a4a5-b49f-4fef-a427-e62627fc09f7/dom_md/ids error -13<br>
Sep&nbsp; 5 06:48:02 colovn1 sanlock[1173]: 2013-09-05 06:48:02&#43;0200 519 [1178]: s1 add_lockspace fail result -19<br>
Sep&nbsp; 5 06:48:02 colovn1 vdsm TaskManager.Task ERROR Task=`22e150b3-2fae-42e2-ac85-52cee9eadfb9`::Unexpected error<br>
Sep&nbsp; 5 06:50:13 colovn1 su: (to root) root on none<br>
Sep&nbsp; 5 06:54:23 colovn1 systemd[1]: Starting Cleanup of Temporary Directories...<br>
Sep&nbsp; 5 06:54:23 colovn1 systemd[1]: Started Cleanup of Temporary Directories.<br>
Sep&nbsp; 5 06:55:05 colovn1 ntpd[1241]: 0.0.0.0 c612 02 freq_set kernel -32.960 PPM<br>
Sep&nbsp; 5 06:55:05 colovn1 ntpd[1241]: 0.0.0.0 c615 05 clock_sync<br>
<br>
Installed sanlock packages:<br>
[root@colovn1 dom_md]# yum list | grep sanlock<br>
libvirt-lock-sanlock.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 1.0.5.5-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; @updates<br>
sanlock.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; @updates<br>
sanlock-lib.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; @updates<br>
sanlock-python.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; @updates<br>
fence-sanlock.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; updates<br>
sanlock-devel.i686&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; updates<br>
sanlock-devel.x86_64&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; updates<br>
sanlock-lib.i686&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 2.8-1.fc19&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; updates<br>
<br>
</div>
</body>
</html>