2019-01-15 13:50:35,317+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] START, CreateStoragePoolVDSCommand(HostName = ov4301.localdomain.local, CreateStoragePoolVDSCommandParameters:{hostId='e8f105f1-37ed-4ac4-bfc3-b1e55ed3027f', storagePoolId='96a31a7e-18bb-11e9-9a34-00163e6196f3', storagePoolName='Default', masterDomainId='14ec2fc7-8c2b-487c-8f4f-428644650928', domainsIdList='[14ec2fc7-8c2b-487c-8f4f-428644650928]', masterVersion='1'}), log id: 4baccd53
2019-01-15 13:50:36,345+01 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] Failed in 'CreateStoragePoolVDS' method
2019-01-15 13:50:36,354+01 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-2) [51725212] EVENT_ID: VDS_BROKER_COMMAND_FAILURE(10,802), VDSM ov4301.localdomain.local command CreateStoragePoolVDS failed: Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error'))
2019-01-15 13:50:36,354+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] Command 'org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand' return value 'StatusOnlyReturn [status=Status [code=661, message=Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error'))]]'
2019-01-15 13:50:36,354+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] HostName = ov4301.localdomain.local
2019-01-15 13:50:36,355+01 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] Command 'CreateStoragePoolVDSCommand(HostName = ov4301.localdomain.local, CreateStoragePoolVDSCommandParameters:{hostId='e8f105f1-37ed-4ac4-bfc3-b1e55ed3027f', storagePoolId='96a31a7e-18bb-11e9-9a34-00163e6196f3', storagePoolName='Default', masterDomainId='14ec2fc7-8c2b-487c-8f4f-428644650928', domainsIdList='[14ec2fc7-8c2b-487c-8f4f-428644650928]', masterVersion='1'})' execution failed: VDSGenericException: VDSErrorException: Failed to CreateStoragePoolVDS, error = Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error')), code = 661
2019-01-15 13:50:36,355+01 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStoragePoolVDSCommand] (default task-2) [51725212] FINISH, CreateStoragePoolVDSCommand, return: , log id: 4baccd53
2019-01-15 13:50:36,355+01 ERROR [org.ovirt.engine.core.bll.storage.pool.AddStoragePoolWithStoragesCommand] (default task-2) [51725212] Command 'org.ovirt.engine.core.bll.storage.pool.AddStoragePoolWithStoragesCommand' failed: EngineException: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to CreateStoragePoolVDS, error = Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error')), code = 661 (Failed with error AcquireHostIdFailure and code 661)
2019-01-15 13:50:36,379+01 INFO [org.ovirt.engine.core.bll.CommandCompensator] (default task-2) [51725212] Command [id=c55d9962-368e-4e0c-8fee-bd06e7570062]: Compensating DELETED_OR_UPDATED_ENTITY of org.ovirt.engine.core.common.businessentities.StoragePool; snapshot: id=96a31a7e-18bb-11e9-9a34-00163e6196f3.
On the host:
[root@ov4301 log]# cat /etc/hosts
192.168.124.50 ov43eng.localdomain.local # temporary entry added by hosted-engine-setup for the bootstrap VM
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.122.210 ov43eng.localdomain.local ov43eng
192.168.122.211 ov4301.localdomain.local ov4301
[root@ov4301 log]#
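Note that ov43eng appears twice in that /etc/hosts: the temporary hosted-engine-setup entry on 192.168.124.50 and the permanent one on 192.168.122.210. With glibc's files-based resolution the first matching line wins, so the bootstrap address takes precedence while the temporary entry is present. A quick sketch to spot such duplicates (hosts content copied inline from the paste above; on the host you would read the real file):

```python
from collections import defaultdict

# /etc/hosts content as pasted above, inlined for illustration.
hosts = """\
192.168.124.50 ov43eng.localdomain.local # temporary entry added by hosted-engine-setup for the bootstrap VM
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.122.210 ov43eng.localdomain.local ov43eng
192.168.122.211 ov4301.localdomain.local ov4301
"""

addrs = defaultdict(list)
for line in hosts.splitlines():
    fields = line.split('#')[0].split()   # drop comments, tokenize
    for name in fields[1:]:
        addrs[name].append(fields[0])

# Names mapped to more than one address; the localhost v4/v6 pair is normal,
# ov43eng.localdomain.local under two addresses is the suspicious one.
for name, ips in sorted(addrs.items()):
    if len(ips) > 1:
        print(name, ips)
```

This flags ov43eng.localdomain.local under both 192.168.124.50 and 192.168.122.210 (plus the expected localhost IPv4/IPv6 duplicates).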
[root@ov4301 log]# df -h | grep gluster
/dev/mapper/gluster_vg_sdb-gluster_lv_engine 64G 36M 64G 1% /gluster_bricks/engine
/dev/mapper/gluster_vg_sdb-gluster_lv_data 30G 34M 30G 1% /gluster_bricks/data
/dev/mapper/gluster_vg_sdb-gluster_lv_vmstore 20G 34M 20G 1% /gluster_bricks/vmstore
192.168.123.211:/engine 64G 691M 64G 2% /rhev/data-center/mnt/glusterSD/192.168.123.211:_engine
[root@ov4301 log]#
and in its /var/log/messages:
Jan 15 13:35:49 ov4301 dnsmasq-dhcp[22934]: DHCPREQUEST(virbr0) 192.168.124.50 00:16:3e:61:96:f3
Jan 15 13:35:49 ov4301 dnsmasq-dhcp[22934]: DHCPACK(virbr0) 192.168.124.50 00:16:3e:61:96:f3 ov43eng
Jan 15 13:35:49 ov4301 dnsmasq-dhcp[22934]: not giving name ov43eng to the DHCP lease of 192.168.124.50 because the name exists in /etc/hosts with address 192.168.122.210
Jan 15 13:40:01 ov4301 systemd: Started Session 38 of user root.
Jan 15 13:47:12 ov4301 vdsm[29591]: WARN MOM not available.
Jan 15 13:47:12 ov4301 vdsm[29591]: WARN MOM not available, KSM stats will be missing.
Jan 15 13:49:05 ov4301 python: ansible-setup Invoked with filter=* gather_subset=['all'] fact_path=/etc/ansible/facts.d gather_timeout=10
Jan 15 13:49:19 ov4301 python: ansible-stat Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/tmp/localvmOIXI_W get_md5=None get_mime=True get_attributes=True
Jan 15 13:49:24 ov4301 python: ansible-ovirt_auth Invoked with username=None kerberos=False timeout=0 url=None insecure=True hostname=None compress=True state=present headers=None token=None ovirt_auth=None ca_file=None password=NOT_LOGGING_PARAMETER
Jan 15 13:49:29 ov4301 python: ansible-ovirt_host_facts Invoked with all_content=False pattern=name=ov4301.localdomain.local fetch_nested=False nested_attributes=[] auth={'timeout': 0, 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'kerberos': False, 'compress': True, 'headers': None, 'token': 'Q8qt0Z9DmHJRdg3wk7YxNOAs0JPpBMxxstVx3I8skbulwRWp1SsVXuZYq4DUuPWeEnUZ2bD8TAuwCzJ3qlFYlw', 'ca_file': None}
Jan 15 13:49:35 ov4301 python: ansible-ovirt_cluster_facts Invoked with pattern= fetch_nested=False nested_attributes=[] auth={'timeout': 0, 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'kerberos': False, 'compress': True, 'headers': None, 'token': 'Q8qt0Z9DmHJRdg3wk7YxNOAs0JPpBMxxstVx3I8skbulwRWp1SsVXuZYq4DUuPWeEnUZ2bD8TAuwCzJ3qlFYlw', 'ca_file': None}
Jan 15 13:49:43 ov4301 python: ansible-ovirt_datacenter_facts Invoked with pattern= fetch_nested=False nested_attributes=[] auth={'timeout': 0, 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'kerberos': False, 'compress': True, 'headers': None, 'token': 'Q8qt0Z9DmHJRdg3wk7YxNOAs0JPpBMxxstVx3I8skbulwRWp1SsVXuZYq4DUuPWeEnUZ2bD8TAuwCzJ3qlFYlw', 'ca_file': None}
Jan 15 13:49:54 ov4301 python: ansible-ovirt_storage_domain Invoked with comment=None warning_low_space=None fetch_nested=False localfs=None data_center=Default id=None iscsi=None state=unattached wipe_after_delete=None destroy=None fcp=None description=None format=None auth={'username':********@internal', 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'password': 'passw0rd'} host=ov4301.localdomain.local nested_attributes=[] wait=True domain_function=data name=hosted_storage critical_space_action_blocker=None posixfs=None poll_interval=3 glusterfs={'path': '/engine', 'mount_options': None, 'address': '192.168.123.211'} nfs=None timeout=180 backup=None discard_after_delete=None
Jan 15 13:50:01 ov4301 systemd: Started Session 39 of user root.
Jan 15 13:50:01 ov4301 systemd: Created slice vdsm-glusterfs.slice.
Jan 15 13:50:01 ov4301 systemd: Started /usr/bin/mount -t glusterfs 192.168.123.211:/engine /rhev/data-center/mnt/glusterSD/192.168.123.211:_engine.
Jan 15 13:50:01 ov4301 kernel: fuse init (API version 7.22)
Jan 15 13:50:01 ov4301 systemd: Mounting FUSE Control File System...
Jan 15 13:50:02 ov4301 systemd: Mounted FUSE Control File System.
Jan 15 13:50:02 ov4301 systemd: Started Session c20 of user root.
Jan 15 13:50:02 ov4301 systemd: Started Session c21 of user root.
Jan 15 13:50:03 ov4301 systemd: Started Session c22 of user root.
Jan 15 13:50:03 ov4301 systemd: Started Session c23 of user root.
Jan 15 13:50:12 ov4301 python: ansible-ovirt_storage_domain_facts Invoked with pattern=name=hosted_storage fetch_nested=False nested_attributes=[] auth={'timeout': 0, 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'kerberos': False, 'compress': True, 'headers': None, 'token': 'Q8qt0Z9DmHJRdg3wk7YxNOAs0JPpBMxxstVx3I8skbulwRWp1SsVXuZYq4DUuPWeEnUZ2bD8TAuwCzJ3qlFYlw', 'ca_file': None}
Jan 15 13:50:18 ov4301 python: ansible-find Invoked with excludes=None paths=['/var/tmp/localvmOIXI_W/master'] file_type=file age=None contains=None recurse=True age_stamp=mtime patterns=['^.*.(?<!meta).ovf$'] depth=None get_checksum=False use_regex=True follow=False hidden=False size=None
Jan 15 13:50:30 ov4301 python: ansible-ovirt_storage_domain Invoked with comment=None warning_low_space=None fetch_nested=False localfs=None data_center=Default id=None iscsi=None state=present wipe_after_delete=None destroy=None fcp=None description=None format=None auth={'timeout': 0, 'url': 'https://ov43eng.localdomain.local/ovirt-engine/api', 'insecure': True, 'kerberos': False, 'compress': True, 'headers': None, 'token': 'Q8qt0Z9DmHJRdg3wk7YxNOAs0JPpBMxxstVx3I8skbulwRWp1SsVXuZYq4DUuPWeEnUZ2bD8TAuwCzJ3qlFYlw', 'ca_file': None} host=ov4301.localdomain.local nested_attributes=[] wait=True domain_function=data name=hosted_storage critical_space_action_blocker=None posixfs=None poll_interval=3 glusterfs=None nfs=None timeout=180 backup=None discard_after_delete=None
Jan 15 13:50:34 ov4301 systemd: Started Session c24 of user root.
Jan 15 13:50:34 ov4301 sanlock[26802]: 2019-01-15 13:50:34 11549 [21290]: s1 wdmd_connect failed -13
Jan 15 13:50:34 ov4301 sanlock[26802]: 2019-01-15 13:50:34 11549 [21290]: s1 connect_watchdog failed -1
Jan 15 13:50:35 ov4301 sanlock[26802]: 2019-01-15 13:50:35 11550 [26810]: s1 add_lockspace fail result -203
Jan 15 13:56:48 ov4301 dnsmasq-dhcp[22934]: DHCPREQUEST(virbr0) 192.168.124.50 00:16:3e:61:96:f3
Jan 15 13:56:48 ov4301 dnsmasq-dhcp[22934]: DHCPACK(virbr0) 192.168.124.50 00:16:3e:61:96:f3 ov43eng
Jan 15 13:56:48 ov4301 dnsmasq-dhcp[22934]: not giving name ov43eng to the DHCP lease of 192.168.124.50 because the name exists in /etc/hosts with address 192.168.122.210
Jan 15 13:59:34 ov4301 chronyd[26447]: Source 212.45.144.206 replaced with 80.211.52.109
Jan 15 13:59:44 ov4301 vdsm[29591]: WARN MOM not available.
Jan 15 13:59:44 ov4301 vdsm[29591]: WARN MOM not available, KSM stats will be missing.
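The sanlock lines look like the interesting part: "wdmd_connect failed -13" is a plain negative errno, i.e. EACCES (permission denied connecting to wdmd's socket), which to me suggests wdmd not running or an SELinux denial on the host; `systemctl status wdmd` and `ausearch -m avc -ts recent` would be my guesses for where to look. The -203 from add_lockspace, by contrast, is a sanlock-specific result code, surfaced to vdsm as the "Watchdog device error" seen above. A minimal check of the errno reading:

```python
import errno
import os

# sanlock reports system errors as negative errno values, so the
# "wdmd_connect failed -13" above is EACCES ("Permission denied") on the
# wdmd socket. -203 (add_lockspace fail) is sanlock's own result code,
# not an errno, and is what vdsm shows as 'Watchdog device error'.
print(errno.errorcode[13], '->', os.strerror(13))  # EACCES -> Permission denied
```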
In vdsm.log:
2019-01-15 13:50:34,980+0100 INFO (jsonrpc/2) [storage.StorageDomain] sdUUID=14ec2fc7-8c2b-487c-8f4f-428644650928 (fileSD:533)
2019-01-15 13:50:34,984+0100 INFO (jsonrpc/2) [storage.StoragePool] Creating pool directory '/rhev/data-center/96a31a7e-18bb-11e9-9a34-00163e6196f3' (sp:634)
2019-01-15 13:50:34,984+0100 INFO (jsonrpc/2) [storage.fileUtils] Creating directory: /rhev/data-center/96a31a7e-18bb-11e9-9a34-00163e6196f3 mode: None (fileUtils:199)
2019-01-15 13:50:34,985+0100 INFO (jsonrpc/2) [storage.SANLock] Acquiring host id for domain 14ec2fc7-8c2b-487c-8f4f-428644650928 (id=250, async=False) (clusterlock:294)
2019-01-15 13:50:35,987+0100 INFO (jsonrpc/2) [vdsm.api] FINISH createStoragePool error=Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error')) from=::ffff:192.168.124.50,42356, flow_id=51725212, task_id=7cbd7c09-e934-4396-bd9d-61e9f0e00bd3 (api:52)
2019-01-15 13:50:35,988+0100 ERROR (jsonrpc/2) [storage.TaskManager.Task] (Task='7cbd7c09-e934-4396-bd9d-61e9f0e00bd3') Unexpected error (task:875)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 882, in _run
    return fn(*args, **kargs)
  File "<string>", line 2, in createStoragePool
  File "/usr/lib/python2.7/site-packages/vdsm/common/api.py", line 50, in method
    ret = func(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/hsm.py", line 1003, in createStoragePool
    leaseParams)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sp.py", line 636, in create
    self._acquireTemporaryClusterLock(msdUUID, leaseParams)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sp.py", line 567, in _acquireTemporaryClusterLock
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sd.py", line 860, in acquireHostId
    self._manifest.acquireHostId(hostId, async)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sd.py", line 467, in acquireHostId
    self._domainLock.acquireHostId(hostId, async)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/clusterlock.py", line 325, in acquireHostId
    raise se.AcquireHostIdFailure(self._sdUUID, e)
AcquireHostIdFailure: Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error'))
2019-01-15 13:50:35,988+0100 INFO (jsonrpc/2) [storage.TaskManager.Task] (Task='7cbd7c09-e934-4396-bd9d-61e9f0e00bd3') aborting: Task is aborted: "Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error'))" - code 661 (task:1181)
2019-01-15 13:50:35,989+0100 ERROR (jsonrpc/2) [storage.Dispatcher] FINISH createStoragePool error=Cannot acquire host id: (u'14ec2fc7-8c2b-487c-8f4f-428644650928', SanlockException(-203, 'Sanlock lockspace add failure', 'Watchdog device error')) (dispatcher:81)
2019-01-15 13:50:35,990+0100 INFO (jsonrpc/2) [jsonrpc.JsonRpcServer] RPC call StoragePool.create failed (error 661) in 1.01 seconds (__init__:312)
2019-01-15 13:50:38,109+0100 INFO (vmrecovery) [vdsm.api] START getConnectedStoragePoolsList(options=None) from=internal, task_id=e69af7e1-c456-4822-ae06-7b309263257d (api:48)
2019-01-15 13:50:38,109+0100 INFO (vmrecovery) [vdsm.api] FINISH getConnectedStoragePoolsList return={'poollist': []} from=internal, task_id=e69af7e1-c456-4822-ae06-7b309263257d (api:54)
2019-01-15 13:50:38,110+0100 INFO (vmrecovery) [vds] recovery: waiting for storage pool to go up (clientIF:705)
2019-01-15 13:50:39,802+0100 INFO (jsonrpc/4) [jsonrpc.JsonRpcServer] RPC call GlusterHost.list succeeded in 0.33 seconds (__init__:312)
2019-01-15 13:50:39,996+0100 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer] RPC call GlusterVolume.list succeeded in 0.18 seconds (__init__:312)
2019-01-15 13:50:43,115+0100 INFO (vmrecovery) [vdsm.api] START getConnectedStoragePoolsList(options=None) from=internal, task_id=729c054e-021a-4d2e-b36c-edfb186a1210 (api:48)
2019-01-15 13:50:43,116+0100 INFO (vmrecovery) [vdsm.api] FINISH getConnectedStoragePoolsList return={'poollist': []} from=internal, task_id=729c054e-021a-4d2e-b36c-edfb186a1210 (api:54)
2019-01-15 13:50:43,116+0100 INFO (vmrecovery) [vds] recovery: waiting for storage pool to go up (clientIF:705)
2019-01-15 13:50:43,611+0100 INFO (jsonrpc/5) [api.host] START getStats() from=::ffff:192.168.124.50,42356 (api:48)
2019-01-15 13:50:43,628+0100 INFO (jsonrpc/5) [vdsm.api] START repoStats(domains=()) from=::ffff:192.168.124.50,42356, task_id=868d05f4-5535-4cb3-b283-92d3c1595bb3 (api:48)
2019-01-15 13:50:43,628+0100 INFO (jsonrpc/5) [vdsm.api] FINISH repoStats return={} from=::ffff:192.168.124.50,42356, task_id=868d05f4-5535-4cb3-b283-92d3c1595bb3 (api:54)
2019-01-15 13:50:43,628+0100 INFO (jsonrpc/5) [vdsm.api] START multipath_health() from=::ffff:192.168.124.50,42356, task_id=7f4fdad0-2b2e-4dcf-88b8-b7cf2689d4d9 (api:48)
Gianluca