Enabling libgfapi disk access with oVirt 4.2
by Alessandro De Salvo
Hi,
I'm using the latest 4.2 beta release and want to try gfapi disk access, but so far I'm failing to get it to work.
My test setup has an external glusterfs cluster v3.12, not managed by oVirt.
The compatibility flag correctly shows that gfapi should be enabled with 4.2:
# engine-config -g LibgfApiSupported
LibgfApiSupported: false version: 3.6
LibgfApiSupported: false version: 4.0
LibgfApiSupported: false version: 4.1
LibgfApiSupported: true version: 4.2
The data center and cluster have the 4.2 compatibility flags as well.
However, when starting a VM with a disk on Gluster, I can still see that the disk is mounted via FUSE.
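For reference, this is how I enabled the flag (the standard engine-config syntax for per-version values, followed by an engine restart):
# engine-config -s LibgfApiSupported=true --cver=4.2
# systemctl restart ovirt-engine
I assume running VMs only pick up the change after a full stop/start rather than a live migration, so the test VM was cold-booted after this.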
Any clue as to what I'm still missing?
Thanks,
Alessandro
Migration from Red Hat Virtualization Manager 2.2 to oVirt 4.1
by Julio Cesar Bustamante
Hi there,
I have to migrate from Red Hat Virtualization Manager 2.2 to oVirt 4.1, but I have a question: is it possible to configure an oVirt Manager and an oVirt host with a 2 TB SATA disk and virtualize two or more virtual machines on that host?
What do you recommend?
--
Julio Cesar Bustamante.
Unable to attach ISO images
by Rudi Ahlers
Hi,
I have set up oVirt Engine Version 4.1.7.6-1.el7.centos with a hosted engine,
on a 3-node Gluster cluster. When attempting to upload ISO images, the
"Attach ISO" button is greyed out.
How do I fix this?
[image: Inline image 1]
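In case it matters, the plan was to upload the images from the engine host with ovirt-iso-uploader, along these lines (assuming my ISO domain is named "ISO_DOMAIN", and "example.iso" is a placeholder):
# ovirt-iso-uploader --iso-domain=ISO_DOMAIN upload example.iso
but I'd still expect the "Attach ISO" button to be clickable afterwards.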
--
Kind Regards
Rudi Ahlers
Website: http://www.rudiahlers.co.za
Hosted engine not starting up after system reboot
by Rudi Ahlers
Hi,
I wonder if someone can help. After a system reboot, the hosted-engine agent isn't
running. This is on a fresh installation of CentOS Linux release 7.4.1708
running ovirt-release41-4.1.7-1.el7.centos.noarch. Gluster is set up on 3
nodes, but hosted-engine is only set up on the 1st node for now.
[root@virt1 ~]# hosted-engine --console
Virtual machine does not exist
The engine VM is not on this host
[root@virt1 ~]# hosted-engine --vm-status
The hosted engine configuration has not been retrieved from shared storage.
Please ensure that ovirt-ha-agent is running and the storage server is
reachable.
[root@virt1 ~]# ps ax | grep ovirt-ha-agent
41309 ? Rsl 0:14 /usr/bin/python
/usr/share/ovirt-hosted-engine-ha/ovirt-ha-agent --no-daemon
42818 pts/0 S+ 0:00 grep --color=auto ovirt-ha-agent
[root@virt1 ~]# mount | grep engine
/dev/mapper/storage-engine on /storage/engine type xfs
(rw,relatime,seclabel,attr2,inode64,sunit=1024,swidth=2048,noquota)
virt1:/engine on /mnt/engine type fuse.glusterfs
(rw,relatime,user_id=0,group_id=0,default_permissions,allow_other,max_read=131072)
virt1:/engine on /rhev/data-center/mnt/glusterSD/virt1:_engine type
fuse.glusterfs
(rw,relatime,user_id=0,group_id=0,default_permissions,allow_other,max_read=131072)
And then I see this error:
[root@virt1 ~]# systemctl status ovirt-ha-agent -l
● ovirt-ha-agent.service - oVirt Hosted Engine High Availability Monitoring
Agent
Loaded: loaded (/usr/lib/systemd/system/ovirt-ha-agent.service; enabled;
vendor preset: disabled)
Active: active (running) since Thu 2017-11-16 07:44:43 SAST; 4min 23s ago
Main PID: 41309 (ovirt-ha-agent)
CGroup: /system.slice/ovirt-ha-agent.service
└─41309 /usr/bin/python
/usr/share/ovirt-hosted-engine-ha/ovirt-ha-agent --no-daemon
Nov 16 07:48:30 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR Unable
to identify the OVF_STORE volume, falling back to initial vm.conf. Please
ensure you already added your first data domain for regular VMs
Nov 16 07:48:30 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR
'version' is not stored in the HE configuration image
Nov 16 07:48:39 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR Unable
to identify the OVF_STORE volume, falling back to initial vm.conf. Please
ensure you already added your first data domain for regular VMs
Nov 16 07:48:39 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR
'version' is not stored in the HE configuration image
Nov 16 07:48:41 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR Unable
to identify the OVF_STORE volume, falling back to initial vm.conf. Please
ensure you already added your first data domain for regular VMs
Nov 16 07:48:41 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR
'version' is not stored in the HE configuration image
Nov 16 07:48:41 virt ovirt-ha-agent[41309]: ovirt-ha-agent ovirt_hosted_engine_ha.agent.agent.Agent ERROR Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/agent/agent.py", line 191, in _run_agent
    return action(he)
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/agent/agent.py", line 64, in action_proper
    return he.start_monitoring()
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/agent/hosted_engine.py", line 423, in start_monitoring
    for old_state, state, delay in self.fsm:
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/lib/fsm/machine.py", line 127, in next
    new_data = self.refresh(self._state.data)
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/agent/state_machine.py", line 123, in refresh
    ] = self.hosted_engine.min_memory_threshold
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/agent/hosted_engine.py", line 183, in min_memory_threshold
    return int(self._config.get(config.VM, config.MEM_SIZE))
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/env/config.py", line 226, in get
    key
KeyError: 'Configuration value not found: file=/var/run/ovirt-hosted-engine-ha/vm.conf, key=memSize'
Nov 16 07:48:41 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.agent.Agent ERROR Trying to restart agent
Nov 16 07:49:01 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR Unable
to identify the OVF_STORE volume, falling back to initial vm.conf. Please
ensure you already added your first data domain for regular VMs
Nov 16 07:49:01 virt ovirt-ha-agent[41309]: ovirt-ha-agent
ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config ERROR
'version' is not stored
All the nodes resolve fine and have been added to /etc/hosts.
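In case it helps, my next step is to look at the fallback vm.conf directly and restart the HA services (I'm assuming the default path below, which matches the one in the error, and that restarting the broker before the agent is the right order):
[root@virt1 ~]# cat /var/run/ovirt-hosted-engine-ha/vm.conf
[root@virt1 ~]# systemctl restart ovirt-ha-broker ovirt-ha-agent
[root@virt1 ~]# hosted-engine --vm-status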
--
Kind Regards
Rudi Ahlers
Website: http://www.rudiahlers.co.za
where to change vnc password?
by Rudi Ahlers
Hi,
I installed the hosted engine on a VM, but missed the VNC password because of a
power failure. The hosted-engine VM is running now, but I don't know where
to view or change the VNC password. Can someone please tell me where to
change it?
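The closest thing I found is the hosted-engine option for setting a temporary console password; is this the right way (I'm assuming it also covers VNC consoles on 4.1)?
[root@virt1 ~]# hosted-engine --add-console-password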
--
Kind Regards
Rudi Ahlers
Website: http://www.rudiahlers.co.za
cannot remove gluster brick
by Rudi Ahlers
Hi,
I am trying to remove a brick from a server which is no longer part of the
gluster pool, but I keep running into errors for which I cannot find
answers on Google.
[root@virt2 ~]# gluster peer status
Number of Peers: 3
Hostname: srv1
Uuid: 2bed7e51-430f-49f5-afbc-06f8cec9baeb
State: Peer in Cluster (Disconnected)
Hostname: srv3
Uuid: 0e78793c-deca-4e3b-a36f-2333c8f91825
State: Peer in Cluster (Connected)
Hostname: srv4
Uuid: 1a6eedc6-59eb-4329-b091-2b9bc6f0834f
State: Peer in Cluster (Connected)
[root@virt2 ~]#
[root@virt2 ~]# gluster volume info data
Volume Name: data
Type: Replicate
Volume ID: d09e4534-8bc0-4b30-be89-bc1ec2b439c7
Status: Started
Snapshot Count: 0
Number of Bricks: 1 x 3 = 3
Transport-type: tcp
Bricks:
Brick1: srv1:/gluster/data/brick1
Brick2: srv2:/gluster/data/brick1
Brick3: srv3:/gluster/data/brick1
Options Reconfigured:
nfs.disable: on
transport.address-family: inet
performance.quick-read: off
performance.read-ahead: off
performance.io-cache: off
performance.low-prio-threads: 32
network.remote-dio: enable
cluster.eager-lock: enable
cluster.quorum-type: auto
cluster.server-quorum-type: server
cluster.data-self-heal-algorithm: full
cluster.locking-scheme: granular
cluster.shd-max-threads: 8
cluster.shd-wait-qlength: 10000
features.shard: on
user.cifs: off
storage.owner-uid: 36
storage.owner-gid: 36
features.shard-block-size: 512MB
[root@virt2 ~]# gluster volume remove-brick data replica 2 srv1:/gluster/data/brick1 start
volume remove-brick start: failed: Migration of data is not needed when reducing replica count. Use the 'force' option
[root@virt2 ~]# gluster volume remove-brick data replica 2 srv1:/gluster/data/brick1 commit
Removing brick(s) can result in data loss. Do you want to Continue? (y/n) y
volume remove-brick commit: failed: Brick srv1:/gluster/data/brick1 is not decommissioned. Use start or force option
The server virt1 is no longer part of the cluster.
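Given that both error messages point at it, I assume the missing step is the force option:
[root@virt2 ~]# gluster volume remove-brick data replica 2 srv1:/gluster/data/brick1 force
but I'd like to be sure this is safe before running it.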
--
Kind Regards
Rudi Ahlers
Website: http://www.rudiahlers.co.za
Could not retrieve LUNs, please check your storage.
by luminal@interfree.it
Hi,
I'm trying to connect a new oVirt Engine (Version 4.1.2.2-1.el7.centos)
to a Dell MD3200i SAN.
I can discover the SAN from the new storage domain window, but after I
log in, instead of seeing the + symbol I get the "Could not retrieve
LUNs, please check your storage" error message.
I tried with both a raw LUN and a partitioned LUN, but still no luck.
Any idea what could cause the problem?
Here are some commands and logs:
[root@ov1 vdsm]# multipath -ll
36d4ae52000662da400006cbd59f95a20 dm-3 DELL ,MD32xxi
size=20G features='3 queue_if_no_path pg_init_retries 50' hwhandler='1
rdac' wp=rw
`-+- policy='round-robin 0' prio=11 status=active
|- 66:0:0:0 sdb 8:16 active ready running
`- 67:0:0:0 sdc 8:32 active ready running
[root@ov1 vdsm]# lsblk
NAME MAJ:MIN RM SIZE RO TYPE
MOUNTPOINT
sda 8:0 0 136.1G 0 disk
├─sda1 8:1 0 1G 0 part /boot
└─sda2 8:2 0 135.1G 0 part
├─rhel_ov1-root 253:0 0 50G 0 lvm /
├─rhel_ov1-swap 253:1 0 4G 0 lvm [SWAP]
└─rhel_ov1-home 253:2 0 81.1G 0 lvm /home
sdb 8:16 0 20G 0 disk
└─36d4ae52000662da400006cbd59f95a20 253:3 0 20G 0 mpath
sdc 8:32 0 20G 0 disk
└─36d4ae52000662da400006cbd59f95a20 253:3 0 20G 0 mpath
[root@ov1 vdsm]# dmsetup ls
rhel_ov1-home (253:2)
rhel_ov1-swap (253:1)
rhel_ov1-root (253:0)
36d4ae52000662da400006cbd59f95a20 (253:3)
[root@ov1 vdsm]# dmsetup table
rhel_ov1-home: 0 170123264 linear 8:2 8390656
rhel_ov1-swap: 0 8388608 linear 8:2 2048
rhel_ov1-root: 0 104857600 linear 8:2 178513920
36d4ae52000662da400006cbd59f95a20: 0 41943040 multipath 3
queue_if_no_path pg_init_retries 50 1 rdac 1 1 round-robin 0 2 1 8:16 1
8:32 1
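For completeness, the iSCSI sessions themselves can be inspected with the standard iscsiadm listing (output omitted here):
[root@ov1 vdsm]# iscsiadm -m session -P 3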
engine.log
2017-11-15 11:22:32,243Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DiscoverSendTargetsVDSCommand]
(default task-70) [e7e6c1fb-27f3-46a5-9863-e454bdcd1898] START,
DiscoverSendTargetsVDSCommand(HostName = ov1.foo.bar.org,
DiscoverSendTargetsVDSCommandParameters:{runAsync='true',
hostId='f1e4fdad-1cf1-473a-97cb-79e2641c2c86',
connection='StorageServerConnections:{id='null',
connection='10.1.8.200', iqn='null', vfsType='null',
mountOptions='null', nfsVersion='null', nfsRetrans='null',
nfsTimeo='null', iface='null', netIfaceName='null'}'}), log id: 19df6841
2017-11-15 11:22:33,436Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DiscoverSendTargetsVDSCommand]
(default task-70) [e7e6c1fb-27f3-46a5-9863-e454bdcd1898] FINISH,
DiscoverSendTargetsVDSCommand, return:
[StorageServerConnections:{id='null', connection='10.1.8.200',
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
vfsType='null', mountOptions='null', nfsVersion='null',
nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'},
StorageServerConnections:{id='null', connection='10.1.8.201',
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
vfsType='null', mountOptions='null', nfsVersion='null',
nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'},
StorageServerConnections:{id='null', connection='10.1.8.202',
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
vfsType='null', mountOptions='null', nfsVersion='null',
nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'},
StorageServerConnections:{id='null', connection='10.1.8.203',
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
vfsType='null', mountOptions='null', nfsVersion='null',
nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}],
log id: 19df6841
2017-11-15 11:22:46,654Z INFO
[org.ovirt.engine.core.bll.storage.connection.ConnectStorageToVdsCommand]
(default task-78) [bc8ed6d7-264a-43bf-a076-b15f05ef34b8] Running
command: ConnectStorageToVdsCommand internal: false. Entities affected :
ID: aaa00000-0000-0000-0000-123456789aaa Type: SystemAction group
CREATE_STORAGE_DOMAIN with role type ADMIN
2017-11-15 11:22:46,657Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(default task-78) [bc8ed6d7-264a-43bf-a076-b15f05ef34b8] START,
ConnectStorageServerVDSCommand(HostName = ov1.foo.bar.org,
StorageServerConnectionManagementVDSParameters:{runAsync='true',
hostId='f1e4fdad-1cf1-473a-97cb-79e2641c2c86',
storagePoolId='00000000-0000-0000-0000-000000000000',
storageType='ISCSI',
connectionList='[StorageServerConnections:{id='null',
connection='10.1.8.200',
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
vfsType='null', mountOptions='null', nfsVersion='null',
nfsRetrans='null', nfsTimeo='null', iface='null',
netIfaceName='null'}]'}), log id: 252cf708
2017-11-15 11:22:47,861Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(default task-78) [bc8ed6d7-264a-43bf-a076-b15f05ef34b8] FINISH,
ConnectStorageServerVDSCommand, return:
{00000000-0000-0000-0000-000000000000=0}, log id: 252cf708
2017-11-15 11:22:48,028Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] START,
GetDeviceListVDSCommand(HostName = ov1.foo.bar.org,
GetDeviceListVDSCommandParameters:{runAsync='true',
hostId='f1e4fdad-1cf1-473a-97cb-79e2641c2c86', storageType='ISCSI',
checkStatus='false', lunIds='null'}), log id: 23ff3186
2017-11-15 11:22:49,198Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] Command
'org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand'
return value '
LUNListReturn:{status='Status [code=0, message=Done]'}
status = unknown
vendorID = DELL
capacity = 21474836480
fwrev = 0820
discard_zeroes_data = 0
vgUUID =
pvsize =
pathlist:
[{initiatorname=default, connection=10.1.8.201,
iqn=iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44,
portal=1, user=null, password=dEk4n49Sa3og, port=3260},
{initiatorname=default, connection=10.1.8.200,
iqn=iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44,
portal=1, user=null, password=dEk4n49Sa3og, port=3260}]
logicalblocksize = 512
discard_max_bytes = 0
pathstatus:
[{type=iSCSI, physdev=sdb, capacity=21474836480, state=active, lun=0},
{type=iSCSI, physdev=sdc, capacity=21474836480, state=active, lun=0}]
devtype = iSCSI
physicalblocksize = 512
pvUUID =
serial = SDELL_MD32xxi_358001G
GUID = 36d4ae52000662da400006cbd59f95a20
productID = MD32xxi
'
2017-11-15 11:22:49,198Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] HostName =
ov1.foo.bar.org
2017-11-15 11:22:49,198Z ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] Failed in
'GetDeviceListVDS' method, for vds: 'ov1.foo.bar.org'; host:
'10.0.2.161': null
2017-11-15 11:22:49,198Z ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] Command
'GetDeviceListVDSCommand(HostName = ov1.foo.bar.org,
GetDeviceListVDSCommandParameters:{runAsync='true',
hostId='f1e4fdad-1cf1-473a-97cb-79e2641c2c86', storageType='ISCSI',
checkStatus='false', lunIds='null'})' execution failed: null
2017-11-15 11:22:49,199Z INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand]
(default task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] FINISH,
GetDeviceListVDSCommand, log id: 23ff3186
2017-11-15 11:22:49,199Z ERROR
[org.ovirt.engine.core.bll.storage.disk.lun.GetDeviceListQuery] (default
task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] Query
'GetDeviceListQuery' failed: EngineException:
java.lang.NullPointerException (Failed with error ENGINE and code 5001)
2017-11-15 11:22:49,199Z ERROR
[org.ovirt.engine.core.bll.storage.disk.lun.GetDeviceListQuery] (default
task-90) [8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2] Exception:
org.ovirt.engine.core.common.errors.EngineException: EngineException:
java.lang.NullPointerException (Failed with error ENGINE and code 5001)
    at org.ovirt.engine.core.bll.VdsHandler.handleVdsResult(VdsHandler.java:118) [bll.jar:]
    at org.ovirt.engine.core.bll.VDSBrokerFrontendImpl.runVdsCommand(VDSBrokerFrontendImpl.java:33) [bll.jar:]
    at org.ovirt.engine.core.bll.QueriesCommandBase.runVdsCommand(QueriesCommandBase.java:242) [bll.jar:]
    at org.ovirt.engine.core.bll.storage.disk.lun.GetDeviceListQuery.executeQueryCommand(GetDeviceListQuery.java:34) [bll.jar:]
    at org.ovirt.engine.core.bll.QueriesCommandBase.executeCommand(QueriesCommandBase.java:110) [bll.jar:]
    at org.ovirt.engine.core.dal.VdcCommandBase.execute(VdcCommandBase.java:33) [dal.jar:]
    at org.ovirt.engine.core.bll.executor.DefaultBackendQueryExecutor.execute(DefaultBackendQueryExecutor.java:14) [bll.jar:]
    at org.ovirt.engine.core.bll.Backend.runQueryImpl(Backend.java:582) [bll.jar:]
    at org.ovirt.engine.core.bll.Backend.runQuery(Backend.java:550) [bll.jar:]
    at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source) [:1.8.0_151]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_151]
    at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_151]
    at org.jboss.as.ee.component.ManagedReferenceMethodInterceptor.processInvocation(ManagedReferenceMethodInterceptor.java:52)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:437)
    at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.delegateInterception(Jsr299BindingsInterceptor.java:70) [wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.doMethodInterception(Jsr299BindingsInterceptor.java:80) [wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.processInvocation(Jsr299BindingsInterceptor.java:93) [wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:437)
    at org.ovirt.engine.core.bll.interceptors.CorrelationIdTrackerInterceptor.aroundInvoke(CorrelationIdTrackerInterceptor.java:13) [bll.jar:]
    at sun.reflect.GeneratedMethodAccessor77.invoke(Unknown Source) [:1.8.0_151]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_151]
    at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_151]
    at org.jboss.as.ee.component.ManagedReferenceLifecycleMethodInterceptor.processInvocation(ManagedReferenceLifecycleMethodInterceptor.java:89)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.invocationmetrics.ExecutionTimeInterceptor.processInvocation(ExecutionTimeInterceptor.java:43) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:437)
    at org.jboss.weld.ejb.AbstractEJBRequestScopeActivationInterceptor.aroundInvoke(AbstractEJBRequestScopeActivationInterceptor.java:73) [weld-core-impl-2.3.5.Final.jar:2.3.5.Final]
    at org.jboss.as.weld.ejb.EjbRequestScopeActivationInterceptor.processInvocation(EjbRequestScopeActivationInterceptor.java:83) [wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ee.concurrent.ConcurrentContextInterceptor.processInvocation(ConcurrentContextInterceptor.java:45) [wildfly-ee-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InitialInterceptor.processInvocation(InitialInterceptor.java:21)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61)
    at org.jboss.as.ee.component.interceptors.ComponentDispatcherInterceptor.processInvocation(ComponentDispatcherInterceptor.java:52)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.singleton.SingletonComponentInstanceAssociationInterceptor.processInvocation(SingletonComponentInstanceAssociationInterceptor.java:53) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.tx.CMTTxInterceptor.invokeInNoTx(CMTTxInterceptor.java:263) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.as.ejb3.tx.CMTTxInterceptor.supports(CMTTxInterceptor.java:374) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.as.ejb3.tx.CMTTxInterceptor.processInvocation(CMTTxInterceptor.java:243) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.interceptors.CurrentInvocationContextInterceptor.processInvocation(CurrentInvocationContextInterceptor.java:41) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.invocationmetrics.WaitTimeInterceptor.processInvocation(WaitTimeInterceptor.java:47) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.security.SecurityContextInterceptor.processInvocation(SecurityContextInterceptor.java:100) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.deployment.processors.StartupAwaitInterceptor.processInvocation(StartupAwaitInterceptor.java:22) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.interceptors.ShutDownInterceptorFactory$1.processInvocation(ShutDownInterceptorFactory.java:64) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ejb3.component.interceptors.LoggingInterceptor.processInvocation(LoggingInterceptor.java:67) [wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.as.ee.component.NamespaceContextInterceptor.processInvocation(NamespaceContextInterceptor.java:50)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.ContextClassLoaderInterceptor.processInvocation(ContextClassLoaderInterceptor.java:64)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InterceptorContext.run(InterceptorContext.java:356)
    at org.wildfly.security.manager.WildFlySecurityManager.doChecked(WildFlySecurityManager.java:636)
    at org.jboss.invocation.AccessCheckingInterceptor.processInvocation(AccessCheckingInterceptor.java:61)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.InterceptorContext.run(InterceptorContext.java:356)
    at org.jboss.invocation.PrivilegedWithCombinerInterceptor.processInvocation(PrivilegedWithCombinerInterceptor.java:80)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61)
    at org.jboss.as.ee.component.ViewService$View.invoke(ViewService.java:198)
    at org.jboss.as.ee.component.ViewDescription$1.processInvocation(ViewDescription.java:185)
    at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
    at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61)
    at org.jboss.as.ee.component.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:73)
    at org.ovirt.engine.core.common.interfaces.BackendLocal$$$view3.runQuery(Unknown Source) [common.jar:]
    at org.ovirt.engine.ui.frontend.server.gwt.GenericApiGWTServiceImpl.runQuery(GenericApiGWTServiceImpl.java:89)
    at org.ovirt.engine.ui.frontend.server.gwt.GenericApiGWTServiceImpl.runMultipleQueries(GenericApiGWTServiceImpl.java:123)
    at sun.reflect.GeneratedMethodAccessor93.invoke(Unknown Source) [:1.8.0_151]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_151]
    at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_151]
    at com.google.gwt.user.server.rpc.RPC.invokeAndEncodeResponse(RPC.java:561)
    at com.google.gwt.user.server.rpc.RemoteServiceServlet.processCall(RemoteServiceServlet.java:265)
    at com.google.gwt.user.server.rpc.RemoteServiceServlet.processPost(RemoteServiceServlet.java:305)
    at com.google.gwt.user.server.rpc.AbstractRemoteServiceServlet.doPost(AbstractRemoteServiceServlet.java:62)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) [jboss-servlet-api_3.1_spec-1.0.0.Final.jar:1.0.0.Final]
    at org.ovirt.engine.ui.frontend.server.gwt.GenericApiGWTServiceImpl.service(GenericApiGWTServiceImpl.java:77)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) [jboss-servlet-api_3.1_spec-1.0.0.Final.jar:1.0.0.Final]
    at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:85)
    at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
    at org.ovirt.engine.core.utils.servlet.HeaderFilter.doFilter(HeaderFilter.java:94) [utils.jar:]
    at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
    at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
    at org.ovirt.engine.ui.frontend.server.gwt.GwtCachingFilter.doFilter(GwtCachingFilter.java:132)
    at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
    at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
    at org.ovirt.engine.core.branding.BrandingFilter.doFilter(BrandingFilter.java:73) [branding.jar:]
    at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
    at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
    at org.ovirt.engine.core.utils.servlet.LocaleFilter.doFilter(LocaleFilter.java:66) [utils.jar:]
    at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
    at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
    at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
    at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
    at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
    at org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78)
    at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
    at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:131)
    at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
    at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
    at io.undertow.security.handlers.AuthenticationConstraintHandler.handleRequest(AuthenticationConstraintHandler.java:53)
    at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
    at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
    at io.undertow.servlet.handlers.security.ServletSecurityConstraintHandler.handleRequest(ServletSecurityConstraintHandler.java:59)
    at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
    at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
    at io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50)
    at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
    at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
    at org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61)
    at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
    at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
    at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:292)
    at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:81)
    at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:138)
    at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135)
    at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
    at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
    at io.undertow.servlet.api.LegacyThreadSetupActionWrapper$1.call(LegacyThreadSetupActionWrapper.java:44)
    at io.undertow.servlet.api.LegacyThreadSetupActionWrapper$1.call(LegacyThreadSetupActionWrapper.java:44)
    at io.undertow.servlet.api.LegacyThreadSetupActionWrapper$1.call(LegacyThreadSetupActionWrapper.java:44)
    at io.undertow.servlet.api.LegacyThreadSetupActionWrapper$1.call(LegacyThreadSetupActionWrapper.java:44)
    at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:272)
    at io.undertow.servlet.handlers.ServletInitialHandler.access$000(ServletInitialHandler.java:81)
    at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:104)
    at io.undertow.server.Connectors.executeRootHandler(Connectors.java:202)
    at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:805)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [rt.jar:1.8.0_151]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [rt.jar:1.8.0_151]
    at java.lang.Thread.run(Thread.java:748) [rt.jar:1.8.0_151]
Caused by: java.lang.NullPointerException
    at org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand.parseConnection(GetDeviceListVDSCommand.java:211) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand.parseLun(GetDeviceListVDSCommand.java:139) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand.parseLUNList(GetDeviceListVDSCommand.java:63) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.vdsbroker.GetDeviceListVDSCommand.executeVdsBrokerCommand(GetDeviceListVDSCommand.java:57) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.vdsbroker.VdsBrokerCommand.executeVDSCommand(VdsBrokerCommand.java:111) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.VDSCommandBase.executeCommand(VDSCommandBase.java:73) [vdsbroker.jar:]
    at org.ovirt.engine.core.dal.VdcCommandBase.execute(VdcCommandBase.java:33) [dal.jar:]
    at org.ovirt.engine.core.vdsbroker.vdsbroker.DefaultVdsCommandExecutor.execute(DefaultVdsCommandExecutor.java:14) [vdsbroker.jar:]
    at org.ovirt.engine.core.vdsbroker.ResourceManager.runVdsCommand(ResourceManager.java:407) [vdsbroker.jar:]
    ... 139 more
vdsm.log
2017-11-15 11:22:32,245+0000 INFO (jsonrpc/6) [vdsm.api] START
discoverSendTargets(con={'ipv6_enabled': False, 'connection':
u'10.1.8.200', 'password': '', 'port': u'3260', 'user': ''},
options=None) from=::ffff:10.0.2.161,43392,
flow_id=e7e6c1fb-27f3-46a5-9863-e454bdcd1898,
task_id=b770efcc-fb83-487a-8344-84c376f01ae3 (api:46)
2017-11-15 11:22:32,433+0000 INFO (jsonrpc/6) [vdsm.api] FINISH
discoverSendTargets return={'fullTargets': ['10.1.8.200:3260,1
iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'10.1.8.201:3260,1
iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'10.1.8.202:3260,2
iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'10.1.8.203:3260,2
iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44'],
'targets':
['iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44']}
from=::ffff:10.0.2.161,43392,
flow_id=e7e6c1fb-27f3-46a5-9863-e454bdcd1898,
task_id=b770efcc-fb83-487a-8344-84c376f01ae3 (api:52)
2017-11-15 11:22:32,434+0000 INFO (jsonrpc/6) [jsonrpc.JsonRpcServer]
RPC call ISCSIConnection.discoverSendTargets succeeded in 0.19 seconds
(__init__:539)
2017-11-15 11:22:36,183+0000 INFO (jsonrpc/0) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:22:40,606+0000 INFO (jsonrpc/7) [vdsm.api] START
repoStats(options=None) from=::ffff:10.0.2.161,43392,
task_id=c79b71ef-1bbe-4d46-be48-d739565938d6 (api:46)
2017-11-15 11:22:40,606+0000 INFO (jsonrpc/7) [vdsm.api] FINISH
repoStats return={} from=::ffff:10.0.2.161,43392,
task_id=c79b71ef-1bbe-4d46-be48-d739565938d6 (api:52)
2017-11-15 11:22:40,611+0000 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer]
RPC call Host.getStats succeeded in 0.01 seconds (__init__:539)
2017-11-15 11:22:41,571+0000 INFO (jsonrpc/1) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:22:44,078+0000 INFO (periodic/3) [vdsm.api] START
repoStats(options=None) from=internal,
task_id=8704275f-ccd1-4203-ade7-3db3727b17e1 (api:46)
2017-11-15 11:22:44,078+0000 INFO (periodic/3) [vdsm.api] FINISH
repoStats return={} from=internal,
task_id=8704275f-ccd1-4203-ade7-3db3727b17e1 (api:52)
2017-11-15 11:22:46,659+0000 INFO (jsonrpc/2) [vdsm.api] START
connectStorageServer(domType=3,
spUUID=u'00000000-0000-0000-0000-000000000000', conList=[{u'id':
u'00000000-0000-0000-0000-000000000000', u'connection': u'10.1.8.200',
u'iqn':
u'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
u'user': u'', u'tpgt': u'1', u'password': '********', u'port':
u'3260'}], options=None) from=::ffff:10.0.2.161,43392,
flow_id=bc8ed6d7-264a-43bf-a076-b15f05ef34b8,
task_id=784fea0b-c972-4226-8071-f9f5eca657b5 (api:46)
2017-11-15 11:22:46,689+0000 INFO (jsonrpc/2) [storage.ISCSI] iSCSI
iface.net_ifacename not provided. Skipping. (iscsi:590)
2017-11-15 11:22:46,858+0000 INFO (jsonrpc/2) [vdsm.api] FINISH
connectStorageServer return={'statuslist': [{'status': 0, 'id':
u'00000000-0000-0000-0000-000000000000'}]} from=::ffff:10.0.2.161,43392,
flow_id=bc8ed6d7-264a-43bf-a076-b15f05ef34b8,
task_id=784fea0b-c972-4226-8071-f9f5eca657b5 (api:52)
2017-11-15 11:22:46,858+0000 INFO (jsonrpc/2) [jsonrpc.JsonRpcServer]
RPC call StoragePool.connectStorageServer succeeded in 0.20 seconds
(__init__:539)
2017-11-15 11:22:48,030+0000 INFO (jsonrpc/3) [vdsm.api] START
getDeviceList(storageType=3, guids=(), checkStatus=False, options={})
from=::ffff:10.0.2.161,43392,
flow_id=8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2,
task_id=12ce2e46-7552-443a-b531-9817a601e814 (api:46)
2017-11-15 11:22:48,195+0000 INFO (jsonrpc/3) [vdsm.api] FINISH
getDeviceList return={'devList': [{'status': 'unknown', 'vendorID':
'DELL', 'capacity': '21474836480', 'fwrev': '0820',
'discard_zeroes_data': 0, 'vgUUID': '', 'pvsize': '', 'pathlist':
[{'initiatorname': 'default', 'connection': '10.1.8.201', 'iqn':
'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'portal': '1', 'user': None, 'password': '********', 'port': '3260'},
{'initiatorname': 'default', 'connection': '10.1.8.200', 'iqn':
'iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44',
'portal': '1', 'user': None, 'password': '********', 'port': '3260'}],
'logicalblocksize': '512', 'discard_max_bytes': 0, 'pathstatus':
[{'type': 'iSCSI', 'physdev': 'sdb', 'capacity': '21474836480', 'state':
'active', 'lun': '0'}, {'type': 'iSCSI', 'physdev': 'sdc', 'capacity':
'21474836480', 'state': 'active', 'lun': '0'}], 'devtype': 'iSCSI',
'physicalblocksize': '512', 'pvUUID': '', 'serial':
'SDELL_MD32xxi_358001G', 'GUID': '36d4ae52000662da400006cbd59f95a20',
'productID': 'MD32xxi'}]} from=::ffff:10.0.2.161,43392,
flow_id=8c71fc65-afed-4de9-b62e-cbfb8fcc5ea2,
task_id=12ce2e46-7552-443a-b531-9817a601e814 (api:52)
2017-11-15 11:22:48,195+0000 INFO (jsonrpc/3) [jsonrpc.JsonRpcServer]
RPC call Host.getDeviceList succeeded in 0.16 seconds (__init__:539)
2017-11-15 11:22:51,197+0000 INFO (jsonrpc/4) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:22:56,610+0000 INFO (jsonrpc/5) [vdsm.api] START
repoStats(options=None) from=::ffff:10.0.2.161,43392,
task_id=a35d1845-956f-4d63-9b83-ae3b074336df (api:46)
2017-11-15 11:22:56,610+0000 INFO (jsonrpc/5) [vdsm.api] FINISH
repoStats return={} from=::ffff:10.0.2.161,43392,
task_id=a35d1845-956f-4d63-9b83-ae3b074336df (api:52)
2017-11-15 11:22:56,615+0000 INFO (jsonrpc/5) [jsonrpc.JsonRpcServer]
RPC call Host.getStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:22:57,576+0000 INFO (jsonrpc/6) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:22:59,094+0000 INFO (periodic/2) [vdsm.api] START
repoStats(options=None) from=internal,
task_id=48877af7-0645-4c21-bbb2-ebeca4e6353a (api:46)
2017-11-15 11:22:59,095+0000 INFO (periodic/2) [vdsm.api] FINISH
repoStats return={} from=internal,
task_id=48877af7-0645-4c21-bbb2-ebeca4e6353a (api:52)
2017-11-15 11:23:06,210+0000 INFO (jsonrpc/0) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:12,611+0000 INFO (jsonrpc/7) [vdsm.api] START
repoStats(options=None) from=::ffff:10.0.2.161,43392,
task_id=425d52ca-43ca-4e25-ac87-e19bf1faf96c (api:46)
2017-11-15 11:23:12,612+0000 INFO (jsonrpc/7) [vdsm.api] FINISH
repoStats return={} from=::ffff:10.0.2.161,43392,
task_id=425d52ca-43ca-4e25-ac87-e19bf1faf96c (api:52)
2017-11-15 11:23:12,616+0000 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer]
RPC call Host.getStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:13,580+0000 INFO (jsonrpc/1) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:14,110+0000 INFO (periodic/0) [vdsm.api] START
repoStats(options=None) from=internal,
task_id=9ee18ca7-bb48-4abd-ac29-dd901d389162 (api:46)
2017-11-15 11:23:14,110+0000 INFO (periodic/0) [vdsm.api] FINISH
repoStats return={} from=internal,
task_id=9ee18ca7-bb48-4abd-ac29-dd901d389162 (api:52)
2017-11-15 11:23:21,222+0000 INFO (jsonrpc/2) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:28,613+0000 INFO (jsonrpc/3) [vdsm.api] START
repoStats(options=None) from=::ffff:10.0.2.161,43392,
task_id=0fec4f25-e4c6-4d9c-b3da-66c508a06d1c (api:46)
2017-11-15 11:23:28,613+0000 INFO (jsonrpc/3) [vdsm.api] FINISH
repoStats return={} from=::ffff:10.0.2.161,43392,
task_id=0fec4f25-e4c6-4d9c-b3da-66c508a06d1c (api:52)
2017-11-15 11:23:28,618+0000 INFO (jsonrpc/3) [jsonrpc.JsonRpcServer]
RPC call Host.getStats succeeded in 0.01 seconds (__init__:539)
2017-11-15 11:23:29,125+0000 INFO (periodic/1) [vdsm.api] START
repoStats(options=None) from=internal,
task_id=e753422c-6b63-4fa2-9878-afe9b47ca1c1 (api:46)
2017-11-15 11:23:29,125+0000 INFO (periodic/1) [vdsm.api] FINISH
repoStats return={} from=internal,
task_id=e753422c-6b63-4fa2-9878-afe9b47ca1c1 (api:52)
2017-11-15 11:23:29,585+0000 INFO (jsonrpc/4) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:36,236+0000 INFO (jsonrpc/5) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:44,142+0000 INFO (periodic/0) [vdsm.api] START
repoStats(options=None) from=internal,
task_id=fbfaae38-6998-41f3-9979-3660ddb7b7c1 (api:46)
2017-11-15 11:23:44,142+0000 INFO (periodic/0) [vdsm.api] FINISH
repoStats return={} from=internal,
task_id=fbfaae38-6998-41f3-9979-3660ddb7b7c1 (api:52)
2017-11-15 11:23:44,621+0000 INFO (jsonrpc/6) [vdsm.api] START
repoStats(options=None) from=::ffff:10.0.2.161,43392,
task_id=5bd875c9-b457-4b7c-97c1-874b406e688e (api:46)
2017-11-15 11:23:44,622+0000 INFO (jsonrpc/6) [vdsm.api] FINISH
repoStats return={} from=::ffff:10.0.2.161,43392,
task_id=5bd875c9-b457-4b7c-97c1-874b406e688e (api:52)
2017-11-15 11:23:44,626+0000 INFO (jsonrpc/6) [jsonrpc.JsonRpcServer]
RPC call Host.getStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:45,589+0000 INFO (jsonrpc/0) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
2017-11-15 11:23:51,241+0000 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer]
RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:539)
supervdsm.log
MainProcess|jsonrpc/2::DEBUG::2017-11-15
11:22:46,771::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call hbaRescan with () {}
MainProcess|jsonrpc/2::DEBUG::2017-11-15
11:22:46,772::commands::69::storage.HBA::(execCmd) /usr/bin/taskset
--cpu-list 0-7 /usr/libexec/vdsm/fc-scan (cwd None)
MainProcess|jsonrpc/2::DEBUG::2017-11-15
11:22:46,823::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return hbaRescan with None
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,077::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call hbaRescan with () {}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,077::commands::69::storage.HBA::(execCmd) /usr/bin/taskset
--cpu-list 0-7 /usr/libexec/vdsm/fc-scan (cwd None)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,129::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return hbaRescan with None
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,164::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call getPathsStatus with () {}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,164::commands::69::storage.Misc.excCmd::(execCmd)
/usr/bin/taskset --cpu-list 0-7 /usr/sbin/dmsetup status (cwd None)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,168::commands::93::storage.Misc.excCmd::(execCmd) SUCCESS:
<err> = ''; <rc> = 0
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,169::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return getPathsStatus with {'sdb': 'active', 'sdc': 'active'}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,169::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call getScsiSerial with ('dm-3',) {}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,169::commands::69::storage.Misc.excCmd::(execCmd)
/usr/bin/taskset --cpu-list 0-7 /usr/lib/udev/scsi_id --page=0x80
--whitelisted --export --replace-whitespace --device=/dev/dm-3 (cwd
None)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,172::commands::93::storage.Misc.excCmd::(execCmd) SUCCESS:
<err> = ''; <rc> = 0
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,173::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return getScsiSerial with SDELL_MD32xxi_358001G
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,174::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call readSessionInfo with (56,) {}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,175::commands::69::storage.Misc.excCmd::(execCmd)
/usr/bin/taskset --cpu-list 0-7 /usr/sbin/iscsiadm -m iface -I default
(cwd None)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,183::commands::93::storage.Misc.excCmd::(execCmd) SUCCESS:
<err> = ''; <rc> = 0
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,174::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return readSessionInfo with IscsiSession(id=56, iface=<IscsiInterface
name='default' transport='tcp' netIfaceName='None'>,
target=IscsiTarget(portal=IscsiPortal(hostname='10.1.8.201', port=3260),
tpgt=1,
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44'),
credentials=<vdsm.storage.iscsi.ChapCredentials object at
0x7f4818169cd0>)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,185::supervdsmServer::90::SuperVdsm.ServerCallback::(wrapper)
call readSessionInfo with (57,) {}
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,185::commands::69::storage.Misc.excCmd::(execCmd)
/usr/bin/taskset --cpu-list 0-7 /usr/sbin/iscsiadm -m iface -I default
(cwd None)
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,194::commands::93::storage.Misc.excCmd::(execCmd) SUCCESS:
<err> = ''; <rc> = 0
MainProcess|jsonrpc/3::DEBUG::2017-11-15
11:22:48,185::supervdsmServer::97::SuperVdsm.ServerCallback::(wrapper)
return readSessionInfo with IscsiSession(id=57, iface=<IscsiInterface
name='default' transport='tcp' netIfaceName='None'>,
target=IscsiTarget(portal=IscsiPortal(hostname='10.1.8.200', port=3260),
tpgt=1,
iqn='iqn.1984-05.com.dell:powervault.md3200i.6d4ae5200063fd45000000004f0c6e44'),
credentials=<vdsm.storage.iscsi.ChapCredentials object at
0x7f48181699d0>)
Thanks,
Alberto.
Trouble Connecting ISCSI Storage to Hosted Engine
by Kyle Conti
Hello,
I'm brand new to oVirt and trying to get my hosted-engine setup configured with iSCSI storage. I have ~8TB of usable storage available on an LVM partition. This storage is on the same server that is hosting the oVirt engine virtual machine. After I run the discovery/sendtargets command from the CentOS 7 engine VM, it shows the correct IQN. When I use oVirt's storage discovery in the GUI, I can see the storage IQN just fine as well, but when I try to connect to it, I get the following:
"Error while executing action: Failed to login to iSCSI node due to authorization failure"
Is NFS recommended instead when trying to connect storage from the server host to the oVirt engine VM? There is nothing in this storage domain yet; this is a brand new setup.
One other thing to note: I have iSCSI storage working with a NAS for my ISO storage domain. I don't want to use the NAS for the virtual machine storage domain. What's so different about the oVirt engine VM?
Any help would be much appreciated. Please let me know if I'm taking the wrong approach here, or if I'm trying to do something this system is not meant to do.
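For reference, beyond discovery, the manual steps I was going to try from the engine VM look roughly like this (the portal IP, IQN, and CHAP credentials below are placeholders, and I'm assuming CHAP is what the target expects):
# iscsiadm -m discovery -t sendtargets -p 192.0.2.10:3260
# iscsiadm -m node -T iqn.2000-01.com.example:target -p 192.0.2.10:3260 --op update -n node.session.auth.authmethod -v CHAP
# iscsiadm -m node -T iqn.2000-01.com.example:target -p 192.0.2.10:3260 --op update -n node.session.auth.username -v chapuser
# iscsiadm -m node -T iqn.2000-01.com.example:target -p 192.0.2.10:3260 --op update -n node.session.auth.password -v chappass
# iscsiadm -m node -T iqn.2000-01.com.example:target -p 192.0.2.10:3260 --login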
Regards,
KC