
Thanks Alexey. Nir/Adam/Raz - any initial thoughts on this? Have you encountered this issue before?
Regards,
Liron

On Wed, Apr 5, 2017 at 12:21 PM, Николаев Алексей <alexeynikolaev.post@yandex.ru> wrote:
Hi Liron,
Well, this error appears on a freshly installed oVirt hosted engine 4.1.1.6-1 with oVirt Node NG 4.1. There was no major engine version upgrade; Node NG was updated from 4.1.0 to 4.1.1 via the ovirt-node-ng-image* pkgs.
vdsm version is:
Installed Packages
Name    : vdsm
Arch    : x86_64
Version : 4.19.10
Release : 1.el7.centos
I also installed an oVirt 3.6 hosted engine on another server (CentOS 7.2 minimal, not updated to 7.3) and added the same iSCSI target as a data domain. That worked fine. I then updated the engine from 3.6 to 4.1; the iSCSI data domain was still OK after the update, and further iSCSI targets were added as data domains without errors.
vdsm version is:
Installed Packages
Name    : vdsm
Arch    : noarch
Version : 4.17.32
Release : 0.el7.centos
04.04.2017, 16:48, "Liron Aravot" <laravot@redhat.com>:
Hi Alexey,
Looking at your logs - it does indeed look weird: the VG is created, and seconds later we fail to get it when attempting to create the domain.
Did you upgrade/downgrade only the engine version, or the vdsm version as well?
2017-04-03 11:16:39,595+0300 INFO (jsonrpc/4) [dispatcher] Run and protect: createVG(vgname=u'3050e121-83b5-474e-943f-6dc52af38a17', devlist=[u'3600140505d11d22e61f46a6ba4a7bdc7'], force=False, options=None) (logUtils:51)
output error', ' /dev/mapper/36001405b7dd7b800e7049ecbf6830637: read failed after 0 of 4096 at 4294967230464: Input/output error', ' /dev/mapper/36001405b7dd7b800e7049ecbf6830637: read failed after 0 of 4096 at 4294967287808: Input/output error', ' WARNING: Error counts reached a limit of 3. Device /dev/mapper/36001405b7dd7b800e7049ecbf6830637 was disabled', ' Failed to find physical volume "/dev/mapper/3600140505d11d22e61f46a6ba4a7bdc7".'] (lvm:323)
2017-04-03 11:16:38,832+0300 WARN (jsonrpc/5) [storage.HSM] getPV failed for guid: 3600140505d11d22e61f46a6ba4a7bdc7 (hsm:1973)
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/hsm.py", line 1970, in _getDeviceList
    pv = lvm.getPV(guid)
  File "/usr/share/vdsm/storage/lvm.py", line 853, in getPV
    raise se.((pvName,))
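If it helps narrow this down, here is a rough sketch (an illustration only, not something that ships with vdsm) of what I would run on the host right after createVG reports success, to see whether the new PV/VG flaps in and out of view. The WWID and VG name below are simply copied from the log excerpt above.

# Rough debugging sketch (illustration only, not part of vdsm): re-run the
# same LVM/multipath lookups a few times right after createVG succeeds,
# to see whether the new PV/VG appears and then disappears on the host.
# WWID and VG name are copied from the log excerpt above.
import subprocess
import time

WWID = "3600140505d11d22e61f46a6ba4a7bdc7"
DEVICE = "/dev/mapper/" + WWID
VG_NAME = "3050e121-83b5-474e-943f-6dc52af38a17"

def run(cmd):
    # Run a command and print a timestamp, its exit code and its output.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = proc.communicate()
    print("%s %s -> rc=%d" % (time.strftime("%H:%M:%S"), " ".join(cmd), proc.returncode))
    print(out.decode("utf-8", "replace").strip())

for _ in range(5):
    run(["multipath", "-ll", WWID])  # multipath state of the LUN
    run(["pvs", DEVICE])             # does LVM still see the PV?
    run(["vgs", VG_NAME])            # does the newly created VG show up?
    time.sleep(2)

If pvs/vgs lose the PV/VG between iterations while multipath still reports the paths as active, that would point at something on the host side rather than at the target itself.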
On Tue, Apr 4, 2017 at 3:33 PM, Николаев Алексей <alexeynikolaev.post@yandex.ru> wrote:
Last time I updated from oVirt 3.6.7.5-1 to oVirt 4.1.1.6-1. The main iSCSI data domain is OK, and another iSCSI data domain was added successfully.
The problem occurs only with a freshly installed oVirt 4.1.1.6-1.
Logs taken after updating oVirt and adding the new iSCSI data domains are attached.
03.04.2017, 17:26, "Николаев Алексей" <alexeynikolaev.post@yandex.ru>:
I also tested the iSCSI target again with oVirt 3.6.7.5-1 and a CentOS 7.2.1511 host; the iSCSI data domain was added successfully. "The truth is out there". Logs attached.
03.04.2017, 11:24, "Николаев Алексей" <alexeynikolaev.post@yandex.ru>:
Logs attached. Thx!
02.04.2017, 16:19, "Liron Aravot" <laravot@redhat.com>:
Hi Alexey, can you please attach the engine/vdsm logs?
thanks.
On Fri, Mar 31, 2017 at 11:08 AM, Николаев Алексей <alexeynikolaev.post@yandex.ru> wrote:
Hi, community!
I'm trying to use an iSCSI data domain with oVirt 4.1.1.6-1 and oVirt Node 4.1.1, but I get this error while adding the data domain.
Error while executing action New SAN Storage Domain: Cannot deactivate Logical Volume
2017-03-31 10:47:45,099+03 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.CreateStorageDomainVDSCommand] (default task-41) [717af17f] Command 'CreateStorageDomainVDSCommand(HostName = node169-07., CreateStorageDomainVDSCommandParameters:{runAsync='true', hostId='ca177a14-56ef-4736-a0a9-ab9dc2a8eb90', storageDomain='StorageDomainStatic:{name='data1', id='56038842-2fbe-4ada-96f1-c4b4e66fd0b7'}', args='sUG5rL-zqvF-DMox-uMK3-Xl91-ZiYw-k27C3B'})' execution failed: VDSGenericException: VDSErrorException: Failed in vdscommand to CreateStorageDomainVDS, error = Cannot deactivate Logical Volume: ('General Storage Exception: (\'5 [] [\\\' /dev/mapper/36001405b7dd7b800e7049ecbf6830637: read failed after 0 of 4096 at 0: Input/output error\\\', \\\' /dev/mapper/36001405b7dd7b800e7049ecbf6830637: read failed after 0 of 4096 at 4294967230464: Input/output error\\\', \\\' /dev/mapper/36001405b7dd7b800e7049ecbf6830637: read failed after 0 of 4096 at 4294967287808: Input/output error\\\', \\\' WARNING: Error counts reached a limit of 3. Device /dev/mapper/36001405b7dd7b800e7049ecbf6830637 was disabled\\\', \\\' WARNING: Error counts reached a limit of 3. Device /dev/mapper/36001405b7dd7b800e7049ecbf6830637 was disabled\\\', \\\' Volume group "56038842-2fbe-4ada-96f1-c4b4e66fd0b7" not found\\\', \\\' Cannot process volume group 56038842-2fbe-4ada-96f1-c4b4e66fd0b7\\\']\\n56038842-2fbe-4ada-96f1-c4b4e66fd0b7/[\\\'master\\\']\',)',)
Previously, this iSCSI data domain worked well with oVirt 3.6 and a CentOS 7.2 host. How can I debug this error?
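One way to narrow it down might be to read the LUN directly at the byte offsets mentioned in the error above, outside of LVM. Below is a minimal sketch (nothing oVirt-specific; the device path and offsets are taken straight from the error message), to be run as root on the host:

# Rough sketch (illustration only): read the LUN directly at the byte
# offsets from the error above (0, 4294967230464, 4294967287808),
# bypassing LVM, to tell a plain multipath/target I/O problem apart from
# an oVirt/vdsm problem. The device path is the one from the error message.
import os

DEVICE = "/dev/mapper/36001405b7dd7b800e7049ecbf6830637"
OFFSETS = [0, 4294967230464, 4294967287808]  # byte offsets from the LVM errors

fd = os.open(DEVICE, os.O_RDONLY)
try:
    for offset in OFFSETS:
        try:
            os.lseek(fd, offset, os.SEEK_SET)
            data = os.read(fd, 4096)
            print("offset %d: read %d bytes OK" % (offset, len(data)))
        except OSError as err:
            # An I/O error here means the device itself is unreadable at this
            # offset, i.e. the problem is below oVirt (target, network or multipath).
            print("offset %d: %s" % (offset, err))
finally:
    os.close(fd)

If these reads already fail with Input/output error, the problem is below oVirt (iSCSI target, network or multipath) rather than in the engine or vdsm.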
_______________________________________________ Users mailing list Users@ovirt.org http://lists.ovirt.org/mailman/listinfo/users