Hi all,

I added a couple of new LVM tests, creating a VG based on a loop
device and performing several operations on the LVs.

All the tests pass on my laptop (Fedora 24, no special configuration)
and on my RHEL host (7.3 beta, configured for vdsm).

On the CI, all the tests fail with the error below.

It seems that the first issue is that use_lvmetad = 1 is set in
/etc/lvm/lvm.conf, but lvm2-lvmetad.socket is broken.
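
A quick way to detect this mismatch (a hypothetical sketch, not part of
the patch) is to check whether lvm.conf enables lvmetad while the socket
the lvm tools connect to is missing:

    import os
    import re

    def lvmetad_misconfigured(conf="/etc/lvm/lvm.conf"):
        # lvm.conf tells the lvm tools to use the lvmetad daemon...
        with open(conf) as f:
            enabled = re.search(r"^\s*use_lvmetad\s*=\s*1", f.read(), re.M)
        # ...but the socket they connect to does not exist, which
        # produces the "connect failed" errors seen in the logs below.
        return bool(enabled) and not os.path.exists("/run/lvm/lvmetad.socket")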

The test code tries to run pvscan --cache only if

    systemctl status lvm2-lvmetad.socket

succeeds, but later we get:

Error: Command ('pvscan', '--cache') failed with rc=5 out='' err='  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n  Cannot proceed since lvmetad is not active.\n'
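
For reference, the guard is roughly this (a simplified sketch; the
actual check is in tests/storage_lvm_test.py):

    import subprocess

    def lvmetad_socket_looks_active():
        # systemctl exits with 0 when the unit reports active; on the
        # CI slave this succeeds even though the socket is broken.
        return subprocess.call(
            ["systemctl", "status", "lvm2-lvmetad.socket"]) == 0

    def refresh_lvm_cache():
        if lvmetad_socket_looks_active():
            # This is the call that fails with rc=5 on the CI.
            subprocess.check_call(["pvscan", "--cache"])

So systemctl reports the socket unit as active even though
/run/lvm/lvmetad.socket is missing, which seems to be what happens on
the CI slaves.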

This error currently hides the previous failure:

07:04:30 2016-09-25 07:02:55,632 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 lvcreate -n ovirt-lv-1 -L 128m ovirt-vg (cwd None)
07:04:30 2016-09-25 07:02:55,714 DEBUG   [root] (MainThread) FAILED: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n  /dev/ovirt-vg/ovirt-lv-1: not found: device not cleared\n  Aborting. Failed to wipe start of new LV.\n'; <rc> = 5

But the two failures look related.

Is it possible to change the CI configuration? Do we have similar tests
running on the CI?

See https://gerrit.ovirt.org/#/c/64367/
and the following patches in this topic.

Here are all the tests; they all fail in the CI in the same way:
https://gerrit.ovirt.org/#/c/64370/3/tests/storage_lvm_test.py

I can filter out these tests on the CI, but I would like them to run automatically.

Thanks,
Nir

07:04:30 ======================================================================
07:04:30 ERROR: test_deactivate_unused_ovirt_lvs (storage_lvm_test.TestDeactivation)
07:04:30 ----------------------------------------------------------------------
07:04:30 Traceback (most recent call last):
07:04:30   File "/home/jenkins/workspace/vdsm_master_check-patch-fc24-x86_64/vdsm/tests/testValidation.py", line 97, in wrapper
07:04:30     return f(*args, **kwargs)
07:04:30   File "/home/jenkins/workspace/vdsm_master_check-patch-fc24-x86_64/vdsm/tests/storage_lvm_test.py", line 74, in test_deactivate_unused_ovirt_lvs
07:04:30     run("vgchange", "-an", "ovirt-vg")
07:04:30   File "/usr/lib64/python2.7/contextlib.py", line 35, in __exit__
07:04:30     self.gen.throw(type, value, traceback)
07:04:30   File "/home/jenkins/workspace/vdsm_master_check-patch-fc24-x86_64/vdsm/tests/storage_lvm_test.py", line 129, in fake_env
07:04:30     run("pvscan", "--cache")
07:04:30   File "/home/jenkins/workspace/vdsm_master_check-patch-fc24-x86_64/vdsm/tests/storage_lvm_test.py", line 87, in run
07:04:30     raise cmdutils.Error(cmd, rc, out, err)
07:04:30 Error: Command ('pvscan', '--cache') failed with rc=5 out='' err='  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n  Cannot proceed since lvmetad is not active.\n'
07:04:30 -------------------- >> begin captured logging << --------------------
07:04:30 2016-09-25 07:02:55,386 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 losetup --find --show /var/tmp/tmpRGpRw9/backing_file (cwd None)
07:04:30 2016-09-25 07:02:55,400 DEBUG   [root] (MainThread) SUCCESS: <err> = ''; <rc> = 0
07:04:30 2016-09-25 07:02:55,400 DEBUG   [test] (MainThread) Using loop device /dev/loop0
07:04:30 2016-09-25 07:02:55,401 DEBUG   [test] (MainThread) Creating ovirt lvs
07:04:30 2016-09-25 07:02:55,401 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 pvcreate -ff /dev/loop0 (cwd None)
07:04:30 2016-09-25 07:02:55,495 DEBUG   [root] (MainThread) SUCCESS: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n'; <rc> = 0
07:04:30 2016-09-25 07:02:55,495 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 vgcreate ovirt-vg /dev/loop0 (cwd None)
07:04:30 2016-09-25 07:02:55,589 DEBUG   [root] (MainThread) SUCCESS: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n'; <rc> = 0
07:04:30 2016-09-25 07:02:55,589 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 vgchange --addtag RHAT_storage_domain (cwd None)
07:04:30 2016-09-25 07:02:55,631 DEBUG   [root] (MainThread) SUCCESS: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n'; <rc> = 0
07:04:30 2016-09-25 07:02:55,632 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 lvcreate -n ovirt-lv-1 -L 128m ovirt-vg (cwd None)
07:04:30 2016-09-25 07:02:55,714 DEBUG   [root] (MainThread) FAILED: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n  /dev/ovirt-vg/ovirt-lv-1: not found: device not cleared\n  Aborting. Failed to wipe start of new LV.\n'; <rc> = 5
07:04:30 2016-09-25 07:02:55,714 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 losetup --detach /dev/loop0 (cwd None)
07:04:30 2016-09-25 07:02:55,734 DEBUG   [root] (MainThread) SUCCESS: <err> = ''; <rc> = 0
07:04:30 2016-09-25 07:02:55,735 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 /sbin/udevadm settle --timeout=5 (cwd None)
07:04:30 2016-09-25 07:02:55,746 DEBUG   [root] (MainThread) SUCCESS: <err> = ''; <rc> = 0
07:04:30 2016-09-25 07:02:55,746 DEBUG   [root] (MainThread) /usr/bin/taskset --cpu-list 0-3 pvscan --cache (cwd None)
07:04:30 2016-09-25 07:02:55,758 DEBUG   [root] (MainThread) FAILED: <err> = '  /run/lvm/lvmetad.socket: connect failed: No such file or directory\n  WARNING: Failed to connect to lvmetad. Falling back to internal scanning.\n  Cannot proceed since lvmetad is not active.\n'; <rc> = 5
07:04:43 --------------------- >> end captured logging << ---------------------