Are you mounted with discard? Perhaps fstrim?
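For reference, both suggestions can be checked quickly from a shell. This is a minimal sketch, assuming any Linux host; reading /proc/mounts needs no privileges, but fstrim itself must run as root:

```shell
# Check whether any filesystem is mounted with the 'discard' option
# (readable without root via /proc/mounts).
if grep -q 'discard' /proc/mounts; then
  msg="at least one filesystem is mounted with discard"
else
  msg="no discard mounts; a periodic 'fstrim -av' (as root) would release unused blocks back to the thin pool"
fi
echo "$msg"
```

Without discard or periodic fstrim, blocks freed inside the filesystems are never returned to the thin pool, so the pool's Data% only ever grows.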
On Mon, Jul 2, 2018 at 10:23 PM, Matt Simonsen <matt(a)khoza.com> wrote:
Yes, it shows 8g on the VG
I removed the LV for /var/crash, then installed again, and it is still
failing on the step:
2018-07-02 12:21:10,015 [DEBUG] (MainThread) Calling: (['lvcreate',
'--thin', '--virtualsize', u'53750005760B', '--name',
'ovirt-node-ng-4.2.4-0.20180626.0', u'onn_node1-g8-h4/pool00'],)
{'close_fds': True, 'stderr': -2}
2018-07-02 12:21:10,069 [DEBUG] (MainThread) Exception! Cannot create
new thin volume, free space in thin pool onn_node1-g8-h4/pool00 reached
threshold.
2018-07-02 12:21:10,069 [DEBUG] (MainThread) Calling binary: (['umount',
'-l', u'/tmp/mnt.ZYOjC'],) {}
Thanks
Matt
On 07/02/2018 10:55 AM, Yuval Turgeman wrote:
Not in front of my laptop, so it's a little hard to read, but does it say 8g
free on the VG?
On Mon, Jul 2, 2018, 20:00 Matt Simonsen <matt(a)khoza.com> wrote:
> This error adds some clarity.
>
> That said, I'm a bit unsure how space can be the issue, given that several
> hundred GB of storage in the thin pool is unused...
>
> How do you suggest I proceed?
>
> Thank you for your help,
>
> Matt
>
>
> [root@node6-g8-h4 ~]# lvs
>   LV                                   VG              Attr       LSize   Pool   Origin                             Data%  Meta%  Move Log Cpy%Sync Convert
>   home                                 onn_node1-g8-h4 Vwi-aotz--   1.00g pool00                                     4.79
>   ovirt-node-ng-4.2.2-0.20180423.0     onn_node1-g8-h4 Vwi---tz-k <50.06g pool00 root
>   ovirt-node-ng-4.2.2-0.20180423.0+1   onn_node1-g8-h4 Vwi---tz-- <50.06g pool00 ovirt-node-ng-4.2.2-0.20180423.0
>   ovirt-node-ng-4.2.3.1-0.20180530.0   onn_node1-g8-h4 Vri---tz-k <50.06g pool00
>   ovirt-node-ng-4.2.3.1-0.20180530.0+1 onn_node1-g8-h4 Vwi-aotz-- <50.06g pool00 ovirt-node-ng-4.2.3.1-0.20180530.0  6.95
>   pool00                               onn_node1-g8-h4 twi-aotz--  <1.30t                                           76.63  50.34
>   root                                 onn_node1-g8-h4 Vwi---tz-- <50.06g pool00
>   tmp                                  onn_node1-g8-h4 Vwi-aotz--   1.00g pool00                                     5.04
>   var                                  onn_node1-g8-h4 Vwi-aotz--  15.00g pool00                                     5.86
>   var_crash                            onn_node1-g8-h4 Vwi---tz--  10.00g pool00
>   var_local_images                     onn_node1-g8-h4 Vwi-aotz--   1.10t pool00                                    89.72
>   var_log                              onn_node1-g8-h4 Vwi-aotz--   8.00g pool00                                     6.84
>   var_log_audit                        onn_node1-g8-h4 Vwi-aotz--   2.00g pool00                                     6.16
> [root@node6-g8-h4 ~]# vgs
>   VG              #PV #LV #SN Attr   VSize  VFree
>   onn_node1-g8-h4   1  13   0 wz--n- <1.31t 8.00g
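The lvs output above is worth turning into numbers. imgbased asks for a new ~50 GiB thin volume while pool00 sits at 76.63% data usage; lvcreate refuses to create a thin LV once the pool's data or metadata usage reaches the autoextend threshold configured in lvm.conf (activation/thin_pool_autoextend_threshold), so a threshold set below the current usage would explain a refusal even with plenty of free space. A minimal sketch of the arithmetic, with the pool size and Data% hardcoded from the output above (the "<1.30t" size is approximated in bytes):

```shell
# Values copied from the lvs output above (assumption: "<1.30t" ~= 1.30 TiB).
pool_bytes=1429365116108   # ~1.30 TiB in bytes
data_pct=76.63             # Data% reported for pool00
# Physical space still free in the pool, in whole GiB:
free_gib=$(awk -v s="$pool_bytes" -v p="$data_pct" \
    'BEGIN { printf "%d", s * (100 - p) / 100 / 1024 / 1024 / 1024 }')
echo "pool00 free: ${free_gib} GiB"
```

So roughly 300 GiB of physical space remains; if the threshold check is what trips, the failure is a policy limit rather than the pool actually being full.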
>
>
> 2018-06-29 14:19:31,142 [DEBUG] (MainThread) Version: imgbased-1.0.20
> 2018-06-29 14:19:31,147 [DEBUG] (MainThread) Arguments:
> Namespace(FILENAME='/usr/share/ovirt-node-ng/image//ovirt-node-ng-4.2.0-0.20180626.0.el7.squashfs.img',
> command='update', debug=True, experimental=False, format='liveimg', stream='Image')
> 2018-06-29 14:19:31,147 [INFO] (MainThread) Extracting image
> '/usr/share/ovirt-node-ng/image//ovirt-node-ng-4.2.0-0.20180626.0.el7.squashfs.img'
> 2018-06-29 14:19:31,148 [DEBUG] (MainThread) Calling binary: (['mktemp',
> '-d', '--tmpdir', 'mnt.XXXXX'],) {}
> 2018-06-29 14:19:31,148 [DEBUG] (MainThread) Calling: (['mktemp', '-d',
> '--tmpdir', 'mnt.XXXXX'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,150 [DEBUG] (MainThread) Returned: /tmp/mnt.1OhaU
> 2018-06-29 14:19:31,151 [DEBUG] (MainThread) Calling binary: (['mount',
> '/usr/share/ovirt-node-ng/image//ovirt-node-ng-4.2.0-0.20180626.0.el7.squashfs.img',
> u'/tmp/mnt.1OhaU'],) {}
> 2018-06-29 14:19:31,151 [DEBUG] (MainThread) Calling: (['mount',
> '/usr/share/ovirt-node-ng/image//ovirt-node-ng-4.2.0-0.20180626.0.el7.squashfs.img',
> u'/tmp/mnt.1OhaU'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,157 [DEBUG] (MainThread) Returned:
> 2018-06-29 14:19:31,158 [DEBUG] (MainThread) Mounted squashfs
> 2018-06-29 14:19:31,158 [DEBUG] (MainThread) Found fsimage at
> '/tmp/mnt.1OhaU/LiveOS/rootfs.img'
> 2018-06-29 14:19:31,159 [DEBUG] (MainThread) Calling binary: (['mktemp',
> '-d', '--tmpdir', 'mnt.XXXXX'],) {}
> 2018-06-29 14:19:31,159 [DEBUG] (MainThread) Calling: (['mktemp', '-d',
> '--tmpdir', 'mnt.XXXXX'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,162 [DEBUG] (MainThread) Returned: /tmp/mnt.153do
> 2018-06-29 14:19:31,162 [DEBUG] (MainThread) Calling binary: (['mount',
> u'/tmp/mnt.1OhaU/LiveOS/rootfs.img', u'/tmp/mnt.153do'],) {}
> 2018-06-29 14:19:31,162 [DEBUG] (MainThread) Calling: (['mount',
> u'/tmp/mnt.1OhaU/LiveOS/rootfs.img', u'/tmp/mnt.153do'],)
> {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,177 [DEBUG] (MainThread) Returned:
> 2018-06-29 14:19:31,189 [DEBUG] (MainThread) Using nvr:
> ovirt-node-ng-4.2.4-0.20180626.0
> 2018-06-29 14:19:31,189 [DEBUG] (MainThread) Fetching image for '/'
> 2018-06-29 14:19:31,189 [DEBUG] (MainThread) Calling binary: (['findmnt',
> '--noheadings', '-o', 'SOURCE', '/'],) {}
> 2018-06-29 14:19:31,190 [DEBUG] (MainThread) Calling: (['findmnt',
> '--noheadings', '-o', 'SOURCE', '/'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,203 [DEBUG] (MainThread) Returned:
> /dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1
> 2018-06-29 14:19:31,204 [DEBUG] (MainThread) Found
> '/dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1'
> 2018-06-29 14:19:31,204 [DEBUG] (MainThread) Calling binary: (['lvs',
> '--noheadings', '--ignoreskippedcluster', '-ovg_name,lv_name',
> u'/dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1'],)
> {'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,204 [DEBUG] (MainThread) Calling: (['lvs',
> '--noheadings', '--ignoreskippedcluster', '-ovg_name,lv_name',
> u'/dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1'],)
> {'close_fds': True, 'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,283 [DEBUG] (MainThread) Returned: onn_node1-g8-h4
> ovirt-node-ng-4.2.3.1-0.20180530.0+1
> 2018-06-29 14:19:31,283 [DEBUG] (MainThread) Found LV for path
> /dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1:
> onn_node1-g8-h4 ovirt-node-ng-4.2.3.1-0.20180530.0+1
> 2018-06-29 14:19:31,283 [DEBUG] (MainThread) Found LV
> 'ovirt-node-ng-4.2.3.1-0.20180530.0+1' for path
> '/dev/mapper/onn_node1--g8--h4-ovirt--node--ng--4.2.3.1--0.20180530.0+1'
> 2018-06-29 14:19:31,284 [DEBUG] (MainThread) Calling binary: (['vgs',
> '--noheadings', '--ignoreskippedcluster', '--select', 'vg_tags = imgbased:vg',
> '-o', 'vg_name'],) {'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,284 [DEBUG] (MainThread) Calling: (['vgs',
> '--noheadings', '--ignoreskippedcluster', '--select', 'vg_tags = imgbased:vg',
> '-o', 'vg_name'],) {'close_fds': True, 'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,321 [DEBUG] (MainThread) Returned: onn_node1-g8-h4
> 2018-06-29 14:19:31,322 [DEBUG] (MainThread) Calling binary: (['lvs',
> '--noheadings', '--ignoreskippedcluster', '-osize', '--units', 'B',
> u'onn_node1-g8-h4/ovirt-node-ng-4.2.3.1-0.20180530.0+1'],)
> {'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,322 [DEBUG] (MainThread) Calling: (['lvs',
> '--noheadings', '--ignoreskippedcluster', '-osize', '--units', 'B',
> u'onn_node1-g8-h4/ovirt-node-ng-4.2.3.1-0.20180530.0+1'],)
> {'close_fds': True, 'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,355 [DEBUG] (MainThread) Returned: 53750005760B
> 2018-06-29 14:19:31,355 [DEBUG] (MainThread) Recommeneded base size:
> 53750005760B
> 2018-06-29 14:19:31,355 [INFO] (MainThread) Starting base creation
> 2018-06-29 14:19:31,355 [INFO] (MainThread) New base will be:
> ovirt-node-ng-4.2.4-0.20180626.0
> 2018-06-29 14:19:31,356 [DEBUG] (MainThread) Calling binary: (['vgs',
> '--noheadings', '--ignoreskippedcluster', '@imgbased:pool', '-o',
> 'lv_full_name'],) {'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,356 [DEBUG] (MainThread) Calling: (['vgs',
> '--noheadings', '--ignoreskippedcluster', '@imgbased:pool', '-o',
> 'lv_full_name'],) {'close_fds': True, 'stderr': <open file '/dev/null', mode 'w' at 0x7f56b787eed0>}
> 2018-06-29 14:19:31,381 [DEBUG] (MainThread) Returned:
> onn_node1-g8-h4/pool00
> 2018-06-29 14:19:31,381 [DEBUG] (MainThread) Pool: <LV
> 'onn_node1-g8-h4/pool00' />
> 2018-06-29 14:19:31,382 [DEBUG] (MainThread) Calling binary: (['lvcreate',
> '--thin', '--virtualsize', u'53750005760B', '--name',
> 'ovirt-node-ng-4.2.4-0.20180626.0', u'onn_node1-g8-h4/pool00'],) {}
> 2018-06-29 14:19:31,382 [DEBUG] (MainThread) Calling: (['lvcreate',
> '--thin', '--virtualsize', u'53750005760B', '--name',
> 'ovirt-node-ng-4.2.4-0.20180626.0', u'onn_node1-g8-h4/pool00'],)
> {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,406 [DEBUG] (MainThread) Exception! Cannot create
> new thin volume, free space in thin pool onn_node1-g8-h4/pool00 reached
> threshold.
>
> 2018-06-29 14:19:31,406 [DEBUG] (MainThread) Calling binary: (['umount',
> '-l', u'/tmp/mnt.153do'],) {}
> 2018-06-29 14:19:31,406 [DEBUG] (MainThread) Calling: (['umount', '-l',
> u'/tmp/mnt.153do'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,422 [DEBUG] (MainThread) Returned:
> 2018-06-29 14:19:31,422 [DEBUG] (MainThread) Calling binary: (['rmdir',
> u'/tmp/mnt.153do'],) {}
> 2018-06-29 14:19:31,422 [DEBUG] (MainThread) Calling: (['rmdir',
> u'/tmp/mnt.153do'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,425 [DEBUG] (MainThread) Returned:
> 2018-06-29 14:19:31,425 [DEBUG] (MainThread) Calling binary: (['umount',
> '-l', u'/tmp/mnt.1OhaU'],) {}
> 2018-06-29 14:19:31,425 [DEBUG] (MainThread) Calling: (['umount', '-l',
> u'/tmp/mnt.1OhaU'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,437 [DEBUG] (MainThread) Returned:
> 2018-06-29 14:19:31,437 [DEBUG] (MainThread) Calling binary: (['rmdir',
> u'/tmp/mnt.1OhaU'],) {}
> 2018-06-29 14:19:31,437 [DEBUG] (MainThread) Calling: (['rmdir',
> u'/tmp/mnt.1OhaU'],) {'close_fds': True, 'stderr': -2}
> 2018-06-29 14:19:31,440 [DEBUG] (MainThread) Returned:
> Traceback (most recent call last):
>   File "/usr/lib64/python2.7/runpy.py", line 162, in _run_module_as_main
>     "__main__", fname, loader, pkg_name)
>   File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code
>     exec code in run_globals
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/__main__.py", line 53, in <module>
>     CliApplication()
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/__init__.py", line 82, in CliApplication
>     app.hooks.emit("post-arg-parse", args)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/hooks.py", line 120, in emit
>     cb(self.context, *args)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/plugins/update.py", line 56, in post_argparse
>     base_lv, _ = LiveimgExtractor(app.imgbase).extract(args.FILENAME)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/plugins/update.py", line 118, in extract
>     "%s" % size, nvr)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/plugins/update.py", line 84, in add_base_with_tree
>     lvs)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/imgbase.py", line 310, in add_base
>     new_base_lv = pool.create_thinvol(new_base.lv_name, size)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/lvm.py", line 324, in create_thinvol
>     self.lvm_name])
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/utils.py", line 390, in lvcreate
>     return self.call(["lvcreate"] + args, **kwargs)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/utils.py", line 378, in call
>     stdout = call(*args, **kwargs)
>   File "/tmp/tmp.mzQBYouvWT/usr/lib/python2.7/site-packages/imgbased/utils.py", line 153, in call
>     return subprocess.check_output(*args, **kwargs).strip()
>   File "/usr/lib64/python2.7/subprocess.py", line 575, in check_output
>     raise CalledProcessError(retcode, cmd, output=output)
> subprocess.CalledProcessError: Command '['lvcreate', '--thin',
> '--virtualsize', u'53750005760B', '--name', 'ovirt-node-ng-4.2.4-0.20180626.0',
> u'onn_node1-g8-h4/pool00']' returned non-zero exit status 5
>
>
>
>
>
> On 07/02/2018 04:58 AM, Yuval Turgeman wrote:
>
> Looks like the upgrade script failed - can you please attach
> /var/log/imgbased.log or /tmp/imgbased.log ?
>
> Thanks,
> Yuval.
>
> On Mon, Jul 2, 2018 at 2:54 PM, Sandro Bonazzola <sbonazzo(a)redhat.com>
> wrote:
>
>> Yuval, can you please have a look?
>>
>> 2018-06-30 7:48 GMT+02:00 Oliver Riesener <Oliver.Riesener(a)hs-bremen.de>
>> :
>>
>>> Yes, here is the same.
>>>
>>> It seems the bootloader isn't configured right?
>>>
>>> I did the upgrade and rebooted to 4.2.4 from the UI and got:
>>>
>>> [root@ovn-monster ~]# nodectl info
>>> layers:
>>> ovirt-node-ng-4.2.4-0.20180626.0:
>>> ovirt-node-ng-4.2.4-0.20180626.0+1
>>> ovirt-node-ng-4.2.3.1-0.20180530.0:
>>> ovirt-node-ng-4.2.3.1-0.20180530.0+1
>>> ovirt-node-ng-4.2.3-0.20180524.0:
>>> ovirt-node-ng-4.2.3-0.20180524.0+1
>>> ovirt-node-ng-4.2.1.1-0.20180223.0:
>>> ovirt-node-ng-4.2.1.1-0.20180223.0+1
>>> bootloader:
>>> default: ovirt-node-ng-4.2.3-0.20180524.0+1
>>> entries:
>>> ovirt-node-ng-4.2.3-0.20180524.0+1:
>>> index: 0
>>> title: ovirt-node-ng-4.2.3-0.20180524.0
>>> kernel: /boot/ovirt-node-ng-4.2.3-0.20180524.0+1/vmlinuz-3.10.0-862.3.2.el7.x86_64
>>> args: "ro crashkernel=auto rd.lvm.lv=onn_ovn-monster/ovirt-node-ng-4.2.3-0.20180524.0+1 rd.lvm.lv=onn_ovn-monster/swap rd.md.uuid=c6c3013b:027a9346:67dfd181:89635587 rhgb quiet LANG=de_DE.UTF-8 img.bootid=ovirt-node-ng-4.2.3-0.20180524.0+1"
>>> initrd: /boot/ovirt-node-ng-4.2.3-0.20180524.0+1/initramfs-3.10.0-862.3.2.el7.x86_64.img
>>> root: /dev/onn_ovn-monster/ovirt-node-ng-4.2.3-0.20180524.0+1
>>> ovirt-node-ng-4.2.1.1-0.20180223.0+1:
>>> index: 1
>>> title: ovirt-node-ng-4.2.1.1-0.20180223.0
>>> kernel: /boot/ovirt-node-ng-4.2.1.1-0.20180223.0+1/vmlinuz-3.10.0-693.17.1.el7.x86_64
>>> args: "ro crashkernel=auto rd.lvm.lv=onn_ovn-monster/ovirt-node-ng-4.2.1.1-0.20180223.0+1 rd.lvm.lv=onn_ovn-monster/swap rd.md.uuid=c6c3013b:027a9346:67dfd181:89635587 rhgb quiet LANG=de_DE.UTF-8 img.bootid=ovirt-node-ng-4.2.1.1-0.20180223.0+1"
>>> initrd: /boot/ovirt-node-ng-4.2.1.1-0.20180223.0+1/initramfs-3.10.0-693.17.1.el7.x86_64.img
>>> root: /dev/onn_ovn-monster/ovirt-node-ng-4.2.1.1-0.20180223.0+1
>>> current_layer: ovirt-node-ng-4.2.3-0.20180524.0+1
>>> [root@ovn-monster ~]# uptime
>>> 07:35:27 up 2 days, 15:42, 1 user, load average: 1,07, 1,00, 0,95
>>>
>>> Am 29.06.2018 um 23:53 schrieb Matt Simonsen <matt(a)khoza.com>:
>>>
>>> Hello,
>>>
>>> I ran yum updates on 2 of my oVirt 4.2.3 nodes running the prebuilt
>>> node platform, and it doesn't appear the updates worked.
>>>
>>>
>>> [root@node6-g8-h4 ~]# yum update
>>> Loaded plugins: enabled_repos_upload, fastestmirror, imgbased-persist,
>>> : package_upload, product-id, search-disabled-repos,
>>> subscription-
>>> : manager
>>> This system is not registered with an entitlement server. You can use
>>> subscription-manager to register.
>>> Loading mirror speeds from cached hostfile
>>> * ovirt-4.2-epel: linux.mirrors.es.net
>>> Resolving Dependencies
>>> --> Running transaction check
>>> ---> Package ovirt-node-ng-image-update.noarch 0:4.2.3.1-1.el7 will be
>>> updated
>>> ---> Package ovirt-node-ng-image-update.noarch 0:4.2.4-1.el7 will be
>>> obsoleting
>>> ---> Package ovirt-node-ng-image-update-placeholder.noarch
>>> 0:4.2.3.1-1.el7 will be obsoleted
>>> --> Finished Dependency Resolution
>>>
>>> Dependencies Resolved
>>>
>>> =========================================================================================================================
>>>  Package                         Arch      Version          Repository      Size
>>> =========================================================================================================================
>>> Installing:
>>>  ovirt-node-ng-image-update     noarch    4.2.4-1.el7      ovirt-4.2      647 M
>>>      replacing  ovirt-node-ng-image-update-placeholder.noarch 4.2.3.1-1.el7
>>>
>>> Transaction Summary
>>> =========================================================================================================================
>>> Install 1 Package
>>>
>>> Total download size: 647 M
>>> Is this ok [y/d/N]: y
>>> Downloading packages:
>>> warning: /var/cache/yum/x86_64/7/ovirt-4.2/packages/ovirt-node-ng-image-update-4.2.4-1.el7.noarch.rpm:
>>> Header V4 RSA/SHA1 Signature, key ID fe590cb7: NOKEY
>>> Public key for ovirt-node-ng-image-update-4.2.4-1.el7.noarch.rpm is
>>> not installed
>>> ovirt-node-ng-image-update-4.2.4-1.el7.noarch.rpm | 647 MB 00:02:07
>>> Retrieving key from file:///etc/pki/rpm-gpg/RPM-GPG-ovirt-4.2
>>> Importing GPG key 0xFE590CB7:
>>> Userid : "oVirt <infra(a)ovirt.org>"
>>> Fingerprint: 31a5 d783 7fad 7cb2 86cd 3469 ab8c 4f9d fe59 0cb7
>>> Package : ovirt-release42-4.2.3.1-1.el7.noarch (installed)
>>> From : /etc/pki/rpm-gpg/RPM-GPG-ovirt-4.2
>>> Is this ok [y/N]: y
>>> Running transaction check
>>> Running transaction test
>>> Transaction test succeeded
>>> Running transaction
>>> Installing : ovirt-node-ng-image-update-4.2.4-1.el7.noarch 1/3
>>> warning: %post(ovirt-node-ng-image-update-4.2.4-1.el7.noarch)
>>> scriptlet failed, exit status 1
>>> Non-fatal POSTIN scriptlet failure in rpm package
>>> ovirt-node-ng-image-update-4.2.4-1.el7.noarch
>>> Erasing : ovirt-node-ng-image-update-placeholder-4.2.3.1-1.el7.noarch
>>> 2/3
>>> Cleanup : ovirt-node-ng-image-update-4.2.3.1-1.el7.noarch 3/3
>>> warning: file /usr/share/ovirt-node-ng/image/ovirt-node-ng-4.2.0-0.20180530.0.el7.squashfs.img:
>>> remove failed: No such file or directory
>>> Uploading Package Profile
>>> Unable to upload Package Profile
>>> Verifying : ovirt-node-ng-image-update-4.2.4-1.el7.noarch 1/3
>>> Verifying : ovirt-node-ng-image-update-4.2.3.1-1.el7.noarch 2/3
>>> Verifying : ovirt-node-ng-image-update-placeholder-4.2.3.1-1.el7.noarch
>>> 3/3
>>>
>>> Installed:
>>> ovirt-node-ng-image-update.noarch 0:4.2.4-1.el7
>>>
>>> Replaced:
>>> ovirt-node-ng-image-update-placeholder.noarch 0:4.2.3.1-1.el7
>>>
>>> Complete!
>>> Uploading Enabled Repositories Report
>>> Loaded plugins: fastestmirror, product-id, subscription-manager
>>> This system is not registered with an entitlement server. You can use
>>> subscription-manager to register.
>>> Cannot upload enabled repos report, is this client registered?
>>>
>>>
>>> My engine shows the nodes as having no updates; however, the major
>>> components, including the kernel version and the port 9090 admin GUI, still show 4.2.3.
>>>
>>> Is there anything I can provide to help diagnose the issue?
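A few read-only checks are the usual starting point for diagnosing this. This is a hedged sketch: `uname` works anywhere, while the RPM query is oVirt-node specific and guarded so the snippet still runs elsewhere; on the node itself one would also run `nodectl info` and attach /var/log/imgbased.log, as requested earlier in the thread.

```shell
# Kernel actually booted (works on any Linux host):
booted=$(uname -r)
echo "booted kernel: $booted"

# Image-update RPM as yum sees it, if rpm is available:
rpm -q ovirt-node-ng-image-update 2>/dev/null \
  || echo "rpm query skipped (not an RPM system or package absent)"
```

Comparing the booted kernel and `nodectl info`'s current_layer against the installed image-update RPM shows whether the node ever switched to the new layer or is still booting the old one.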
>>>
>>>
>>> [root@node6-g8-h4 ~]# rpm -qa | grep ovirt
>>>
>>> ovirt-imageio-common-1.3.1.2-0.el7.centos.noarch
>>> ovirt-host-deploy-1.7.3-1.el7.centos.noarch
>>> ovirt-vmconsole-host-1.0.5-4.el7.centos.noarch
>>> ovirt-provider-ovn-driver-1.2.10-1.el7.centos.noarch
>>> ovirt-engine-sdk-python-3.6.9.1-1.el7.noarch
>>> ovirt-setup-lib-1.1.4-1.el7.centos.noarch
>>> ovirt-release42-4.2.3.1-1.el7.noarch
>>> ovirt-imageio-daemon-1.3.1.2-0.el7.centos.noarch
>>> ovirt-hosted-engine-setup-2.2.20-1.el7.centos.noarch
>>> ovirt-host-dependencies-4.2.2-2.el7.centos.x86_64
>>> ovirt-hosted-engine-ha-2.2.11-1.el7.centos.noarch
>>> ovirt-host-4.2.2-2.el7.centos.x86_64
>>> ovirt-node-ng-image-update-4.2.4-1.el7.noarch
>>> ovirt-vmconsole-1.0.5-4.el7.centos.noarch
>>> ovirt-release-host-node-4.2.3.1-1.el7.noarch
>>> cockpit-ovirt-dashboard-0.11.24-1.el7.centos.noarch
>>> ovirt-node-ng-nodectl-4.2.0-0.20180524.0.el7.noarch
>>> python-ovirt-engine-sdk4-4.2.6-2.el7.centos.x86_64
>>>
>>> [root@node6-g8-h4 ~]# yum update
>>> Loaded plugins: enabled_repos_upload, fastestmirror, imgbased-persist,
>>> package_upload, product-id, search-disabled-repos, subscription-manager
>>> This system is not registered with an entitlement server. You can use
>>> subscription-manager to register.
>>> Loading mirror speeds from cached hostfile
>>> * ovirt-4.2-epel: linux.mirrors.es.net
>>> No packages marked for update
>>> Uploading Enabled Repositories Report
>>> Loaded plugins: fastestmirror, product-id, subscription-manager
>>> This system is not registered with an entitlement server. You can use
>>> subscription-manager to register.
>>> Cannot upload enabled repos report, is this client registered?
>>> _______________________________________________
>>> Users mailing list -- users(a)ovirt.org
>>> To unsubscribe send an email to users-leave(a)ovirt.org
>>> Privacy Statement: https://www.ovirt.org/site/privacy-policy/
>>> oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
>>> List Archives: https://lists.ovirt.org/archives/list/users@ovirt.org/message/UHQMGULUHL4GBBHUBNGOAICJEM6W3RVW/
>>>
>>>
>>>
>>>
>>
>>
>> --
>>
>> SANDRO BONAZZOLA
>>
>> MANAGER, SOFTWARE ENGINEERING, EMEA R&D RHV
>>
>> Red Hat EMEA <https://www.redhat.com/>
>>
>> sbonazzo(a)redhat.com <https://red.ht/sig>
>>
>
>