Migration from Proxmox
by Gabriel Stein
Hi again!
Well, I'm trying to migrate all my VMs from Proxmox to oVirt. Proxmox
doesn't use libvirt; I can dump a VM with vzdump <vm-id> <directory>, and
the output is a *.vma file, which I think is a Proxmox-specific format. I
can't even find the raw disk files, because Proxmox creates a Logical
Volume for every VM.
I converted that to *.qcow2 using qemu-img convert. The conversion seemed
to work (at least there were no errors), but I couldn't import the result
either with a script I found on the web* or through an export storage
domain (oVirt didn't find it).
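For reference, since Proxmox keeps each guest disk on a raw logical volume, one
way to get a qcow2 is to convert straight from the LV instead of from the *.vma
dump (stock qemu-img has no vma driver). A minimal sketch, run with the VM shut
down; the VG/LV names are only examples:

  lvs                        # find the guest's disk LV, e.g. pve/vm-100-disk-1
  qemu-img convert -p -f raw -O qcow2 /dev/pve/vm-100-disk-1 vm-100.qcow2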
Is there a supported way to import these images into oVirt? I have read a
lot and found that one could build a conversion server and use virt-v2v for
the import, but that requires a Red Hat Enterprise Linux host, right? I
don't have a subscription and would like to know whether it is possible
without one.
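For what it's worth, virt-v2v itself is packaged for plain CentOS 7, so the
conversion host should not need a subscription. A minimal import sketch into an
export storage domain; the NFS path is a placeholder, and older virt-v2v
releases spell the output mode -o rhev instead of -o rhv:

  yum install virt-v2v
  virt-v2v -i disk vm-100.qcow2 -o rhv -os nfs.example.com:/export/ovirt-export -of qcow2

After that, the guest should show up under the export domain's "VM Import" tab
in the web admin UI.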
And sorry for the off-topic question. If any Red Hatter would like to
answer me privately: if I buy a subscription for 1 server and I have NN
CentOS servers, will I be able to use all the benefits since I have a valid
subscription (of course, with support just for the RHEL server)?
* https://rwmj.wordpress.com/2015/09/18/importing-kvm-guests-to-ovirt-or-rhev/
Thanks in Advance!
All the best
Gabriel
Gabriel Stein
------------------------------
Gabriel Ferraz Stein
Tel.: +49 (0) 170 2881531
3 years, 2 months
Re: [ovirt-users] Method to easily verify version of host
by Sandro Bonazzola
On 01 Dec 2017 10:41, "Gianluca Cecchi" <gianluca.cecchi(a)gmail.com> wrote:
Hello,
currently in the web admin GUI one can easily verify the exact version of
the engine.
In my case I see
oVirt Engine Version: 4.1.7.6-1.el7.centos
Assuming you fully update your hosts with yum update at every upgrade you
do, the ovirt-release41 package gives you the exact release installed on
your host.
But the same doesn't seem to happen for hypervisors.
E.g., in my case I can see that they are not aligned with my engine level
because of the "Update available" icon beside them in the Hosts tab and the
related event of type
"
Check for available updates on host ov300 was completed successfully with
message 'found updates for packages ... qemu-img-ev-2.9.0-16.el7_4.8.1 ...
vdsm-4.19.37-1.el7.centos ...
"
One can cross-check the vdsm version on each hypervisor, but that is
suboptimal if you have many hosts, and it is not immediately obvious how
far out of alignment each of them is.
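For reference, a quick scripted cross-check works in the meantime; a minimal
sketch assuming key-based SSH from the engine to each host (the host names are
placeholders):

  for h in ov300 ov301 ov302; do
      printf '%s: ' "$h"
      ssh root@"$h" 'rpm -q vdsm ovirt-release41'
  done

If I recall correctly, the REST API also exposes a version element for each
host under /ovirt-engine/api/hosts, which can be scripted against instead of
SSH.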
Is there anything like this already, or do you think it is worthwhile to
open an RFE for it?
Gianluca
_______________________________________________
Users mailing list
Users(a)ovirt.org
http://lists.ovirt.org/mailman/listinfo/users
3 years, 3 months
Re: [ovirt-users] oVirt Node ng upgrade failed
by Yuval Turgeman
Great, thanks! We already have a patch ready in POST here:
https://gerrit.ovirt.org/#/c/84957/
Thanks,
Yuval
On Dec 1, 2017 15:04, "Kilian Ries" <mail(a)kilian-ries.de> wrote:
The bug has been opened:
https://bugzilla.redhat.com/show_bug.cgi?id=1519784
OK, I'll try to fix my host next week. Thanks for your help ;)
------------------------------
*From:* Yuval Turgeman <yuvalt(a)redhat.com>
*Sent:* Thursday, 30 November 2017 09:22:39
*To:* Kilian Ries
*Cc:* users
*Subject:* Re: [ovirt-users] oVirt Node ng upgrade failed
It looks like it, yes - we try to add setfiles_t as a permissive domain
because we assume SELinux is on, and if it is disabled, semanage fails with
the error you mentioned. Can you open a bug for this?
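For reference, a quick way to confirm the SELinux state on a host before
retrying (plain CLI, nothing oVirt-specific):

  getenforce                              # Enforcing / Permissive / Disabled
  grep '^SELINUX=' /etc/selinux/config    # persistent setting
  sestatus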
If you would like to fix the system, you will need to clean the unused LVs,
remove the relevant boot entries from grub (if they exist) and
/boot/ovirt-node-ng-4.1.7-0.20171108.0+1 (if it exists), then reinstall the
rpm.
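A rough sketch of that cleanup, assuming the default "onn" volume group and the
layer names shown further down in this thread; verify everything with imgbase
layout and lvs before removing anything:

  imgbase layout
  lvs onn
  # remove the half-installed 4.1.7 layer and base LVs (lvremove may need -f,
  # since the base LV is read-only)
  lvremove onn/ovirt-node-ng-4.1.7-0.20171108.0+1
  lvremove onn/ovirt-node-ng-4.1.7-0.20171108.0
  # drop the matching boot entry (grubby --info=ALL lists them) and the boot
  # files, if present
  rm -rf /boot/ovirt-node-ng-4.1.7-0.20171108.0+1
  # then reinstall the update package
  rpm -e ovirt-node-ng-image-update
  yum install ovirt-node-ng-image-update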
On Thu, Nov 30, 2017 at 10:16 AM, Kilian Ries <mail(a)kilian-ries.de> wrote:
> Yes, SELinux is disabled via /etc/selinux/config; is that the problem? :/
> ------------------------------
> *From:* Yuval Turgeman <yuvalt(a)redhat.com>
> *Sent:* Thursday, 30 November 2017 09:13:34
> *To:* Kilian Ries
> *Cc:* users
>
> *Subject:* Re: [ovirt-users] oVirt Node ng upgrade failed
>
> Kilian, did you disable SELinux by any chance (selinux=0 on boot)?
>
> On Thu, Nov 30, 2017 at 9:57 AM, Yuval Turgeman <yuvalt(a)redhat.com> wrote:
>
>> It looks like SELinux is broken on your machine for some reason; can you
>> share /etc/selinux?
>>
>> Thanks,
>> Yuval.
>>
>> On Tue, Nov 28, 2017 at 6:31 PM, Kilian Ries <mail(a)kilian-ries.de> wrote:
>>
>>> @Yuval Turgeman
>>>
>>>
>>> ###
>>>
>>>
>>> [17:27:10][root@vm5:~]$semanage permissive -a setfiles_t
>>>
>>> SELinux: Could not downgrade policy file /etc/selinux/targeted/policy/policy.30,
>>> searching for an older version.
>>>
>>> SELinux: Could not open policy file <= /etc/selinux/targeted/policy/policy.30:
>>> No such file or directory
>>>
>>> /sbin/load_policy: Can't load policy: No such file or directory
>>>
>>> libsemanage.semanage_reload_policy: load_policy returned error code 2.
>>> (No such file or directory).
>>>
>>> SELinux: Could not downgrade policy file /etc/selinux/targeted/policy/policy.30,
>>> searching for an older version.
>>>
>>> SELinux: Could not open policy file <= /etc/selinux/targeted/policy/policy.30:
>>> No such file or directory
>>>
>>> /sbin/load_policy: Can't load policy: No such file or directory
>>>
>>> libsemanage.semanage_reload_policy: load_policy returned error code 2.
>>> (No such file or directory).
>>>
>>> OSError: No such file or directory
>>>
>>>
>>> ###
>>>
>>>
>>> @Ryan Barry
>>>
>>>
>>> The manual yum upgrade finished without any errors, but imgbased.log still
>>> shows the following:
>>>
>>>
>>> ###
>>>
>>>
>>> 2017-11-28 17:25:28,372 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:28,434 [DEBUG] (MainThread) Creating /home as
>>> {'attach': True, 'size': '1G'}
>>>
>>> 2017-11-28 17:25:28,434 [DEBUG] (MainThread) Calling binary: (['vgs',
>>> '--noheadings', '@imgbased:volume', '-o', 'lv_full_name'],) {'stderr':
>>> <open file '/dev/null', mode 'w' at 0x7fa2d1ad8ed0>}
>>>
>>> 2017-11-28 17:25:28,434 [DEBUG] (MainThread) Calling: (['vgs',
>>> '--noheadings', '@imgbased:volume', '-o', 'lv_full_name'],) {'close_fds':
>>> True, 'stderr': <open file '/dev/null', mode 'w' at 0x7fa2d1ad8ed0>}
>>>
>>> 2017-11-28 17:25:28,533 [DEBUG] (MainThread) Returned: onn/home
>>>
>>> onn/tmp
>>>
>>> onn/var_log
>>>
>>> onn/var_log_audit
>>>
>>> 2017-11-28 17:25:28,533 [DEBUG] (MainThread) Calling binary: (['umount',
>>> '-l', '/etc'],) {}
>>>
>>> 2017-11-28 17:25:28,534 [DEBUG] (MainThread) Calling: (['umount', '-l',
>>> '/etc'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:28,539 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:28,540 [DEBUG] (MainThread) Calling binary: (['umount',
>>> '-l', u'/tmp/mnt.tuHU8'],) {}
>>>
>>> 2017-11-28 17:25:28,540 [DEBUG] (MainThread) Calling: (['umount', '-l',
>>> u'/tmp/mnt.tuHU8'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:28,635 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:28,635 [DEBUG] (MainThread) Calling binary: (['rmdir',
>>> u'/tmp/mnt.tuHU8'],) {}
>>>
>>> 2017-11-28 17:25:28,635 [DEBUG] (MainThread) Calling: (['rmdir',
>>> u'/tmp/mnt.tuHU8'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:28,640 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:28,641 [ERROR] (MainThread) Failed to migrate etc
>>>
>>> Traceback (most recent call last):
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/osupdater.py",
>>> line 109, in on_new_layer
>>>
>>> check_nist_layout(imgbase, new_lv)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/osupdater.py",
>>> line 179, in check_nist_layout
>>>
>>> v.create(t, paths[t]["size"], paths[t]["attach"])
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/volume.py",
>>> line 48, in create
>>>
>>> "Path is already a volume: %s" % where
>>>
>>> AssertionError: Path is already a volume: /home
>>>
>>> 2017-11-28 17:25:28,642 [DEBUG] (MainThread) Calling binary: (['umount',
>>> '-l', u'/tmp/mnt.bEW2k'],) {}
>>>
>>> 2017-11-28 17:25:28,642 [DEBUG] (MainThread) Calling: (['umount', '-l',
>>> u'/tmp/mnt.bEW2k'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:29,061 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:29,061 [DEBUG] (MainThread) Calling binary: (['rmdir',
>>> u'/tmp/mnt.bEW2k'],) {}
>>>
>>> 2017-11-28 17:25:29,061 [DEBUG] (MainThread) Calling: (['rmdir',
>>> u'/tmp/mnt.bEW2k'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:29,067 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:29,067 [DEBUG] (MainThread) Calling binary: (['umount',
>>> '-l', u'/tmp/mnt.UB5Yg'],) {}
>>>
>>> 2017-11-28 17:25:29,067 [DEBUG] (MainThread) Calling: (['umount', '-l',
>>> u'/tmp/mnt.UB5Yg'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:29,625 [DEBUG] (MainThread) Returned:
>>>
>>> 2017-11-28 17:25:29,625 [DEBUG] (MainThread) Calling binary: (['rmdir',
>>> u'/tmp/mnt.UB5Yg'],) {}
>>>
>>> 2017-11-28 17:25:29,626 [DEBUG] (MainThread) Calling: (['rmdir',
>>> u'/tmp/mnt.UB5Yg'],) {'close_fds': True, 'stderr': -2}
>>>
>>> 2017-11-28 17:25:29,631 [DEBUG] (MainThread) Returned:
>>>
>>> Traceback (most recent call last):
>>>
>>> File "/usr/lib64/python2.7/runpy.py", line 162, in _run_module_as_main
>>>
>>> "__main__", fname, loader, pkg_name)
>>>
>>> File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code
>>>
>>> exec code in run_globals
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/__main__.py",
>>> line 53, in <module>
>>>
>>> CliApplication()
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/__init__.py",
>>> line 82, in CliApplication
>>>
>>> app.hooks.emit("post-arg-parse", args)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/hooks.py",
>>> line 120, in emit
>>>
>>> cb(self.context, *args)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/update.py",
>>> line 56, in post_argparse
>>>
>>> base_lv, _ = LiveimgExtractor(app.imgbase).extract(args.FILENAME)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/update.py",
>>> line 118, in extract
>>>
>>> "%s" % size, nvr)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/update.py",
>>> line 99, in add_base_with_tree
>>>
>>> new_layer_lv = self.imgbase.add_layer(new_base)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/imgbase.py",
>>> line 191, in add_layer
>>>
>>> self.hooks.emit("new-layer-added", prev_lv, new_lv)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/hooks.py",
>>> line 120, in emit
>>>
>>> cb(self.context, *args)
>>>
>>> File "/tmp/tmp.ipxGZrbQEi/usr/lib/python2.7/site-packages/imgbased/plugins/osupdater.py",
>>> line 123, in on_new_layer
>>>
>>> raise ConfigMigrationError()
>>>
>>> imgbased.plugins.osupdater.ConfigMigrationError
>>>
>>>
>>> ###
>>> ------------------------------
>>> *From:* Yuval Turgeman <yuvalt(a)redhat.com>
>>> *Sent:* Sunday, 26 November 2017 17:23:55
>>> *To:* Kilian Ries
>>> *Cc:* Ryan Barry; Lev Veyde; users
>>>
>>> *Subject:* Re: [ovirt-users] oVirt Node ng upgrade failed
>>>
>>> Hi,
>>>
>>> Can you try to run `semanage permissive -a setfiles_t` on your 4.1.1 host
>>> and share the output?
>>>
>>> Thanks,
>>> Yuval
>>>
>>> On Fri, Nov 24, 2017 at 11:01 AM, Kilian Ries <mail(a)kilian-ries.de>
>>> wrote:
>>>
>>>> This is the imgbased.log:
>>>>
>>>>
>>>> https://www.dropbox.com/s/v9dmgz14cpzfcsn/imgbased.log.tar.gz?dl=0
>>>>
>>>> OK, I'll try your steps and come back later ...
>>>>
>>>>
>>>> ------------------------------
>>>> *From:* Ryan Barry <rbarry(a)redhat.com>
>>>> *Sent:* Thursday, 23 November 2017 23:33:34
>>>> *To:* Kilian Ries; Lev Veyde; users
>>>> *Subject:* Re: [ovirt-users] oVirt Node ng upgrade failed
>>>>
>>>> Can you grab imgbased.log?
>>>>
>>>> To retry, run "rpm -e ovirt-node-ng-image-update" and remove the new LVs.
>>>> Running "yum install ovirt-node-ng-image-update" from the CLI instead of the
>>>> engine, so we can get full logs, would be useful.
>>>>
>>>> On Thu, Nov 23, 2017 at 16:01 Lev Veyde <lveyde(a)redhat.com> wrote:
>>>>
>>>>>
>>>>> ---------- Forwarded message ----------
>>>>> From: Kilian Ries <mail(a)kilian-ries.de>
>>>>> Date: Thu, Nov 23, 2017 at 5:16 PM
>>>>> Subject: [ovirt-users] oVirt Node ng upgrade failed
>>>>> To: "Users(a)ovirt.org" <Users(a)ovirt.org>
>>>>>
>>>>>
>>>>> Hi,
>>>>>
>>>>>
>>>>> just tried to upgrade from
>>>>>
>>>>>
>>>>> ovirt-node-ng-4.1.1.1-0.20170504.0+1
>>>>>
>>>>>
>>>>> to
>>>>>
>>>>>
>>>>> ovirt-node-ng-4.1.7-0.20171108.0+1
>>>>>
>>>>>
>>>>> but it failed:
>>>>>
>>>>>
>>>>> ###
>>>>>
>>>>>
>>>>> 2017-11-23 10:19:21 INFO otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.info:80 Yum Verify: 1/4: ovirt-node-ng-image-update.noarch
>>>>> 0:4.1.7-1.el7.centos - u
>>>>>
>>>>> 2017-11-23 10:19:21 INFO otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.info:80 Yum Verify: 2/4: ovirt-node-ng-image-update-placeholder.noarch
>>>>> 0:4.1.1.1-1.el7.centos - od
>>>>>
>>>>> 2017-11-23 10:19:21 INFO otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.info:80 Yum Verify: 3/4: ovirt-node-ng-image.noarch
>>>>> 0:4.1.1.1-1.el7.centos - od
>>>>>
>>>>> 2017-11-23 10:19:21 INFO otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.info:80 Yum Verify: 4/4: ovirt-node-ng-image-update.noarch
>>>>> 0:4.1.1.1-1.el7.centos - ud
>>>>>
>>>>> 2017-11-23 10:19:21 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Transaction processed
>>>>>
>>>>> 2017-11-23 10:19:21 DEBUG otopi.context context._executeMethod:142
>>>>> method exception
>>>>>
>>>>> Traceback (most recent call last):
>>>>>
>>>>> File "/tmp/ovirt-3JI9q14aGS/pythonlib/otopi/context.py", line 132,
>>>>> in _executeMethod
>>>>>
>>>>> method['method']()
>>>>>
>>>>> File "/tmp/ovirt-3JI9q14aGS/otopi-plugins/otopi/packagers/yumpackager.py",
>>>>> line 261, in _packages
>>>>>
>>>>> self._miniyum.processTransaction()
>>>>>
>>>>> File "/tmp/ovirt-3JI9q14aGS/pythonlib/otopi/miniyum.py", line 1049,
>>>>> in processTransaction
>>>>>
>>>>> _('One or more elements within Yum transaction failed')
>>>>>
>>>>> RuntimeError: One or more elements within Yum transaction failed
>>>>>
>>>>> 2017-11-23 10:19:21 ERROR otopi.context context._executeMethod:151
>>>>> Failed to execute stage 'Package installation': One or more elements within
>>>>> Yum transaction failed
>>>>>
>>>>> 2017-11-23 10:19:21 DEBUG otopi.transaction transaction.abort:119
>>>>> aborting 'Yum Transaction'
>>>>>
>>>>> 2017-11-23 10:19:21 INFO otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.info:80 Yum Performing yum transaction rollback
>>>>>
>>>>> 2017-11-23 10:19:21 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: centos-opstools-release/7/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:21 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: centos-opstools-release/7/x86_64/filelists_db
>>>>> 374 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:22 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: centos-opstools-release/7/x86_64/other_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:22 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: centos-opstools-release/7/x86_64/other_db
>>>>> 53 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:22 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db (0%)
>>>>>
>>>>> 2017-11-23 10:19:22 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db 55 k(4%)
>>>>>
>>>>> 2017-11-23 10:19:23 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db 201 k(17%)
>>>>>
>>>>> 2017-11-23 10:19:23 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db 648 k(56%)
>>>>>
>>>>> 2017-11-23 10:19:23 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db 1.1 M(99%)
>>>>>
>>>>> 2017-11-23 10:19:23 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/filelists_db 1.1 M(100%)
>>>>>
>>>>> 2017-11-23 10:19:25 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/other_db (0%)
>>>>>
>>>>> 2017-11-23 10:19:25 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/other_db 45 k(14%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/other_db 207 k(66%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1/7/other_db 311 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-centos-gluster38/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-centos-gluster38/x86_64/filelists_db
>>>>> 18 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-centos-gluster38/x86_64/other_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-centos-gluster38/x86_64/other_db
>>>>> 7.6 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:26 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-epel/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:27 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-epel/x86_64/filelists_db
>>>>> 7.5 M(76%)
>>>>>
>>>>> 2017-11-23 10:19:27 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-epel/x86_64/filelists_db
>>>>> 9.9 M(100%)
>>>>>
>>>>> 2017-11-23 10:19:29 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-epel/x86_64/other_db (0%)
>>>>>
>>>>> 2017-11-23 10:19:29 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-epel/x86_64/other_db 2.9
>>>>> M(100%)
>>>>>
>>>>> 2017-11-23 10:19:30 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-patternfly1-noarch-epel/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:30 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-patternfly1-noarch-epel/x86_64/filelists_db
>>>>> 6.5 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-patternfly1-noarch-epel/x86_64/other_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-4.1-patternfly1-noarch-epel/x86_64/other_db
>>>>> 851 (100%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-centos-ovirt41/7/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-centos-ovirt41/7/x86_64/filelists_db
>>>>> 312 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-centos-ovirt41/7/x86_64/other_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:31 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: ovirt-centos-ovirt41/7/x86_64/other_db
>>>>> 84 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: rnachimu-gdeploy/x86_64/filelists_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: rnachimu-gdeploy/x86_64/filelists_db
>>>>> 4.5 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: rnachimu-gdeploy/x86_64/other_db
>>>>> (0%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: rnachimu-gdeploy/x86_64/other_db
>>>>> 1.4 k(100%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: virtio-win-stable/filelists_db (0%)
>>>>>
>>>>> 2017-11-23 10:19:32 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: virtio-win-stable/filelists_db 3.9
>>>>> k(100%)
>>>>>
>>>>> 2017-11-23 10:19:33 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: virtio-win-stable/other_db (0%)
>>>>>
>>>>> 2017-11-23 10:19:33 DEBUG otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.verbose:76 Yum Downloading: virtio-win-stable/other_db 4.3
>>>>> k(100%)
>>>>>
>>>>> 2017-11-23 10:19:33 ERROR otopi.plugins.otopi.packagers.yumpackager
>>>>> yumpackager.error:85 Yum Transaction close failed: Traceback (most recent
>>>>> call last):
>>>>>
>>>>> File "/tmp/ovirt-3JI9q14aGS/pythonlib/otopi/miniyum.py", line 761,
>>>>> in endTransaction
>>>>>
>>>>> if self._yb.history_undo(transactionCurrent):
>>>>>
>>>>> File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 6086,
>>>>> in history_undo
>>>>>
>>>>> if self.install(pkgtup=pkg.pkgtup):
>>>>>
>>>>> File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 4910,
>>>>> in install
>>>>>
>>>>> raise Errors.InstallError, _('No package(s) available to install')
>>>>>
>>>>> InstallError: Kein(e) Paket(e) zum Installieren verfügbar.
>>>>>
>>>>>
>>>>> ###
>>>>>
>>>>>
>>>>>
>>>>> Some more information on my system:
>>>>>
>>>>>
>>>>> ###
>>>>>
>>>>>
>>>>> $ mount
>>>>>
>>>>> ...
>>>>>
>>>>> /dev/mapper/onn-ovirt--node--ng--4.1.1.1--0.20170504.0+1 on / type
>>>>> ext4 (rw,relatime,discard,stripe=128,data=ordered)
>>>>>
>>>>>
>>>>>
>>>>> $ imgbase layout
>>>>>
>>>>> ovirt-node-ng-4.1.1.1-0.20170406.0
>>>>>
>>>>> ovirt-node-ng-4.1.1.1-0.20170504.0
>>>>>
>>>>> +- ovirt-node-ng-4.1.1.1-0.20170504.0+1
>>>>>
>>>>> ovirt-node-ng-4.1.7-0.20171108.0
>>>>>
>>>>> +- ovirt-node-ng-4.1.7-0.20171108.0+1
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> $ rpm -q ovirt-node-ng-image
>>>>>
>>>>> Das Paket ovirt-node-ng-image ist nicht installiert
>>>>>
>>>>>
>>>>>
>>>>> $ nodectl check
>>>>>
>>>>> Status: OK
>>>>>
>>>>> Bootloader ... OK
>>>>>
>>>>> Layer boot entries ... OK
>>>>>
>>>>> Valid boot entries ... OK
>>>>>
>>>>> Mount points ... OK
>>>>>
>>>>> Separate /var ... OK
>>>>>
>>>>> Discard is used ... OK
>>>>>
>>>>> Basic storage ... OK
>>>>>
>>>>> Initialized VG ... OK
>>>>>
>>>>> Initialized Thin Pool ... OK
>>>>>
>>>>> Initialized LVs ... OK
>>>>>
>>>>> Thin storage ... OK
>>>>>
>>>>> Checking available space in thinpool ... OK
>>>>>
>>>>> Checking thinpool auto-extend ... OK
>>>>>
>>>>> vdsmd ... OK
>>>>>
>>>>>
>>>>> ###
>>>>>
>>>>>
>>>>> I can restart my Node and the VMs are running, but oVirt Engine tells me
>>>>> no update is available. It seems 4.1.7 is installed, but the Node still
>>>>> boots the old 4.1.1 image.
>>>>>
>>>>>
>>>>> Can I force the upgrade to run again, or is there another way to fix this?
>>>>>
>>>>>
>>>>> Thanks
>>>>>
>>>>> Greets
>>>>>
>>>>> Kilian
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> Users mailing list
>>>>> Users(a)ovirt.org
>>>>> http://lists.ovirt.org/mailman/listinfo/users
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> Lev Veyde
>>>>>
>>>>> Software Engineer, RHCE | RHCVA | MCITP
>>>>>
>>>>> Red Hat Israel
>>>>>
>>>>> <https://www.redhat.com>
>>>>>
>>>>> lev(a)redhat.com | lveyde(a)redhat.com
>>>>> <https://red.ht/sig>
>>>>> TRIED. TESTED. TRUSTED. <https://redhat.com/trusted>
>>>>>
>>>> --
>>>>
>>>> RYAN BARRY
>>>>
>>>> SENIOR SOFTWARE ENGINEER - TEAM LEAD - RHEV HYPERVISOR
>>>>
>>>> Red Hat NA <https://www.redhat.com/>
>>>>
>>>> rbarry(a)redhat.com M: +1-651-815-9306 IM: rbarry
>>>> <https://red.ht/sig>
>>>>
3 years, 3 months
Re: [ovirt-users] oVirt Node ng upgrade failed
by Ryan Barry
Can you grab imgbased.log?
To retry, "rpm -e ovirt-node-ng-image-update" and remove the new LVs. "yum
install ovirt-node-ng-image-update" from the CLI instead of engine so we
can get full logs would be useful
--
RYAN BARRY
SENIOR SOFTWARE ENGINEER - TEAM LEAD - RHEV HYPERVISOR
Red Hat NA <https://www.redhat.com/>
rbarry(a)redhat.com M: +1-651-815-9306 <javascript:void(0);> IM: rbarry
<https://red.ht/sig>
3 years, 3 months
Method to easily verify version of host
by Gianluca Cecchi
Hello,
currently in the web admin GUI one can easily verify the exact version of
the engine.
In my case I see
oVirt Engine Version: 4.1.7.6-1.el7.centos
But the same doesn't seem to happen for hypervisors.
E.g., in my case I can see that they are not aligned with my engine level
because of the "Update available" icon beside them in the Hosts tab and the
related event of type
"
Check for available updates on host ov300 was completed successfully with
message 'found updates for packages ... qemu-img-ev-2.9.0-16.el7_4.8.1 ...
vdsm-4.19.37-1.el7.centos ...
"
One can cross-check the vdsm version on each hypervisor, but that is
suboptimal if you have many hosts, and it is not immediately obvious how
far out of alignment each of them is.
Is there anything like this already, or do you think it is worthwhile to
open an RFE for it?
Gianluca
3 years, 3 months