Hi Gianluca,
Host upgrade time depends on how much updating needs to be done.
As for your second question: we update ansible only if another package requires a newer version of it, and we use a different (temporarily created) yum.conf file with the setting best=False. So yes, this behavior is currently expected (and it will change, as we are going to allow updating the ansible package for el8 hosts too).
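For reference, the temporary file might look roughly like this (a sketch only; best=False is the one setting Dana mentions explicitly, the rest is an assumption based on a default yum.conf):

```ini
# Hypothetical /tmp/yum.conf created only for the upgrade run.
# best=False lets dnf pick an older candidate instead of failing
# when the "best" version of a package cannot be installed.
[main]
gpgcheck=1
best=False
skip_if_unavailable=False
```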
Hope this helps,
Dana


On Tue, Jun 8, 2021 at 12:28 PM Gianluca Cecchi <gianluca.cecchi@gmail.com> wrote:
On Tue, Jun 8, 2021 at 11:15 AM Gianluca Cecchi <gianluca.cecchi@gmail.com> wrote:
Hello,
I have a 4.4.5 environment that I'm upgrading to 4.4.6.

I'm upgrading plain CentOS hosts from the GUI.
They are on 4.4.5, so in particular CentOS 8.3, and as part of the upgrade they have to be moved to 8.4.

In the past I used "yum update" on the host, but now it seems that is not the correct way.

But the ansible part related to package updates seems to be very slow.
It gives the impression that packages are updated one by one, rather than in a single transaction as with "yum update".
The update has now been running for about 30 minutes, and my internet connection is certainly fast.

In /var/log/messages on the host I see lines like these, one per package:

Jun  8 11:09:30 ov300 python3[3031815]: ansible-dnf Invoked with name=['rsyslog-relp.x86_64'] state=latest lock_timeout=300 conf_file=/tmp/yum.conf allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True disable_excludes=None download_dir=None list=None releasever=None
Jun  8 11:09:32 ov300 python3[3031828]: ansible-dnf Invoked with name=['runc.x86_64'] state=latest lock_timeout=300 conf_file=/tmp/yum.conf allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True disable_excludes=None download_dir=None list=None releasever=None
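The logged parameters suggest the upgrade role invokes the dnf module once per package, roughly like this hypothetical task (the task name and loop variable are assumptions; the module arguments match the logs above). Passing the whole package list in a single `name:` would instead produce one dnf transaction:

```yaml
# Sketch of a per-package invocation, consistent with the log lines above.
# Each iteration is a separate dnf transaction, which would explain the
# slow overall run compared to a single "yum update".
- name: Update packages one at a time
  dnf:
    name: "{{ item }}"
    state: latest
    conf_file: /tmp/yum.conf
    lock_timeout: 300
  loop: "{{ packages_to_update }}"  # e.g. ['rsyslog-relp.x86_64', 'runc.x86_64']
```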

Any clarification?

Thanks,
Gianluca

BTW, the update took 33 minutes in total (I chose not to reboot the host):
Jun 8, 2021, 10:40:35 AM Host ov300 upgrade was started (User: tekka@mydomain).
Jun 8, 2021, 11:13:13 AM Host ov300 upgrade was completed successfully.

At the end, if I open a terminal, I see:

[root@ov300 ~]# rpm -q ansible
ansible-2.9.16-2.el8.noarch
[root@ov300 ~]#

and

[root@ov300 ~]# yum update
Last metadata expiration check: 0:39:50 ago on Tue 08 Jun 2021 10:41:09 AM CEST.
Dependencies resolved.
===============================================================================================================
 Package             Architecture       Version                     Repository                            Size
===============================================================================================================
Upgrading:
 ansible             noarch             2.9.21-2.el8                ovirt-4.4-centos-ovirt44              17 M

Transaction Summary
===============================================================================================================
Upgrade  1 Package

Total download size: 17 M
Is this ok [y/N]:
Operation aborted.
[root@ov300 ~]#

Is this expected?
Currently in my yum.conf I have:

[main]
gpgcheck=1
installonly_limit=3
clean_requirements_on_remove=True
best=True
skip_if_unavailable=False

Gianluca


_______________________________________________
Users mailing list -- users@ovirt.org
To unsubscribe send an email to users-leave@ovirt.org
Privacy Statement: https://www.ovirt.org/privacy-policy.html
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/users@ovirt.org/message/4P6YORNJVZG67EQT5TTVGCKMGXL7VRY5/