Local (Deployment) VM Can't Reach "centos-ceph-pacific" Repo
by Matthew J Black
Hi All,
OK, new issue: :-(
If I'm reading things right, the local (deployment) VM can't get to the centos-ceph-pacific repo.
The repo is installed on the host machine (along with all of the relevant dependent repos).
I thought that the local 192.168.222.0/24 network was NATed out via the virbr0 virtual bridge - am I wrong about this (i.e. do we need to update/change our routing tables)?
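For what it's worth, one way to pin down whether the NAT path is the problem is to test repo reachability from both sides. A rough sketch only - the VM address is taken from the log below, while the gateway address and repo ID are assumptions based on a default hosted-engine deployment:
~~~
# On the host: confirm the repo metadata can be fetched from there at all.
dnf makecache --disablerepo='*' --enablerepo=centos-ceph-pacific

# On the local engine VM (address taken from the log below):
ssh root@192.168.222.77
ip route show          # default route should point at the host side of virbr0 (assumed 192.168.222.1)
cat /etc/resolv.conf   # confirm the VM can actually resolve mirror hostnames
curl -v http://mirror.centos.org/   # any external URL; exercises DNS + NAT together

# Back on the host: check that libvirt's masquerade rules cover the local VM network.
iptables -t nat -L POSTROUTING -n | grep 192.168.222
~~~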
Here is the relevant part of the log:
~~~
2022-10-11 16:22:14,749+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Install oVirt Engine package]
2022-10-11 16:26:03,643+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 {'results': [], 'rc': 1, 'msg': "Failed to download metadata for repo 'centos-ceph-pacific': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", 'invocation': {'module_args': {'name': ['ovirt-engine'], 'state': 'present', 'allow_downgrade': False, 'autoremove': False, 'bugfix': False, 'cacheonly': False, 'disable_gpg_check': False, 'disable_plugin': [], 'disablerepo': [], 'download_only': False, 'enable_plugin': [], 'enablerepo': [], 'exclude': [], 'installroot': '/', 'install_repoquery': True, 'install_weak_deps': True, 'security': False, 'skip_broken': False, 'update_cache': False, 'update_only': False, 'validate_certs': True, 'lock_timeout': 30, 'allowerasing': False, 'nobest': False, 'conf_file': None, 'disable_excludes': None, 'download_dir': None, 'list': None, 'releasever': None}}, '_ansible_no_log': False, 'changed': False, '_ansible_delegated_vars': {'ansible_host': '192.168.222.77', 'ansible_port': None, 'ansible_user': 'root', 'ansible_connection': 'smart'}}
2022-10-11 16:26:03,744+1100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:113 fatal: [localhost -> 192.168.222.77]: FAILED! => {"changed": false, "msg": "Failed to download metadata for repo 'centos-ceph-pacific': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "results": []}
2022-10-11 16:26:04,045+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Sync on engine machine]
2022-10-11 16:26:04,947+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> 192.168.222.77]
2022-10-11 16:26:05,449+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Set destination directory path]
2022-10-11 16:26:05,950+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:06,352+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Create destination directory]
2022-10-11 16:26:06,953+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> localhost]
2022-10-11 16:26:07,355+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : include_tasks]
2022-10-11 16:26:07,856+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost]
2022-10-11 16:26:08,357+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Find the local appliance image]
2022-10-11 16:26:08,959+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:09,460+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Set local_vm_disk_path]
2022-10-11 16:26:09,862+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:10,363+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Give the vm time to flush dirty buffers]
2022-10-11 16:26:20,986+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:21,388+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Copy engine logs]
2022-10-11 16:26:27,901+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost]
2022-10-11 16:26:28,403+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Change ownership of copied engine logs]
2022-10-11 16:26:29,005+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> localhost]
2022-10-11 16:26:29,506+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Notify the user about a failure]
2022-10-11 16:26:29,908+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 {'msg': 'There was a failure deploying the engine on the local engine VM. The system may not be provisioned according to the playbook results: please check the logs for the issue, fix accordingly or re-deploy from scratch.\n', '_ansible_no_log': False, 'changed': False}
2022-10-11 16:26:30,008+1100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:113 fatal: [localhost]: FAILED! => {"changed": false, "msg": "There was a failure deploying the engine on the local engine VM. The system may not be provisioned according to the playbook results: please check the logs for the issue, fix accordingly or re-deploy from scratch.\n"}
2022-10-11 16:26:30,410+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 PLAY RECAP [localhost] : ok: 161 changed: 59 unreachable: 0 skipped: 80 failed: 1
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:226 ansible-playbook rc: 2
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:233 ansible-playbook stdout:
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:236 ansible-playbook stderr:
2022-10-11 16:26:30,510+1100 DEBUG otopi.plugins.gr_he_ansiblesetup.core.misc misc._closeup:475 {'otopi_host_net': {'ansible_facts': {'otopi_host_net': ['bond0.40', 'ens1', 'bond0.20']}, '_ansible_no_log': False, 'changed': False}, 'otopi_localvm_dir': {'changed': True, 'path': '/var/tmp/localvmfq67djtv', 'uid': 0, 'gid': 0, 'owner': 'root', 'group': 'root', 'mode': '0700', 'state': 'directory', 'secontext': 'unconfined_u:object_r:user_tmp_t:s0', 'size': 6, 'invocation': {'module_args': {'state': 'directory', 'path': '/var/tmp', 'prefix': 'localvm', 'suffix': ''}}, '_ansible_no_log': False}, 'otopi_appliance_disk_size': {'ansible_facts': {'virtual_size': '53689188352'}, '_ansible_no_log': False, 'changed': False}, 'ansible-playbook_rc': 2}
2022-10-11 16:26:30,511+1100 DEBUG otopi.context context._executeMethod:145 method exception
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/otopi/context.py", line 132, in _executeMethod
    method['method']()
  File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/misc.py", line 509, in _closeup
    raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2022-10-11 16:26:30,511+1100 ERROR otopi.context context._executeMethod:154 Failed to execute stage 'Closing up': Failed executing ansible-playbook
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/error=bool:'True'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exceptionInfo=list:'[(<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7efe2a39fa08>)]'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/localVMDir=str:'/var/tmp/localvmfq67djtv'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/ovfSizeGB=int:'51'
2022-10-11 16:26:30,513+1100 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
~~~
Anyone got any ideas? Thanks,
Dulux-Oz
Failed to delete snapshot
by Jirka Simon
Hello there,
we are having trouble with one VM and its snapshot.
Here is a short log from engine.log:
2022-08-23 09:02:38,735+02 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-1265784) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] CommandAsyncTask::HandleEndActionResult [within thread]: Removing CommandMultiAsyncTasks object for entity '5842cafc-60d4-4131-8da1-8a909123f08c'
2022-08-23 09:02:41,435+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-68) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command id: 'ead3fbba-7439-4337-8aaf-1d6cf65bbb15' failed child command status for step 'REDUCE_IMAGE'
2022-08-23 09:02:41,435+02 INFO [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommandCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-68) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command 'RemoveSnapshotSingleDiskLive' id: 'ead3fbba-7439-4337-8aaf-1d6cf65bbb15' child commands '[5b33d970-6ea0-4b8b-bc31-607afd8af1ca, 185d3fec-8c76-44d3-a0a1-fe8fff331354, 308baf9d-1fc5-43a1-a624-b5f08f7bec1b, f11f3550-4356-417e-af10-a3afb2051dec, 5842cafc-60d4-4131-8da1-8a909123f08c]' executions were completed, status 'FAILED'
2022-08-23 09:02:42,445+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Merging of snapshot '6634db4e-23d8-4849-a008-864f15d28c3d' images 'd24a9199-741b-49cb-96d2-0d76fcd21f48'..'48af3301-0cbb-4a6e-97d1-7299e7de883f' failed. Images have been marked illegal and can no longer be previewed or reverted to. Please retry Live Merge on the snapshot to complete the operation.
2022-08-23 09:02:42,451+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Ending command 'org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand' with failure.
2022-08-23 09:02:42,457+02 INFO [org.ovirt.engine.core.bll.ConcurrentChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command 'RemoveSnapshot' id: '73340ac8-07d7-4de2-bfeb-8062fa1e8cfd' child commands '[ead3fbba-7439-4337-8aaf-1d6cf65bbb15]' executions were completed, status 'FAILED'
2022-08-23 09:02:43,470+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-22) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Ending command 'org.ovirt.engine.core.bll.snapshots.RemoveSnapshotCommand' with failure.
2022-08-23 09:02:43,490+02 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-22) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] EVENT_ID: USER_REMOVE_SNAPSHOT_FINISHED_FAILURE(357), Failed to delete snapshot 'vProtect 2022-08-05 22:33:44.088143' for VM 'web1.wiki.prod.hq.sldev.cz-2'.
The snapshot can't be deleted, and it is not possible to clone the VM.
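Before retrying the live merge (as the log itself suggests), it can help to see which volumes in the chain were actually marked illegal. A minimal sketch, assuming a file-based storage domain, a VDSM version that ships the dump-volume-chains verb, and placeholder UUIDs/paths throughout:
~~~
# On the SPM host: dump the volume chains VDSM sees for the storage domain
# (SD_UUID is a placeholder for your storage domain UUID).
vdsm-tool dump-volume-chains SD_UUID

# For file-based storage, inspect the qcow2 backing chain directly
# (mount point, image and volume UUIDs below are placeholders).
qemu-img info --backing-chain \
  /rhev/data-center/mnt/SERVER:_export/SD_UUID/images/IMG_UUID/LEAF_VOLUME
~~~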
Thank you for any help.
Jirka
Help - Unable to break bond interface from oVirt Engine
by Amrit Thakur
Hi,
I am currently using oVirt 4.3 with two clusters. Previously I used
a bond interface with two NICs for NFS storage. Now I want to use iSCSI
with multipathing, but I didn't find any option to break the bond interface
in the oVirt Engine GUI.
So my question is: is there any way to break the bond interface? Or, if I
remove the bond interface at the node OS level, will it sync with the oVirt
engine after restarting the OS or the vdsm service?
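In case it is useful, the setupnetworks action in the REST API can remove a bond engine-side, which avoids the engine re-syncing a change made behind its back at the OS level. A hedged sketch only - the engine FQDN, host ID and bond name are placeholders, and the XML should be checked against the 4.3 API documentation first:
~~~
# Remove bond0 from a host through the engine's setupnetworks action.
curl -s -k -u 'admin@internal:PASSWORD' \
  -H 'Content-Type: application/xml' -H 'Accept: application/xml' \
  -X POST 'https://engine.example.com/ovirt-engine/api/hosts/HOST_ID/setupnetworks' \
  -d '<action>
        <removed_bonds>
          <host_nic>
            <name>bond0</name>
          </host_nic>
        </removed_bonds>
      </action>'
~~~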
Thank You
RPM issues updating Ovirt 4.5.2 to 4.5.3
by David Johnson
Good afternoon all,
We are trying to update our cluster from 4.5.2 to see if we can
resolve some issues with the VM web console not functioning on VMs that
run on one of the hosts. So there are two problems here: if we can resolve
the first (dependency resolution), we are hoping that we can resolve the
other with a reinstall of the software.
*Symptoms:*
1. On VMs running on one host, the VM web console does not work. The
console.vv file downloads to the desktop and we can attempt to launch it. On
launch, it immediately exits. The web console works on the VMs on the
other host.
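One quick diagnostic (assuming the client opens .vv files with virt-viewer's remote-viewer, which is the usual handler) is to launch the downloaded descriptor from a terminal with debugging enabled, which usually surfaces the connect/TLS error that makes it exit:
~~~
# Run the console descriptor by hand; the path is wherever console.vv was saved.
remote-viewer --debug -v ~/Downloads/console.vv
~~~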
2. Attempting to update or reinstall the software on any host via the oVirt
Compute -> Hosts -> Installation -> Reinstall or Upgrade menu, we get a
dependency resolution error:
package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
none of the providers can be installed\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-117.el8s.x86_64
It appears that the 4.5.2 build is running on an older release of
openvswitch?
Please advise.
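As a stopgap (hedged - the package names are taken from the depsolve output above, and this only postpones the conflicting RDO rebuilds rather than resolving them), the offending packages can be excluded for a single transaction on the host:
~~~
# One-off update that skips the conflicting RDO rebuilds of OVS/OVN.
dnf update --exclude='rdo-openvswitch,rdo-ovn*,python3-rdo-openvswitch'
~~~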
*Environment:*
Production environment: Ovirt 4.5.2.4-1.el8
1 Standalone engine (upgraded to
3 hosts
*Log excerpts*
*VM Web Console Not Starting:*
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for fe80::f0ca:56ff:fe8c:7bb8 on veth8a5345f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for veth8a5345f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for fe80::d879:d5ff:fee4:1855 on vethc9bd12f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for vethc9bd12f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for fe80::42:15ff:fe5a:679 on br-1feb13c47a4f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 172.22.0.1 on br-1feb13c47a4f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for br-1feb13c47a4f.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 172.19.0.1 on docker_gwbridge.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for docker_gwbridge.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 172.17.0.1 on docker0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for docker0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 172.18.0.1 on br-4209e789b982.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for br-4209e789b982.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for virbr0-nic.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 192.168.122.1 on virbr0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for virbr0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
record for 192.168.2.163 on eth0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for eth0.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
workstation service for lo.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Host name conflict,
retrying with cen-76-alc-qa-4236
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for fe80::f0ca:56ff:fe8c:7bb8 on veth8a5345f.*.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for fe80::d879:d5ff:fee4:1855 on vethc9bd12f.*.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for fe80::42:15ff:fe5a:679 on br-1feb13c47a4f.*.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 172.22.0.1 on br-1feb13c47a4f.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 172.19.0.1 on docker_gwbridge.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 172.17.0.1 on docker0.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 172.18.0.1 on br-4209e789b982.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 192.168.122.1 on virbr0.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for fe80::3a72:e773:4d49:55ba on eth0.*.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
address record for 192.168.2.163 on eth0.IPv4.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Server startup
complete. Host name is cen-76-alc-qa-4236.local. Local service cookie is
3875159144.
Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering HINFO
record with values 'X86_64'/'LINUX'.
Oct 18 13:22:02 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:02.828860278-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.695827803-05:00" level=error msg="agent: session
failed" backoff=8s error="rpc error: code = Unavailable desc = connection
error: desc = \"transport: Error while dialing dial tcp 192.168.2.162:2377:
connect: connection refused\"" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.695994815-05:00" level=info msg="parsed scheme:
\"\"" module=grpc
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.696023188-05:00" level=info msg="scheme \"\" not
registered, fallback to default scheme" module=grpc
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.696208750-05:00" level=info
msg="ccResolverWrapper: sending update to cc: {[{192.168.2.162:2377 <nil>
0 <nil>}] <nil> <nil>}" module=grpc
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.696239844-05:00" level=info msg="ClientConn
switching balancer to \"pick_first\"" module=grpc
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.696294355-05:00" level=info msg="manager selected
by agent for new session: {z80ib665fps13yuqtzqqnltr6 192.168.2.162:2377}"
module=node/agent node.id=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.696355540-05:00" level=info msg="waiting
6.545410435s before registering session" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:04.697629711-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:05 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:05.699785481-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:07 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:07.247677166-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:10 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:10.062182617-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.242738688-05:00" level=error msg="agent: session
failed" backoff=8s error="rpc error: code = Unavailable desc = connection
error: desc = \"transport: Error while dialing dial tcp 192.168.2.162:2377:
connect: connection refused\"" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.242918867-05:00" level=info msg="parsed scheme:
\"\"" module=grpc
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.242954837-05:00" level=info msg="scheme \"\" not
registered, fallback to default scheme" module=grpc
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.243141828-05:00" level=info
msg="ccResolverWrapper: sending update to cc: {[{192.168.2.162:2377 <nil>
0 <nil>}] <nil> <nil>}" module=grpc
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.243170522-05:00" level=info msg="ClientConn
switching balancer to \"pick_first\"" module=grpc
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.243226320-05:00" level=info msg="manager selected
by agent for new session: {z80ib665fps13yuqtzqqnltr6 192.168.2.162:2377}"
module=node/agent node.id=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.243290968-05:00" level=info msg="waiting
3.202102632s before registering session" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:11 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:11.244389573-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:12 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:12.246550975-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.073299399-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.446614366-05:00" level=error msg="agent: session
failed" backoff=8s error="rpc error: code = Unavailable desc = connection
error: desc = \"transport: Error while dialing dial tcp 192.168.2.162:2377:
connect: connection refused\"" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.446832560-05:00" level=info msg="parsed scheme:
\"\"" module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.446863564-05:00" level=info msg="scheme \"\" not
registered, fallback to default scheme" module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.447060785-05:00" level=info
msg="ccResolverWrapper: sending update to cc: {[{192.168.2.162:2377 <nil>
0 <nil>}] <nil> <nil>}" module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.447089843-05:00" level=info msg="ClientConn
switching balancer to \"pick_first\"" module=grpc
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.447142554-05:00" level=info msg="manager selected
by agent for new session: {z80ib665fps13yuqtzqqnltr6 192.168.2.162:2377}"
module=node/agent node.id=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.447206125-05:00" level=info msg="waiting
3.65571267s before registering session" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:14 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:14.448787529-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:15 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:15.450286831-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:17 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:17.254956187-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103309534-05:00" level=error msg="agent: session
failed" backoff=8s error="rpc error: code = Unavailable desc = connection
error: desc = \"transport: Error while dialing dial tcp 192.168.2.162:2377:
connect: connection refused\"" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103525940-05:00" level=info msg="parsed scheme:
\"\"" module=grpc
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103557034-05:00" level=info msg="scheme \"\" not
registered, fallback to default scheme" module=grpc
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103843903-05:00" level=info
msg="ccResolverWrapper: sending update to cc: {[{192.168.2.162:2377 <nil>
0 <nil>}] <nil> <nil>}" module=grpc
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103880439-05:00" level=info msg="ClientConn
switching balancer to \"pick_first\"" module=grpc
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.103970819-05:00" level=info msg="manager selected
by agent for new session: {z80ib665fps13yuqtzqqnltr6 192.168.2.162:2377}"
module=node/agent node.id=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.104034386-05:00" level=info msg="waiting
3.770112424s before registering session" module=node/agent node.id
=iz0s4qedzrsxkdcbq6nzvxyfj
Oct 18 13:22:18 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:18.105363177-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:19 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:19.107464606-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
Oct 18 13:22:20 cen-76-alc-qa-163 dockerd:
time="2022-10-18T13:22:20.460645439-05:00" level=warning msg="grpc:
addrConn.createTransport failed to connect to {192.168.2.162:2377 <nil> 0
<nil>}. Err :connection error: desc = \"transport: Error while dialing dial
tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
module=grpc
*Host log from update attempt:*
2022-10-17 13:57:22 CDT - {
"uuid" : "71edb069-f43c-452b-8528-a0c3d6625fd1",
"counter" : 175,
"stdout" : "fatal: [192.168.2.18]: FAILED! => {\"changed\": false,
\"failures\": [], \"msg\": \"Depsolve Error occurred: \\n Problem 1:
package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
none of the providers can be installed\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-117.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-106.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-110.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-115.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-22.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-23.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-24.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-27.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-30.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-32.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-35.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-37.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-39.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-41.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-47.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-48.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-51.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-52.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-53.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-54.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-56.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-6.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-72.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-75.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-80.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-81.el8s.x86_64\\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-88.el8s.x86_64\\n - cannot install the
best update candidate for package ovirt-openvswitch-2.15-4.el8.noarch\\n -
cannot install the best update candidate for package
openvswitch2.15-2.15.0-117.el8s.x86_64\\n Problem 2: package
python3-rdo-openvswitch-2:2.17-3.el8.noarch obsoletes
python3-openvswitch2.15 < 2.17 provided by
python3-openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
openvswitch2.15-ipsec-2.15.0-119.el8s.x86_64 requires
python3-openvswitch2.15 = 2.15.0-119.el8s, but none of the providers can be
installed\\n - cannot install the best update candidate for package
python3-openvswitch2.15-2.15.0-117.el8s.x86_64\\n - cannot install the
best update candidate for package
openvswitch2.15-ipsec-2.15.0-117.el8s.x86_64\\n Problem 3: package
ovirt-openvswitch-ovn-common-2.15-4.el8.noarch requires ovn-2021, but none
of the providers can be installed\\n - package
rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
ovn-2021-21.12.0-82.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.03.0-21.el8s.x86_64\\n
- package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
by ovn-2021-21.03.0-40.el8s.x86_64\\n - package
rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
ovn-2021-21.06.0-17.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.06.0-29.el8s.x86_64\\n
- package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
by ovn-2021-21.12.0-11.el8s.x86_64\\n - cannot install the best update
candidate for package ovn-2021-21.12.0-82.el8s.x86_64\\n - cannot install
the best update candidate for package
ovirt-openvswitch-ovn-common-2.15-4.el8.noarch\\n Problem 4: package
ovirt-openvswitch-ovn-host-2.15-4.el8.noarch requires ovn-2021-host, but
none of the providers can be installed\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.12.0-82.el8s.x86_64\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.03.0-21.el8s.x86_64\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.03.0-40.el8s.x86_64\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.06.0-17.el8s.x86_64\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.06.0-29.el8s.x86_64\\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.12.0-11.el8s.x86_64\\n - cannot install the best
update candidate for package ovn-2021-host-21.12.0-82.el8s.x86_64\\n -
cannot install the best update candidate for package
ovirt-openvswitch-ovn-host-2.15-4.el8.noarch\", \"rc\": 1, \"results\":
[]}",
"start_line" : 171,
"end_line" : 172,
"runner_ident" : "e370b346-b9e8-4a0c-8595-9a717b14c901",
"event" : "runner_on_failed",
"pid" : 3151,
"created" : "2022-10-17T18:57:22.572472",
"parent_uuid" : "64006a61-d37f-a54d-f807-000000000042",
"event_data" : {
"playbook" : "ovirt-host-upgrade.yml",
"playbook_uuid" : "b4383145-d80b-4cf6-83a2-550febf56e5c",
"play" : "all",
"play_uuid" : "64006a61-d37f-a54d-f807-000000000006",
"play_pattern" : "all",
"task" : "Upgrade packages",
"task_uuid" : "64006a61-d37f-a54d-f807-000000000042",
"task_action" : "yum",
"task_args" : "",
"task_path" :
"/usr/share/ovirt-engine/ansible-runner-service-project/project/roles/ovirt-host-upgrade/tasks/main.yml:49",
"role" : "ovirt-host-upgrade",
"host" : "192.168.2.18",
"remote_addr" : "192.168.2.18",
"res" : {
"failures" : [ ],
"results" : [ ],
"rc" : 1,
"msg" : "Depsolve Error occurred: \n Problem 1: package
ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but none of
the providers can be installed\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-117.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-106.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-110.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-115.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-119.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-22.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-23.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-24.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-27.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-30.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-32.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-35.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-37.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-39.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-41.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-47.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-48.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-51.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-52.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-53.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-54.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-56.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-6.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-72.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-75.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-80.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-81.el8s.x86_64\n - package
rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
provided by openvswitch2.15-2.15.0-88.el8s.x86_64\n - cannot install the
best update candidate for package ovirt-openvswitch-2.15-4.el8.noarch\n -
cannot install the best update candidate for package
openvswitch2.15-2.15.0-117.el8s.x86_64\n Problem 2: package
python3-rdo-openvswitch-2:2.17-3.el8.noarch obsoletes
python3-openvswitch2.15 < 2.17 provided by
python3-openvswitch2.15-2.15.0-119.el8s.x86_64\n - package
openvswitch2.15-ipsec-2.15.0-119.el8s.x86_64 requires
python3-openvswitch2.15 = 2.15.0-119.el8s, but none of the providers can be
installed\n - cannot install the best update candidate for package
python3-openvswitch2.15-2.15.0-117.el8s.x86_64\n - cannot install the best
update candidate for package openvswitch2.15-ipsec-2.15.0-117.el8s.x86_64\n
Problem 3: package ovirt-openvswitch-ovn-common-2.15-4.el8.noarch requires
ovn-2021, but none of the providers can be installed\n - package
rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
ovn-2021-21.12.0-82.el8s.x86_64\n - package rdo-ovn-2:22.06-3.el8.noarch
obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.03.0-21.el8s.x86_64\n -
package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
ovn-2021-21.03.0-40.el8s.x86_64\n - package rdo-ovn-2:22.06-3.el8.noarch
obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.06.0-17.el8s.x86_64\n -
package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
ovn-2021-21.06.0-29.el8s.x86_64\n - package rdo-ovn-2:22.06-3.el8.noarch
obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.12.0-11.el8s.x86_64\n -
cannot install the best update candidate for package
ovn-2021-21.12.0-82.el8s.x86_64\n - cannot install the best update
candidate for package ovirt-openvswitch-ovn-common-2.15-4.el8.noarch\n
Problem 4: package ovirt-openvswitch-ovn-host-2.15-4.el8.noarch requires
ovn-2021-host, but none of the providers can be installed\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.12.0-82.el8s.x86_64\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.03.0-21.el8s.x86_64\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.03.0-40.el8s.x86_64\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.06.0-17.el8s.x86_64\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.06.0-29.el8s.x86_64\n - package
rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
by ovn-2021-host-21.12.0-11.el8s.x86_64\n - cannot install the best update
candidate for package ovn-2021-host-21.12.0-82.el8s.x86_64\n - cannot
install the best update candidate for package
ovirt-openvswitch-ovn-host-2.15-4.el8.noarch",
"invocation" : {
"module_args" : {
"name" : [ "*" ],
"state" : "latest",
"allow_downgrade" : false,
"autoremove" : false,
"bugfix" : false,
"cacheonly" : false,
"disable_gpg_check" : false,
"disable_plugin" : [ ],
"disablerepo" : [ ],
"download_only" : false,
"enable_plugin" : [ ],
"enablerepo" : [ ],
"exclude" : [ ],
"installroot" : "/",
"install_repoquery" : true,
"install_weak_deps" : true,
"security" : false,
"skip_broken" : false,
"update_cache" : false,
"update_only" : false,
"validate_certs" : true,
"lock_timeout" : 30,
"allowerasing" : false,
"nobest" : false,
"conf_file" : null,
"disable_excludes" : null,
"download_dir" : null,
"list" : null,
"releasever" : null
}
},
"_ansible_no_log" : false,
"changed" : false
},
"start" : "2022-10-17T18:57:19.992530",
"end" : "2022-10-17T18:57:22.572294",
"duration" : 2.579764,
"ignore_errors" : null,
"event_loop" : null,
"uuid" : "71edb069-f43c-452b-8528-a0c3d6625fd1"
}
}
*David Johnson*
LDAP auth and group members
by Jiří Sléžka
Hi,
I have configured oVirt authentication against our MicroFocus/Novell
eDirectory (edir) LDAP. It is working fine on a per-user basis. Now I am
trying to set permissions per group, but it does not seem to work.
My CRO.properties
---
include = <rfc2307-edir.properties>
vars.server = ldap.********
vars.port = 389
vars.user = cn=*******************
vars.password = *******************
pool.default.serverset.single.server = ${global:vars.server}
pool.default.serverset.single.port = ${global:vars.port}
pool.default.auth.simple.bindDN = ${global:vars.user}
pool.default.auth.simple.password = ${global:vars.password}
pool.default.ssl.startTLS = true
pool.default.socketfactory.resolver.supportIPv6 = false
sequence-init.init.100-my-edir-init-vars = my-edir-init-vars
sequence.my-edir-init-vars.010.description = set baseDN
sequence.my-edir-init-vars.010.type = var-set
sequence.my-edir-init-vars.010.var-set.variable = simple_baseDN
sequence.my-edir-init-vars.010.var-set.value = o=su
search.default.search-request.derefPolicy = ALWAYS
---
I am able to search groups in the manager, but users with per-group
permissions are unable to log in, failing with "The user *********** with
profile [CRO] is not authorized to perform login".
When I try to debug it with
ovirt-engine-extensions-tool aaa login-user --profile=CRO
--user-name=*******
I can see the common attributes (name, email, ...) in the PrincipalRecord,
but no record mentioning group membership.
The group which holds this user has the posixGroup objectClass and member
attributes which point to the DNs of the users.
There was also a similar post on this list in 2019 which unfortunately
did not end with a specific solution:
https://lists.ovirt.org/archives/list/users@ovirt.org/thread/PBQXDJGOZ2ET...
Could anyone suggest how to better debug this, or how to modify the group
search filter in my profile to work with the member attribute?
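One way to narrow this down before touching the profile is to reproduce the searches by hand with ldapsearch and see which membership attributes the directory actually returns - eDirectory typically also keeps a groupMembership attribute on the user entry, not just member on the group. A sketch with placeholder bind DN and names, using StartTLS to match the profile above:
~~~
# What membership does the group object expose? (expects posixGroup + member)
ldapsearch -x -ZZ -H ldap://ldap.example.com -D 'cn=BIND_USER,o=su' -W \
  -b 'o=su' '(&(objectClass=posixGroup)(member=cn=SOME_USER,o=su))' cn member

# And what does the user entry itself carry?
ldapsearch -x -ZZ -H ldap://ldap.example.com -D 'cn=BIND_USER,o=su' -W \
  -b 'o=su' '(cn=SOME_USER)' dn groupMembership
~~~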
Thanks in advance,
Jiri
oVirt Node 4.5.3.1 Async update
by Sandro Bonazzola
oVirt Node 4.5.3.1 Async update
On October 20th 2022 the oVirt project released an async update of oVirt
Node (4.5.3.1) delivering a fix for an OVS/OVN package conflict issue.
The update is already available on resources.ovirt.org and should land on
oVirt mirrors within 24 hours.
--
Sandro Bonazzola
MANAGER, SOFTWARE ENGINEERING, EMEA R&D PERFORMANCE & SCALE
Red Hat EMEA <https://www.redhat.com/>
sbonazzo(a)redhat.com
<https://www.redhat.com/>
*Red Hat respects your work life balance. Therefore there is no need to
answer this email out of your office hours.*
Re: Users Digest, Vol 133, Issue 36
by David Johnson
> We can try this at our next maintenance window, probably in a week or so
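For reference, combining Lev's and Ales's suggestions from the thread below, the exclude list in /etc/yum.repos.d/CentOS-oVirt-4.5.repo that ultimately worked for Brett would look roughly like this (hedged - the exact package set may differ between async releases):
~~~
[ovirt-45-centos-stream-openstack-yoga]
# ... rest of the section unchanged ...
exclude=
 ansible
 ansible-test
 rdo-openvswitch
 rdo-ovn
 rdo-ovn-host
 rdo-ovn-central
 python3-rdo-openvswitch
~~~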
> From: Lev Veyde <lveyde(a)redhat.com>
> To: Brett Maton <matonb(a)ltresources.co.uk>
> Cc: "mmoon(a)maxistechnology.com" <mmoon(a)maxistechnology.com>, "
> users(a)ovirt.org" <users(a)ovirt.org>
> Bcc:
> Date: Wed, 19 Oct 2022 17:36:59 +0300
> Subject: [ovirt-users] Re: Ovirt host update bug
> Hi,
>
> We built a new centos-release-ovirt package that is supposed to fix that
> issue (on a machine that has not been manually fixed):
>
> https://cbs.centos.org/kojifiles/packages/centos-release-ovirt45/8.7/3.el...
>
> Could somebody please test that it indeed fixes the issue?
>
> Thanks in advance,
>
> On Wed, Oct 19, 2022 at 2:27 PM Brett Maton <matonb(a)ltresources.co.uk>
> wrote:
>
>> Thanks Ales,
>>
>> That fixed it for me. I don't think *rdo-ovn-central* is mentioned in the
>> link, but that was the final missing exclude for me.
>>
>> Regards,
>> Brett
>>
>> ------------------------------
>> *From:* Ales Musil <amusil(a)redhat.com>
>> *Sent:* 19 October 2022 11:50
>> *To:* Brett Maton <matonb(a)ltresources.co.uk>
>> *Cc:* Lev Veyde <lveyde(a)redhat.com>; mmoon(a)maxistechnology.com <
>> mmoon(a)maxistechnology.com>; users(a)ovirt.org <users(a)ovirt.org>
>> *Subject:* Re: [ovirt-users] Re: Ovirt host update bug
>>
>> Hi,
>>
>> you need to specify also "rdo-ovn-host", "python3-rdo-openvswitch" and "
>> rdo-ovn-central" in the excluded.
>> See
>> https://lists.ovirt.org/archives/list/users@ovirt.org/message/RIHO32QA3NT...
>>
>> Best regards,
>> Ales
>>
>> On Wed, Oct 19, 2022 at 12:35 PM Brett Maton <matonb(a)ltresources.co.uk>
>> wrote:
>>
>> I'm seeing the same error
>>
>> Repo config:
>>
>> # grep 'ovirt-45-centos-stream-openstack-yoga'
>> /etc/yum.repos.d/CentOS-oVirt-4.5.repo -B1 -A15
>>
>> [ovirt-45-centos-stream-openstack-yoga]
>> name=CentOS Stream $releasever - oVirt 4.5 - OpenStack Yoga Repository
>> # baseurl=
>> http://mirror.centos.org/centos/$stream/cloud/$basearch/openstack-yoga/
>> mirrorlist=
>> http://mirrorlist.centos.org/?release=$stream&arch=$basearch&repo=cloud-o...
>> gpgcheck=1
>> gpgkey=https://www.centos.org/keys/RPM-GPG-KEY-CentOS-SIG-Cloud
>> enabled=1
>> module_hotfixes=1
>> exclude=
>> # ansible-2.9.27-4.el8 shipped in yoga repo is breaking dependencies on
>> oVirt side
>> ansible
>> ansible-test
rdo-openvswitch
rdo-ovn
>>
>>
>> Update attempt:
>>
>> # yum clean all
>> 187 files removed
>>
>> # dnf update
>> CentOS-8-stream - Ceph Pacific
>> 781 kB/s | 456 kB 00:00
>> CentOS-8-stream - Gluster 10
>> 175 kB/s | 40 kB 00:00
>> CentOS-8 - NFV OpenvSwitch
>> 364 kB/s | 168 kB 00:00
>> CentOS-OpsTools - collectd
>> 169 kB/s | 41 kB 00:00
>> CentOS Stream 8 - AppStream
>> 24 MB/s | 25 MB 00:01
>> CentOS Stream 8 - BaseOS
>> 23 MB/s | 25 MB 00:01
>> CentOS Stream 8 - Extras
>> 39 kB/s | 18 kB 00:00
>> CentOS Stream 8 - Extras common packages
>> 24 kB/s | 4.9 kB 00:00
>> CentOS Stream 8 - PowerTools
>> 9.7 MB/s | 5.1 MB 00:00
>> CentOS Stream 8 - oVirt 4.5
>> 4.1 MB/s | 1.2 MB 00:00
>> CentOS Stream 8 - oVirt 4.5 - OpenStack Yoga Repository
>> 3.4 MB/s | 2.2 MB 00:00
>> oVirt upstream for CentOS Stream 8 - oVirt 4.5
>> 47 kB/s | 408 kB 00:08
>> Extra Packages for Enterprise Linux 8 - x86_64
>> 11 MB/s | 13 MB 00:01
>> Extra Packages for Enterprise Linux Modular 8 - x86_64
>> 830 kB/s | 733 kB 00:00
>> Extra Packages for Enterprise Linux 8 - Next - x86_64
>> 1.5 MB/s | 1.4 MB 00:00
>> Error:
>> Problem 1: package rdo-ovn-central-2:22.06-3.el8.noarch requires rdo-ovn
>> = 2:22.06-3.el8, but none of the providers can be installed
>> - cannot install the best update candidate for package
>> ovn-2021-central-21.12.0-82.el8s.x86_64
>> - package rdo-ovn-2:22.06-3.el8.noarch is filtered out by exclude
>> filtering
>> Problem 2: package python3-rdo-openvswitch-2:2.17-3.el8.noarch requires
>> rdo-openvswitch = 2:2.17-3.el8, but none of the providers can be installed
>> - cannot install the best update candidate for package
>> python3-openvswitch2.15-2.15.0-119.el8s.x86_64
>> - package rdo-openvswitch-2:2.17-3.el8.noarch is filtered out by
>> exclude filtering
>> (try to add '--skip-broken' to skip uninstallable packages or '--nobest'
>> to use not only best candidate packages)
>>
>>
>> Regards,
>> Brett
>> ------------------------------
>> *From:* Lev Veyde <lveyde(a)redhat.com>
>> *Sent:* 19 October 2022 11:14
>> *To:* mmoon(a)maxistechnology.com <mmoon(a)maxistechnology.com>
>> *Cc:* users(a)ovirt.org <users(a)ovirt.org>
>> *Subject:* [ovirt-users] Re: Ovirt host update bug
>>
>> Checked with the networking and looks like the issue is with the
>> conflicting OVS/OVN packages released on the OpenStack channel.
>>
>> Fixing that on our side will require releasing a new version, but one can
>> try to fix it manually by modifying the
>> /etc/yum.repos.d/CentOS-oVirt-4.5.repo file.
>>
>> 1. Find the [ovirt-45-centos-stream-openstack-yoga] section
>> 2. At the end of the section look for ansible-test under exclude=
>> 3. Add *rdo-openvswitch* and *rdo-ovn*, each on their own line, in the same
>> way as *ansible* and *ansible-test*, which already exist
>>
>>
>>
>> On Wed, Oct 19, 2022 at 1:49 AM <mmoon(a)maxistechnology.com> wrote:
>>
>> Hey I'm having an issue, curious if anyone can help
>>
>> I'm trying to update my ovirt cluster from 4.5.2.4-1.el8 to 4.5.3.1 but
>> have run into a problem with the update installer.
>>
>> The environment is:
>>
>> Static hostname: ovirt2.xxx.xxx
>> Icon name: computer-desktop
>> Chassis: desktop
>> Machine ID: 0eb1fcff65214fb399c9d2ffaf1f5a29
>> Boot ID: dbc7438e4d464209ac79452410cf60e7
>> Operating System: CentOS Stream 8
>> CPE OS Name: cpe:/o:centos:centos:8
>> Kernel: Linux 4.18.0-408.el8.x86_64
>> Architecture: x86-64
>>
>>
>>
>> Filesystem 1K-blocks Used Available Use% Mounted on
>> devtmpfs 8023804 0 8023804 0% /dev
>> tmpfs 8055520 24 8055496 1% /dev/shm
>> tmpfs 8055520 99708 7955812 2% /run
>> tmpfs 8055520 0 8055520 0% /sys/fs/cgroup
>> /dev/mapper/cs-root 73364480 11401568 61962912 16% /
>> /dev/mapper/cs-home 166691304 1467260 165224044 1% /home
>> /dev/sda2 1038336 262972 775364 26% /boot
>> /dev/sda1 613184 7416 605768 2% /boot/efi
>> tmpfs 1611104 12 1611092 1% /run/user/42
>> tmpfs 1611104 4 1611100 1% /run/user/1000
>>
>>
>> There are 3 hosts, which can all detect and begin the update, getting
>> most of the way through it, before failing and returning it to a state of
>> non-operation. The log file says that the host is unable to resolve the
>> virtual switch dependency
>> log excerpt:
>> "stdout" : "fatal: [192.168.2.18]: FAILED! => {\"changed\": false,
>> \"failures\": [], \"msg\": \"Depsolve Error occurred: \\n Problem 1:
>> package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
>> none of the providers can be installed\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-117.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-106.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-110.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-115.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2
>> .17 provided by openvswitch2.15-2.15.0-22.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-23.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-24.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-27.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-30.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-32.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-35.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-37.el8s.x86_64\\n - packag
>> e rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-39.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-41.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-47.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-48.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-51.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-52.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-53.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 <
>> 2.17 provided by openvswitch2.15-2.15.0-54.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-56.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-6.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-72.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-75.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-80.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-81.el8s.x86_64\\n - package
>> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
>> provided by openvswitch2.15-2.15.0-88.el8s.x86_64\\n - cannot
>> install the best update candidate for package
>> ovirt-openvswitch-2.15-4.el8.noarch\\n - cannot install the best update
>> candidate for package openvswitch2.15-2.15.0-117.el8s.x86_64\\n Problem 2:
>> package python3-rdo-openvswitch-2:2.17-3.el8.noarch obsoletes
>> python3-openvswitch2.15 < 2.17 provided by
>> python3-openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
>> openvswitch2.15-ipsec-2.15.0-119.el8s.x86_64 requires
>> python3-openvswitch2.15 = 2.15.0-119.el8s, but none of the providers can be
>> installed\\n - cannot install the best update candidate for package
>> python3-openvswitch2.15-2.15.0-117.el8s.x86_64\\n - cannot install the
>> best update candidate for package
>> openvswitch2.15-ipsec-2.15.0-117.el8s.x86_64\\n Problem 3: package
>> ovirt-openvswitch-ovn-common-2.15-4.el8.noarch requires ovn-2021, but none
>> of the providers can be installed\\n - package
>> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
>> ovn-2021-21.12.0-82.el8s.x86_64\\n - package
>> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
>> ovn-2021-21.03.0-21.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
>> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.03.0-40.el8s.x86_64\\n
>> - package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
>> by ovn-2021-21.06.0-17.el8s.x86_64\\n - package
>> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
>> ovn-2021-21.06.0-29.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
>> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.12.0-11.el8s.x86_64\\n
>> - cannot install the best update candidate for package
>> ovn-2021-21.12.0-82.el8s.x86_64\\n - cannot install the best update
>> candidate for package ovirt-openvswitch-ovn-common-2.15-4.el8.noarch\\n
>> Problem 4: package ovirt-openvswitch-ovn-host-2.15-4.el8.noarch requires
>> ovn-2021-host, but none of the providers can be installed\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
>> by ovn-2021-host-21.12.0-82.el8s.x86_64\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06
>> provided by ovn-2021-host-21.03.0-21.el8s.x86_64\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
>> by ovn-2021-host-21.03.0-40.el8s.x86_64\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
>> by ovn-2021-host-21.06.0-17.el8s.x86_64\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
>> by ovn-2021-host-21.06.0-29.el8s.x86_64\\n - package
>> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
>> by ovn-2021-host-21.12.0-11.el8s.x86_64\\n - cannot install the best
>> update candidate for package ovn-2021-host-21.12.0-82.el8s.x86_64\\n -
>> cannot install the best update candidate for package
>> ovirt-openvswitch-ovn-host-2.15-4.el8.noarch\", \"rc\": 1, \"results\":
>> []}",
>>
>> On one host, when I attempt to open the oVirt web console from the GUI it
>> won't open, and virtual machines on that particular host are unable to open
>> an oVirt web console either, citing a handshake error.
>> log excerpt for host web console:
>>
>> Oct 17 15:58:29 ovirt-host-05 journal[96215]: Domain id=24
>> name='cen-79-dmz-02' uuid=82fefcfa-bce0-4397-a575-48d3d08fdb61 is tainted:
>> custom-ga-command
>> Oct 17 15:58:29 ovirt-host-05 journal[96215]: Domain id=25
>> name='win-10-utl' uuid=11f71942-1d88-40a0-a6c5-45e7718afbcf is tainted:
>> custom-ga-command
>>
>> Oct 17 03:37:01 ovirt-host-05 ovs-appctl[37436]:
>> ovs|00001|unixctl|WARN|failed to connect to
>> /var/run/ovn/ovn-controller.18617.ctl
>>
>> Thank you in advance
2 years, 2 months
Re: Ovirt host update bug
by Lev Veyde
Hi,
We built a new centos-release-ovirt package that is supposed to fix that
issue (on machines that haven't already been fixed manually):
https://cbs.centos.org/kojifiles/packages/centos-release-ovirt45/8.7/3.el...
Could somebody please test that it indeed fixes the issue?
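A rough test sequence might look like this (a sketch only; the exact RPM file
name is elided since the CBS link above is truncated, and dnf can install
directly from a URL or a downloaded file):

# dnf install <centos-release-ovirt45 RPM URL from the CBS link above>
# dnf clean all
# dnf update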
Thanks in advance,
On Wed, Oct 19, 2022 at 2:27 PM Brett Maton <matonb(a)ltresources.co.uk>
wrote:
> Thanks Ales,
>
> That fixed it for me. I don't think *rdo-ovn-central* is mentioned in the
> link, but it was the final missing exclude for me.
>
> Regards,
> Brett
>
> ------------------------------
> *From:* Ales Musil <amusil(a)redhat.com>
> *Sent:* 19 October 2022 11:50
> *To:* Brett Maton <matonb(a)ltresources.co.uk>
> *Cc:* Lev Veyde <lveyde(a)redhat.com>; mmoon(a)maxistechnology.com <
> mmoon(a)maxistechnology.com>; users(a)ovirt.org <users(a)ovirt.org>
> *Subject:* Re: [ovirt-users] Re: Ovirt host update bug
>
> Hi,
>
> you also need to specify "rdo-ovn-host", "python3-rdo-openvswitch" and
> "rdo-ovn-central" in the excludes.
> See
> https://lists.ovirt.org/archives/list/users@ovirt.org/message/RIHO32QA3NT...
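>
> With all of those added, the complete exclude list for the
> [ovirt-45-centos-stream-openstack-yoga] section would presumably be (a
> sketch assembled from this thread, not a verified config; keep the
> continuation lines of exclude= indented):
>
> exclude=
>  ansible
>  ansible-test
>  rdo-openvswitch
>  rdo-ovn
>  rdo-ovn-host
>  python3-rdo-openvswitch
>  rdo-ovn-central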
>
> Best regards,
> Ales
>
> On Wed, Oct 19, 2022 at 12:35 PM Brett Maton <matonb(a)ltresources.co.uk>
> wrote:
>
> I'm seeing the same error
>
> Repo config:
>
> # grep 'ovirt-45-centos-stream-openstack-yoga' /etc/yum.repos.d/CentOS-oVirt-4.5.repo -B1 -A15
>
> [ovirt-45-centos-stream-openstack-yoga]
> name=CentOS Stream $releasever - oVirt 4.5 - OpenStack Yoga Repository
> # baseurl=http://mirror.centos.org/centos/$stream/cloud/$basearch/openstack-yoga/
> mirrorlist=http://mirrorlist.centos.org/?release=$stream&arch=$basearch&repo=cloud-o...
> gpgcheck=1
> gpgkey=https://www.centos.org/keys/RPM-GPG-KEY-CentOS-SIG-Cloud
> enabled=1
> module_hotfixes=1
> exclude=
> # ansible-2.9.27-4.el8 shipped in yoga repo is breaking dependencies on oVirt side
> ansible
> ansible-test
> rdo-openvswitch
> rdo-ovn
>
>
> Update attempt:
>
> # yum clean all
> 187 files removed
>
> # dnf update
> CentOS-8-stream - Ceph Pacific                           781 kB/s | 456 kB  00:00
> CentOS-8-stream - Gluster 10                             175 kB/s |  40 kB  00:00
> CentOS-8 - NFV OpenvSwitch                               364 kB/s | 168 kB  00:00
> CentOS-OpsTools - collectd                               169 kB/s |  41 kB  00:00
> CentOS Stream 8 - AppStream                               24 MB/s |  25 MB  00:01
> CentOS Stream 8 - BaseOS                                  23 MB/s |  25 MB  00:01
> CentOS Stream 8 - Extras                                  39 kB/s |  18 kB  00:00
> CentOS Stream 8 - Extras common packages                  24 kB/s | 4.9 kB  00:00
> CentOS Stream 8 - PowerTools                             9.7 MB/s | 5.1 MB  00:00
> CentOS Stream 8 - oVirt 4.5                              4.1 MB/s | 1.2 MB  00:00
> CentOS Stream 8 - oVirt 4.5 - OpenStack Yoga Repository  3.4 MB/s | 2.2 MB  00:00
> oVirt upstream for CentOS Stream 8 - oVirt 4.5            47 kB/s | 408 kB  00:08
> Extra Packages for Enterprise Linux 8 - x86_64            11 MB/s |  13 MB  00:01
> Extra Packages for Enterprise Linux Modular 8 - x86_64   830 kB/s | 733 kB  00:00
> Extra Packages for Enterprise Linux 8 - Next - x86_64    1.5 MB/s | 1.4 MB  00:00
> Error:
> Problem 1: package rdo-ovn-central-2:22.06-3.el8.noarch requires rdo-ovn
> = 2:22.06-3.el8, but none of the providers can be installed
> - cannot install the best update candidate for package
> ovn-2021-central-21.12.0-82.el8s.x86_64
> - package rdo-ovn-2:22.06-3.el8.noarch is filtered out by exclude
> filtering
> Problem 2: package python3-rdo-openvswitch-2:2.17-3.el8.noarch requires
> rdo-openvswitch = 2:2.17-3.el8, but none of the providers can be installed
> - cannot install the best update candidate for package
> python3-openvswitch2.15-2.15.0-119.el8s.x86_64
> - package rdo-openvswitch-2:2.17-3.el8.noarch is filtered out by exclude
> filtering
> (try to add '--skip-broken' to skip uninstallable packages or '--nobest'
> to use not only best candidate packages)
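>
> As an aside, a quick way to list which rdo-* packages that repo actually
> offers (and therefore what may still need excluding) is a standard dnf
> repoquery; a sketch:
>
> # dnf repoquery --disablerepo='*' \
>     --enablerepo=ovirt-45-centos-stream-openstack-yoga 'rdo-*' 'python3-rdo-*'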
>
>
> Regards,
> Brett
> ------------------------------
> *From:* Lev Veyde <lveyde(a)redhat.com>
> *Sent:* 19 October 2022 11:14
> *To:* mmoon(a)maxistechnology.com <mmoon(a)maxistechnology.com>
> *Cc:* users(a)ovirt.org <users(a)ovirt.org>
> *Subject:* [ovirt-users] Re: Ovirt host update bug
>
> Checked with the networking team, and it looks like the issue is caused by
> conflicting OVS/OVN packages released on the OpenStack channel.
>
> Fixing that on our side will require releasing a new version, but one can
> try to fix it manually by modifying the
> /etc/yum.repos.d/CentOS-oVirt-4.5.repo file.
>
> 1. Find the [ovirt-45-centos-stream-openstack-yoga] section
> 2. At the end of the section, look for ansible-test under exclude=
> 3. Add *rdo-openvswitch* and *rdo-ovn*, each on its own line, in the same
> way as the *ansible* and *ansible-test* entries that already exist (see
> the sketch below)
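>
> After step 3 the tail of that section should read roughly as follows (a
> sketch; it matches the repo config quoted earlier on this page, and the
> continuation lines of exclude= must stay indented):
>
> exclude=
>  # ansible-2.9.27-4.el8 shipped in yoga repo is breaking dependencies on oVirt side
>  ansible
>  ansible-test
>  rdo-openvswitch
>  rdo-ovn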
>
>
>
> On Wed, Oct 19, 2022 at 1:49 AM <mmoon(a)maxistechnology.com> wrote:
>
> Hey, I'm having an issue; curious if anyone can help.
>
> I'm trying to update my ovirt cluster from 4.5.2.4-1.el8 to 4.5.3.1 but
> have run into a problem with the update installer.
>
> The environment is:
>
> Static hostname: ovirt2.xxx.xxx
> Icon name: computer-desktop
> Chassis: desktop
> Machine ID: 0eb1fcff65214fb399c9d2ffaf1f5a29
> Boot ID: dbc7438e4d464209ac79452410cf60e7
> Operating System: CentOS Stream 8
> CPE OS Name: cpe:/o:centos:centos:8
> Kernel: Linux 4.18.0-408.el8.x86_64
> Architecture: x86-64
>
>
>
> Filesystem 1K-blocks Used Available Use% Mounted on
> devtmpfs 8023804 0 8023804 0% /dev
> tmpfs 8055520 24 8055496 1% /dev/shm
> tmpfs 8055520 99708 7955812 2% /run
> tmpfs 8055520 0 8055520 0% /sys/fs/cgroup
> /dev/mapper/cs-root 73364480 11401568 61962912 16% /
> /dev/mapper/cs-home 166691304 1467260 165224044 1% /home
> /dev/sda2 1038336 262972 775364 26% /boot
> /dev/sda1 613184 7416 605768 2% /boot/efi
> tmpfs 1611104 12 1611092 1% /run/user/42
> tmpfs 1611104 4 1611100 1% /run/user/1000
>
>
> There are 3 hosts, all of which detect and begin the update and get most of
> the way through it before failing, which leaves the host non-operational.
> The log file says that the host is unable to resolve the virtual switch
> dependencies.
> log excerpt:
> "stdout" : "fatal: [192.168.2.18]: FAILED! => {\"changed\": false,
> \"failures\": [], \"msg\": \"Depsolve Error occurred: \\n Problem 1:
> package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
> none of the providers can be installed\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-117.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-106.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-110.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-115.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-22.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-23.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-24.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-27.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-30.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-32.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-35.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-37.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-39.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-41.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-47.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-48.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-51.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-52.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-53.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-54.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-56.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-6.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-72.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-75.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-80.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-81.el8s.x86_64\\n - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-88.el8s.x86_64\\n - cannot
> install the best update candidate for package
> ovirt-openvswitch-2.15-4.el8.noarch\\n - cannot install the best update
> candidate for package openvswitch2.15-2.15.0-117.el8s.x86_64\\n Problem 2:
> package python3-rdo-openvswitch-2:2.17-3.el8.noarch obsoletes
> python3-openvswitch2.15 < 2.17 provided by
> python3-openvswitch2.15-2.15.0-119.el8s.x86_64\\n - package
> openvswitch2.15-ipsec-2.15.0-119.el8s.x86_64 requires
> python3-openvswitch2.15 = 2.15.0-119.el8s, but none of the providers can be
> installed\\n - cannot install the best update candidate for package
> python3-openvswitch2.15-2.15.0-117.el8s.x86_64\\n - cannot install the
> best update candidate for package
> openvswitch2.15-ipsec-2.15.0-117.el8s.x86_64\\n Problem 3: package
> ovirt-openvswitch-ovn-common-2.15-4.el8.noarch requires ovn-2021, but none
> of the providers can be installed\\n - package
> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
> ovn-2021-21.12.0-82.el8s.x86_64\\n - package
> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
> ovn-2021-21.03.0-21.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.03.0-40.el8s.x86_64\\n
> - package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
> by ovn-2021-21.06.0-17.el8s.x86_64\\n - package
> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
> ovn-2021-21.06.0-29.el8s.x86_64\\n - package rdo-ovn-2:22.06-3.el8.noarch
> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.12.0-11.el8s.x86_64\\n
> - cannot install the best update candidate for package
> ovn-2021-21.12.0-82.el8s.x86_64\\n - cannot install the best update
> candidate for package ovirt-openvswitch-ovn-common-2.15-4.el8.noarch\\n
> Problem 4: package ovirt-openvswitch-ovn-host-2.15-4.el8.noarch requires
> ovn-2021-host, but none of the providers can be installed\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.12.0-82.el8s.x86_64\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06
> provided by ovn-2021-host-21.03.0-21.el8s.x86_64\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.03.0-40.el8s.x86_64\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.06.0-17.el8s.x86_64\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.06.0-29.el8s.x86_64\\n - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.12.0-11.el8s.x86_64\\n - cannot install the best
> update candidate for package ovn-2021-host-21.12.0-82.el8s.x86_64\\n -
> cannot install the best update candidate for package
> ovirt-openvswitch-ovn-host-2.15-4.el8.noarch\", \"rc\": 1, \"results\":
> []}",
>
> On one host, when I attempt to open the oVirt web console from the GUI it
> won't open, and virtual machines on that particular host are unable to open
> an oVirt web console either, citing a handshake error.
> log excerpt for host web console:
>
> Oct 17 15:58:29 ovirt-host-05 journal[96215]: Domain id=24
> name='cen-79-dmz-02' uuid=82fefcfa-bce0-4397-a575-48d3d08fdb61 is tainted:
> custom-ga-command
> Oct 17 15:58:29 ovirt-host-05 journal[96215]: Domain id=25
> name='win-10-utl' uuid=11f71942-1d88-40a0-a6c5-45e7718afbcf is tainted:
> custom-ga-command
>
> Oct 17 03:37:01 ovirt-host-05 ovs-appctl[37436]:
> ovs|00001|unixctl|WARN|failed to connect to
> /var/run/ovn/ovn-controller.18617.ctl
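>
> A plausible first check here (standard commands, not a confirmed fix) is to
> see whether ovn-controller and Open vSwitch are actually running on the
> host:
>
> # systemctl status ovn-controller openvswitch
> # ovs-vsctl show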
>
> Thank you in advance
--
Lev Veyde
Senior Software Engineer, RHCE | RHCVA | MCITP
Red Hat Israel
<https://www.redhat.com>
lev(a)redhat.com | lveyde(a)redhat.com
<https://red.ht/sig>
TRIED. TESTED. TRUSTED. <https://redhat.com/trusted>
2 years, 2 months
Engine's certification is about to expire at .... Please renew the engine's certification.
by nicolas@devels.es
Hi,
I'm running oVirt 4.4 and recently I got a message in the events list
like this:
Engine's certification is about to expire at 2022-10-30. Please renew
the engine's certification.
What exactly does that mean? And how can it be renewed?
I'm using a custom TLS certificate for both web access and the websocket
proxy. Does it need to be renewed anyway?
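For reference, the expiry date of the engine's internal certificate can be
checked with a standard openssl call (a sketch; the path below is the default
engine PKI location on 4.4 and is an assumption for customized setups):

# openssl x509 -in /etc/pki/ovirt-engine/certs/engine.cer -noout -enddate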
Thanks.
2 years, 2 months