solved by using the master repo
https://www.ovirt.org/download/
As discussed on the oVirt Users mailing list, we suggest that the user
community use the oVirt master snapshot repositories, which ensures that
the latest fixes for platform regressions are promptly available.
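For reference, enabling the master snapshot repository on a host usually boils down to installing the release package and updating (the exact RPM URL below is taken from the oVirt download instructions and may change over time; verify it on the download page linked above before running):

```shell
# Install the oVirt master snapshot release package.
# NOTE: verify this URL on https://www.ovirt.org/download/ first,
# since snapshot repository locations can move.
sudo dnf install -y https://resources.ovirt.org/pub/yum-repo/ovirt-release-master.rpm

# Pull in the updated packages (nmstate, vdsm, ...) from the snapshot repos.
sudo dnf update -y
```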
Marek
On 2024-07-11 at 18:01, marek wrote:
hi,
today i tried to add a new hypervisor - rocky 9.4
ovirt engine - 4.5
the ovirtmgmt iface is not functional when i try to add the host to the
cluster
it exists for a while (seconds), but then i see the message "ovirtmgmt:
port 1(eno3) entered disabled state", the ovirtmgmt iface is gone
and the network is unreachable
i have working hypervisors based on rocky 9.3
i tried to downgrade nmstate to the version from 9.3
dnf downgrade \
  https://dl.rockylinux.org/vault/rocky/9.3/AppStream/x86_64/os/Packages/p/python3-libnmstate-2.2.23-1.el9_3.x86_64.rpm \
  https://dl.rockylinux.org/vault/rocky/9.3/AppStream/x86_64/os/Packages/n/nmstate-2.2.23-1.el9_3.x86_64.rpm \
  https://dl.rockylinux.org/vault/rocky/9.3/AppStream/x86_64/os/Packages/n/nmstate-libs-2.2.23-1.el9_3.x86_64.rpm
any ideas/tips on how to debug the ovirtmgmt iface creation process?
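A few standard commands are useful for watching what happens live while the engine adds the host and vdsm creates the bridge (paths below are the default vdsm log locations; run these on the new host in a second terminal):

```shell
# supervdsm performs the actual network changes as root; follow its log:
tail -f /var/log/vdsm/supervdsm.log

# Watch NetworkManager while the ovirtmgmt bridge is being created:
journalctl -f -u NetworkManager

# Show nmstate's current view of the NIC being enslaved:
nmstatectl show eno3

# After the failure, inspect the bridge and its port state:
ip -d link show ovirtmgmt
bridge link show
```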
vdsm.log
2024-07-11 17:42:07,305+0200 INFO (jsonrpc/6) [api.network] START
setupNetworks(networks={'ovirtmgmt': {'netmask': '255.255.255.0',
'ipv6autoconf': False, 'nic': 'eno3', 'bridged': 'true', 'ipaddr':
'10.10.10.237', 'defaultRoute': True, 'dhcpv6': False, 'STP':
'no', 'mtu': 1500, 'switch': 'legacy'}}, bondings={},
options={'connectivityTimeout': 120, 'commitOnSuccess': True,
'connectivityCheck': 'true'}) from=::ffff:10.10.12.12,33842
(api:31)
2024-07-11 17:43:07,312+0200 WARN (vdsm.Scheduler) [Executor]
Worker blocked: <Worker name=jsonrpc/6 running <Task
<JsonRpcTask {'jsonrpc': '2.0', 'method': 'Host.setupNetworks',
'params': {'networks': {'ovirtmgmt': {'netmask': '255.255.255.0',
'ipv6autoconf': False, 'nic': 'eno3', 'bridged': 'true', 'ipaddr':
'10.10.10.237', 'defaultRoute': True, 'dhcpv6': False, 'STP':
'no', 'mtu': 1500, 'switch': 'legacy'}}, 'bondings': {},
'options': {'connectivityTimeout': 120, 'commitOnSuccess': True,
'connectivityCheck': 'true'}}, 'id':
'65b89a79-0e7a-4a05-b310-f9903bdec2ce'} at 0x7f5e8c245b20>
timeout=60, duration=60.01 at 0x7f5e8c245df0> task#=0 at
0x7f5e8c27eaf0>, traceback:
File: "/usr/lib64/python3.9/threading.py", line 937, in _bootstrap
self._bootstrap_inner()
File: "/usr/lib64/python3.9/threading.py", line 980, in
_bootstrap_inner
self.run()
File: "/usr/lib64/python3.9/threading.py", line 917, in run
self._target(*self._args, **self._kwargs)
File:
"/usr/lib/python3.9/site-packages/vdsm/common/concurrent.py", line
243, in run
ret = func(*args, **kwargs)
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line
284, in _run
self._execute_task()
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line
298, in _execute_task
task()
File: "/usr/lib/python3.9/site-packages/vdsm/executor.py", line
374, in __call__
self._callable()
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py",
line 253, in __call__
self._handler(self._ctx, self._req)
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py",
line 296, in _serveRequest
response = self._handle_request(req, ctx)
File: "/usr/lib/python3.9/site-packages/yajsonrpc/__init__.py",
line 338, in _handle_request
res = method(**params)
File: "/usr/lib/python3.9/site-packages/vdsm/rpc/Bridge.py", line
186, in _dynamicMethod
result = fn(*methodArgs)
File: "<decorator-gen-508>", line 2, in setupNetworks
File: "/usr/lib/python3.9/site-packages/vdsm/common/api.py", line
33, in method
ret = func(*args, **kwargs)
File: "/usr/lib/python3.9/site-packages/vdsm/API.py", line 1576,
in setupNetworks
supervdsm.getProxy().setupNetworks(networks, bondings, options)
File: "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py",
line 38, in __call__
return callMethod()
File: "/usr/lib/python3.9/site-packages/vdsm/common/supervdsm.py",
line 35, in <lambda>
getattr(self._supervdsmProxy._svdsm, self._funcName)(*args,
File: "<string>", line 2, in setupNetworks
File: "/usr/lib64/python3.9/multiprocessing/managers.py", line
810, in _callmethod
kind, result = conn.recv()
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line
254, in recv
buf = self._recv_bytes()
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line
418, in _recv_bytes
buf = self._recv(4)
File: "/usr/lib64/python3.9/multiprocessing/connection.py", line
383, in _recv
chunk = read(handle, remaining) (executor:345)
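Reading the traceback: the jsonrpc worker is blocked in conn.recv(), i.e. vdsm is waiting for a reply from supervdsm, which carries out the actual setupNetworks call with root privileges. So the interesting errors are usually on the supervdsm side, around the same timestamp, e.g.:

```shell
# Is supervdsmd alive and responsive?
systemctl status supervdsmd

# Look for errors/tracebacks near the setupNetworks timestamp
# in the default supervdsm log location:
grep -iE -A5 'error|traceback' /var/log/vdsm/supervdsm.log | tail -n 50
```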
installed ovirt packages on the host:
centos-release-ovirt45-9.2-1.el9s.noarch
ovirt-vmconsole-1.0.9-3.el9.noarch
ovirt-imageio-common-2.5.0-1.el9.x86_64
python3-ovirt-engine-sdk4-4.6.2-1.el9.x86_64
ovirt-openvswitch-ovn-2.17-1.el9.noarch
ovirt-openvswitch-ovn-common-2.17-1.el9.noarch
ovirt-imageio-client-2.5.0-1.el9.x86_64
ovirt-imageio-daemon-2.5.0-1.el9.x86_64
ovirt-openvswitch-ovn-host-2.17-1.el9.noarch
ovirt-vmconsole-host-1.0.9-3.el9.noarch
ovirt-python-openvswitch-2.17-1.el9.noarch
ovirt-openvswitch-2.17-1.el9.noarch
python3.11-ovirt-imageio-common-2.5.0-1.el9.x86_64
python3.11-ovirt-engine-sdk4-4.6.2-1.el9.x86_64
python3.11-ovirt-imageio-client-2.5.0-1.el9.x86_64
ovirt-openvswitch-ipsec-2.17-1.el9.noarch
python3-ovirt-setup-lib-1.3.3-1.el9.noarch
ovirt-ansible-collection-3.2.0-1.el9.noarch
ovirt-hosted-engine-ha-2.5.1-1.el9.noarch
ovirt-provider-ovn-driver-1.2.36-1.el9.noarch
ovirt-host-dependencies-4.5.0-3.el9.x86_64
ovirt-hosted-engine-setup-2.7.1-1.el9.noarch
ovirt-host-4.5.0-3.el9.x86_64
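Since the rocky 9.3 hypervisors work, diffing the network-related package versions between a working host and the failing one narrows the regression down quickly; the same query run on both hosts can be compared with diff:

```shell
# Run on both a working (9.3) and the failing (9.4) host, then diff the output:
rpm -qa 'nmstate*' 'python3-libnmstate*' 'vdsm*' 'NetworkManager*' | sort
```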