Hosted engine setup failed
by Fedele Stabile
Good morning,
I have a freshly installed oVirt Node v4.5 host and I would like to install the engine from the terminal, using the command hosted-engine --deploy.
The host has an IP on 160.97.xx and I want the engine on the same network (160.97.xx).
The installation seems to go well, but at the end it
exits, leaving the hosted engine running on 192.168.222.x.
The error seems to be here:
2023-04-25 06:28:18,953+0200 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 ignored: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'local_vm_ip' is undefined\n\nThe error appears to be in '/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/sync_on_engine_machine.yml': line 2, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n---\n- name: Set the name for add_host\n ^ here\n"}
....
....
2023-04-25 06:28:19,757+0200 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 ignored: [localhost]: FAILED! => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
2023-04-25 06:28:19,857+0200 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Sync on engine machine]
2023-04-25 06:28:19,958+0200 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 {'msg': "The field 'delegate_to' has an invalid value, which includes an undefined variable. The error was: 'dict object' has no attribute 'engine'\n\nThe error appears to be in '/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/sync_on_engine_machine.yml': line 7, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n import_tasks: add_engine_as_ansible_host.yml\n- name: Sync on engine machine\n ^ here\n", '_ansible_no_log': None}
2023-04-25 06:28:20,058+0200 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 ignored: [localhost]: FAILED! => {"msg": "The field 'delegate_to' has an invalid value, which includes an undefined variable. The error was: 'dict object' has no attribute 'engine'\n\nThe error appears to be in '/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/sync_on_engine_machine.yml': line 7, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n import_tasks: add_engine_as_ansible_host.yml\n- name: Sync on engine machine\n ^ here\n"}
Help me, please
New host deployment failed: Task Configure OVN for oVirt failed to execute.
by Xavier Lauzon
Hi there,
I have been trying to deploy a new host to my cluster and have been having some difficulties with one part of the installation. I've tried to add this host a few different times, and each time I get the same error.
Current troubleshooting steps tried:
Used three different distros: CentOS 9, oVirt Node 4.5.5, Rocky Linux 9.1
Used two separate machines with different network card configurations.
Used both the stable and nightly branches.
Workarounds tried:
Manually configuring the host networks through the "Network Interfaces" tab.
Manually activated the host after the install failed - host is set to active state.
This workaround seems to work; however, whenever any VMs are migrated to the host, they fail with the error 'Fatal exception'.
Here is the error log:
2023-04-24 15:21:11 UTC - TASK [ovirt-provider-ovn-driver : Configure OVN for oVirt] *********************
2023-04-24 15:21:14 UTC - {
"uuid" : "2b0f6916-ce5d-4f94-b153-972c1d7c5c38",
"counter" : 423,
"stdout" : "fatal: [node.example.org]: FAILED! => {\"changed\": true, \"cmd\": [\"vdsm-tool\", \"ovn-config\", \"10.0.0.2\", \"node.example.org\"], \"delta\": \"0:00:02.351900\", \"end\": \"2023-04-24 11:21:11.968404\", \"msg\": \"non-zero return code\", \"rc\": 1, \"start\": \"2023-04-24 11:21:09.616504\", \"stderr\": \"Traceback (most recent call last):\\n File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 117, in get_network\\n return networks[net_name]\\nKeyError: 'node.example.org'\\n\\nDuring handling of the above exception, another exception occurred:\\n\\nTraceback (most recent call last):\\n File \\\"/usr/bin/vdsm-tool\\\", line 195, in main\\n return tool_command[cmd][\\\"command\\\"](*args)\\n File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 63, in ovn_config\\n ip_address = get_ip_addr(get_network(network_caps(), net_name))\\n File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 119,
in get_network\\n raise NetworkNotFoundError(net_name)\\nvdsm.tool.ovn_config.NetworkNotFoundError: node.example.org\", \"stderr_lines\": [\"Traceback (most recent call last):\", \" File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 117, in get_network\", \"
return networks[net_name]\", \"KeyError: 'node.example.org'\", \"\", \"During handling of the above exception, another exception occurred:\", \"\", \"Traceback (most recent call last):\", \" File \\\"/usr/bin/vdsm-tool\\\", line 195, in main\", \" return tool_command[cmd][\\\"command\\\"](*args)\", \" File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 63, in ovn_config\", \" ip_address = get_ip_addr(get_network(network_caps(), net_name))\", \" File \\\"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\\\", line 119, in get_network\", \" raise NetworkNotFoundError(net_name)\", \"vdsm.tool.ovn_config.NetworkNotFoundError: node.example.org\"], \"stdout\": \"\", \"stdout_lines\": []}",
"start_line" : 416,
"end_line" : 417,
"runner_ident" : "a36144c0-97bc-4170-8f9e-1c2fbc10b7c3",
"event" : "runner_on_failed",
"pid" : 91786,
"created" : "2023-04-24T15:21:11.993278",
"parent_uuid" : "00163e19-a0e7-c330-a021-000000000041",
"event_data" : {
"playbook" : "ovirt-host-deploy.yml",
"playbook_uuid" : "86e8189d-85f0-4040-9615-8eb13e614105",
"play" : "all",
"play_uuid" : "00163e19-a0e7-c330-a021-000000000002",
"play_pattern" : "all",
"task" : "Configure OVN for oVirt",
"task_uuid" : "00163e19-a0e7-c330-a021-000000000041",
"task_action" : "ansible.builtin.command",
"task_args" : "",
"task_path" : "/usr/share/ovirt-engine/ansible-runner-service-project/project/roles/ovirt-provider-ovn-driver/tasks/configure.yml:43",
"role" : "ovirt-provider-ovn-driver",
"host" : "node.example.org",
"remote_addr" : "node.example.org",
"res" : {
"changed" : true,
"stdout" : "",
"stderr" : "Traceback (most recent call last):\n File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 117, in get_network\n return networks[net_name]\nKeyError: 'node.example.org'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/usr/bin/vdsm-tool\", line 195, in main\n return tool_command[cmd][\"command\"](*args)\n File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 63, in ovn_config\n ip_address = get_ip_addr(get_network(network_caps(), net_name))\n File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 119, in get_network\n raise NetworkNotFoundError(net_name)\nvdsm.tool.ovn_config.NetworkNotFoundError: node.example.org",
"rc" : 1,
"cmd" : [ "vdsm-tool", "ovn-config", "10.0.0.2", "node.example.org" ],
"start" : "2023-04-24 11:21:09.616504",
"end" : "2023-04-24 11:21:11.968404",
"delta" : "0:00:02.351900",
"msg" : "non-zero return code",
"invocation" : {
"module_args" : {
"_raw_params" : "vdsm-tool ovn-config 10.0.0.2 node.example.org\n",
"_uses_shell" : false,
"stdin_add_newline" : true,
"strip_empty_ends" : true,
"argv" : null,
"chdir" : null,
"executable" : null,
"creates" : null,
"removes" : null,
"stdin" : null
}
},
"stdout_lines" : [ ],
"stderr_lines" : [ "Traceback (most recent call last):", " File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 117, in get_network", " return networks[net_name]", "KeyError: 'node.example.org'", "", "During handling of the above exception, another exception occurred:", "", "Traceback (most recent call last):", " File \"/usr/bin/vdsm-tool\", line 195, in main", " return tool_command[cmd][\"command\"](*args)", " File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 63, in ovn_config", " ip_address = get_ip_addr(get_network(network_caps(), net_name))", " File \"/usr/lib/python3.9/site-packages/vdsm/tool/ovn_config.py\", line 119, in get_network", " raise NetworkNotFoundError(net_name)", "vdsm.tool.ovn_config.NetworkNotFoundError: node.example.org" ],
"_ansible_no_log" : null
},
"start" : "2023-04-24T15:21:09.330408",
"end" : "2023-04-24T15:21:11.993114",
"duration" : 2.662706,
"ignore_errors" : null,
"event_loop" : null,
"uuid" : "2b0f6916-ce5d-4f94-b153-972c1d7c5c38"
}
}
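Reading the traceback, `vdsm-tool ovn-config` treats its second argument as a *network name* to look up in the host's network caps, not as a hostname to resolve. A minimal sketch of that failing lookup (simplified, not vdsm's actual code; the caps contents are hypothetical):

```python
# Sketch of the lookup seen in vdsm/tool/ovn_config.py's traceback: the
# tunnel argument ("node.example.org" here) is used as a dictionary key
# into the host's network caps, so any value that is not a configured
# network name raises NetworkNotFoundError.
class NetworkNotFoundError(Exception):
    pass

def get_network(networks, net_name):
    try:
        return networks[net_name]
    except KeyError:
        raise NetworkNotFoundError(net_name) from None

caps = {"ovirtmgmt": {"addr": "10.0.0.5"}}  # hypothetical network caps

try:
    get_network(caps, "node.example.org")
except NetworkNotFoundError as exc:
    print(f"network not found: {exc}")  # prints: network not found: node.example.org
```

This matches the `NetworkNotFoundError: node.example.org` at the end of the log, which suggests the deploy passed a hostname where a tunnel network name (or an IP) was expected.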
Has anyone been experiencing the same issues?
Cheers,
Xavier Lauzon
A disk could not be created because of its size and I cannot delete it now
by cagdasbs@gmail.com
Hi everyone,
A colleague of mine tried to create a 1T preallocated disk on a 1T datastore. oVirt failed to allocate enough space for the disk and threw an error. Now there is a 1T locked disk in the disks list and I cannot delete it, because it is locked. I tried unlock_entity.sh, but it doesn't list anything when I run `unlock_entity.sh -t all -q`.
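One possible explanation for the allocation failure, purely as back-of-the-envelope arithmetic under assumed units: a "1T" disk specified in TiB (binary) is already larger than a "1T" datastore sized in TB (decimal), even before any storage-domain metadata overhead.

```python
# Hypothetical unit check: 1 TiB vs 1 TB, in bytes.
TIB = 1024 ** 4  # a "1T" disk interpreted as 1 TiB
TB = 1000 ** 4   # a "1T" datastore interpreted as 1 TB
shortfall = TIB - TB
print(f"1 TiB exceeds 1 TB by {shortfall} bytes ({shortfall / TB:.1%})")
```

Whether this applies depends on how the sizes were actually specified; it is only meant to show that a nominally equal-sized preallocated disk can still exceed the store's capacity.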
Any suggestions on how do I delete this disk?
hosted-engine to standalone
by carl langlois
Hi,
I plan to transform my hosted-engine setup into a standalone engine.
Currently I have 3 hosts that can run the engine. The engine domain is
located on GlusterFS. I want to simplify this setup by taking 1 of the 3
hosts and setting it up as a standalone engine, and re-installing the other
hosts as standard hypervisors. I also want to remove the GlusterFS storage. I am
on 4.3 for now, but the plan is to upgrade after this simplification. The
steps I plan to do are:
1. global maintenance
2. stop engine
3. backup engine
4. shutdown engine
5. install fresh standalone engine and restore from the backup
6. boot the standalone engine.
7. afterwards, I am not sure what the steps are to clean up the old engine storage domain.
Any suggestion?
Regards,
Carl
Installing Server 2022 VM
by Devin A. Bougie
We've been struggling to install Windows Server 2022 on oVirt. We recently upgraded to the latest oVirt 4.5 on EL9 hosts, but it didn't help.
In the past, we could boot a VM from the install CD, add the mass storage drivers from the virtio driver CD, and proceed from there. However, oVirt 4.3 didn't have a Server 2022 selection.
After upgrading to oVirt 4.5, we couldn't get it to boot from the CD. It just gives Windows boot errors with Q35 + UEFI. If we switch back to i440fx we get the same behavior as in oVirt 4.3: it boots the DVD but doesn't find a disk even after loading the virtio-scsi drivers.
We'd greatly appreciate hearing from anyone who has successfully installed Server 2022 in oVirt.
Many thanks,
Devin
Not able to create a storage domain of POSIX compliant type
by kushagra.gupta@hsc.com
Hi team,
We have installed oVirt 4.4.
We have a self-hosted engine setup in the environment, with 1 hosted engine on top of 1 deployment host.
Goal:
We want to create a POSIX-compliant storage domain for mounting a Ceph-based infrastructure.
We have set up SRV-based resolution in our DNS server, but we are unable to create the storage domain.
Issue:
We are passing the following information:
path: :/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9
vfs-type: ceph
mount options: rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA=
We get the following errors:
====================vdsm.log==================================
2023-04-20 11:26:30,318+0530 INFO (jsonrpc/7) [storage.Mount] mounting :/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8 at /rhev/data-center/mnt/ :_volumes_xyz_conf_2ee9c2d0-873b-4d04-8c46-4c0da02787b8 (mount:207)
2023-04-20 11:26:30,384+0530 ERROR (jsonrpc/7) [storage.HSM] Could not connect to storageServer (hsm:2374)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/storage/hsm.py", line 2371, in connectStorageServer
conObj.connect()
File "/usr/lib/python3.6/site-packages/vdsm/storage/storageServer.py", line 180, in connect
six.reraise(t, v, tb)
File "/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
raise value
File "/usr/lib/python3.6/site-packages/vdsm/storage/storageServer.py", line 171, in connect
self._mount.mount(self.options, self._vfsType, cgroup=self.CGROUP)
File "/usr/lib/python3.6/site-packages/vdsm/storage/mount.py", line 210, in mount
cgroup=cgroup)
File "/usr/lib/python3.6/site-packages/vdsm/common/supervdsm.py", line 56, in __call__
return callMethod()
File "/usr/lib/python3.6/site-packages/vdsm/common/supervdsm.py", line 54, in <lambda>
**kwargs)
File "<string>", line 2, in mount
File "/usr/lib64/python3.6/multiprocessing/managers.py", line 772, in _callmethod
raise convert_to_error(kind, result)
vdsm.storage.mount.MountError: Command ['/usr/bin/mount', '-t', 'ceph', '-o', 'rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==', ' :/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8', '/rhev/data-center/mnt/ :_volumes_xyz_conf_2ee9c2d0-873b-4d04-8c46-4c0da02787b8'] failed with rc=32 out=b'' err=b'mount error 3 = No such process\n'
2023-04-20 11:31:05,715+0530 ERROR (jsonrpc/3) [storage.Dispatcher] FINISH connectStorageServer error=:/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8 is not a valid hosttail address: (dispatcher:87)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/common/network/address.py", line 42, in hosttail_split
raise ValueError
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/storage/dispatcher.py", line 74, in wrapper
result = ctask.prepare(func, *args, **kwargs)
File "/usr/lib/python3.6/site-packages/vdsm/storage/task.py", line 110, in wrapper
return m(self, *a, **kw)
File "/usr/lib/python3.6/site-packages/vdsm/storage/task.py", line 1190, in prepare
raise self.error
File "/usr/lib/python3.6/site-packages/vdsm/storage/task.py", line 884, in _run
return fn(*args, **kargs)
File "<decorator-gen-117>", line 2, in connectStorageServer
File "/usr/lib/python3.6/site-packages/vdsm/common/api.py", line 50, in method
ret = func(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/vdsm/storage/hsm.py", line 2368, in connectStorageServer
conObj = storageServer.ConnectionFactory.createConnection(conInfo)
File "/usr/lib/python3.6/site-packages/vdsm/storage/storageServer.py", line 741, in createConnection
return ctor(**params)
File "/usr/lib/python3.6/site-packages/vdsm/storage/storageServer.py", line 154, in __init__
self._remotePath = fileUtils.normalize_path(spec)
File "/usr/lib/python3.6/site-packages/vdsm/storage/fileUtils.py", line 98, in normalize_path
host, tail = address.hosttail_split(path)
File "/usr/lib/python3.6/site-packages/vdsm/common/network/address.py", line 45, in hosttail_split
raise HosttailError('%s is not a valid hosttail address:' % hosttail)
vdsm.common.network.address.HosttailError: :/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8 is not a valid hosttail address:
2023-04-20 11:31:05,715+0530 INFO (jsonrpc/3) [jsonrpc.JsonRpcServer] RPC call StoragePool.connectStorageServer failed (error 451) in 0.00 seconds (__init__:312)
====================engine.log================================
2023-04-20 11:31:03,818+05 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] HostName = deployment-host
2023-04-20 11:31:03,818+05 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] Command 'ConnectStorageServerVDSCommand(HostName = deployment-host, StorageServerConnectionManagementVDSParameters:{hostId='745b7584-0a43-47d9-985f-af0a0155e787', storagePoolId='00000000-0000-0000-0000-000000000000', storageType='POSIXFS', connectionList='[StorageServerConnections:{id='null', connection=':/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8', iqn='null', vfsType='ceph', mountOptions='rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}]', sendNetworkEventOnFailure='true'})' execution failed: VDSGenericException: VDSErrorException: Failed to ConnectStorageServerVDS, error = Error storage server connection: ("domType=6, spUUID=00000000-0000-0000-0000-000000000000, conList=[{'password': '********', 'vfs_type'
: 'ceph', 'port': '', 'mnt_options': 'rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==', 'iqn': '', 'connection': ':/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8', 'ipv6_enabled': 'false', 'id': '00000000-0000-0000-0000-000000000000', 'user': '', 'tpgt': '1'}]",), code = 451
2023-04-20 11:31:03,818+05 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] FINISH, ConnectStorageServerVDSCommand, return: , log id: 19c06f3f
2023-04-20 11:31:03,819+05 ERROR [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] Command 'org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand' failed: EngineException: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSErrorException: VDSGenericException: VDSErrorException: Failed to ConnectStorageServerVDS, error = Error storage server connection: ("domType=6, spUUID=00000000-0000-0000-0000-000000000000, conList=[{'password': '********', 'vfs_type': 'ceph', 'port': '', 'mnt_options': 'rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==', 'iqn': '', 'connection': ':/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8', 'ipv6_enabled': 'false', 'id': '00000000-0000-0000-0000-000000000000', 'user': '', 'tpgt': '1'}]",), code = 451 (Failed with error StorageServerConnectionError and code 451)
2023-04-20 11:31:03,824+05 ERROR [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] Transaction rolled-back for command 'org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand'.
2023-04-20 11:31:03,847+05 INFO [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-29) [6d7913e2-83cf-450d-8746-40f1582d959d] Lock freed to object 'EngineLock:{exclusiveLocks='[:/volumes/xyz/conf/2ee9c2d0-873b-4d04-8c46-4c0da02787b8=STORAGE_CONNECTION]', sharedLocks=''}'
2023-04-20 11:34:27,788+05 INFO [org.ovirt.engine.core.bll.provider.network.SyncNetworkProviderCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-28) [63fbb9de] Lock Acquired to object 'EngineLock:{exclusiveLocks='[c6e0aa38-20ff-4be3-9b34-81a7b0fabb6a=PROVIDER]', sharedLocks=''}'
2023-04-20 11:34:27,809+05 INFO [org.ovirt.engine.core.bll.provider.network.SyncNetworkProviderCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-28) [63fbb9de] Running command: SyncNetworkProviderCommand internal: true.
2023-04-20 11:34:28,017+05 INFO [org.ovirt.engine.core.sso.service.AuthenticationService] (default task-29) [] User admin@internal-authz with profile [internal] successfully logged in with scopes: ovirt-app-api ovirt-ext=token-info:authz-search ovirt-ext=token-info:public-authz-search ovirt-ext=token-info:validate ovirt-ext=token:password-access
===============================================================================
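The second traceback in vdsm.log shows the path is rejected before any mount is even attempted: the part before the first ':' must be a non-empty host, and our path starts with ':'. A rough sketch of that validation (simplified, not vdsm's real parser):

```python
# Simplified sketch of the host:tail parsing behind the
# "is not a valid hosttail address" error: split on the first ':' and
# require a non-empty host part before it.
def hosttail_split(hosttail):
    host, sep, tail = hosttail.partition(":")
    if not sep or not host or not tail:
        raise ValueError(f"{hosttail} is not a valid hosttail address")
    return host, tail

print(hosttail_split("ceph-node2:6789:/volumes/xyz"))  # host-qualified: parses
try:
    hosttail_split(":/volumes/xyz")  # leading ':' means an empty host part
except ValueError as exc:
    print(exc)
```

So a host-less path of the form `:/volumes/...`, which plain `mount -t ceph` accepts (falling back to mon addresses from ceph.conf or DNS), cannot get past this check in the engine/vdsm path.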
We tried creating an NFS-type storage domain using the hostname of the NFS server. The hostname is resolved by the DNS server, and we are able to create that storage domain.
We also tried creating the POSIX-compliant domain using the hostname of the ceph-mon node, but it is failing.
We are passing the following information:
path: ceph-node2:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9
vfs-type: ceph
mounting option:rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA=
In this scenario the mount does happen, but the storage domain is not created correctly.
We get the following errors:
====================vdsm.log==================================
2023-04-20 12:23:54,759+0530 INFO (jsonrpc/0) [api.virt] FINISH getStats return={'status': {'code': 0, 'message': 'Done'}, 'statsList': [{'statusTime': '3009297867', 'status': 'Up', 'vmId': '9ea50595-33cc-4fde-9cd2-ac35c4e97a2c', 'vmName': 'HostedEngine', 'vmType': 'kvm', 'kvmEnable': 'true', 'acpiEnable': 'true', 'elapsedTime': '71325', 'monitorResponse': '0', 'clientIp': '', 'timeOffset': '0', 'pauseCode': 'NOERR', 'cpuUser': '7.12', 'cpuSys': '1.80', 'cpuUsage': '1264730000000', 'network': {'vnet1': {'macAddr': '00:16:3e:54:7f:23', 'name': 'vnet1', 'speed': '1000', 'state': 'unknown', 'rxErrors': '0', 'rxDropped': '0', 'txErrors': '0', 'txDropped': '0', 'rx': '94068359', 'tx': '107873946', 'sampleTime': 861814.22111595}}, 'disks': {'sdc': {'truesize': '0', 'apparentsize': '0', 'readLatency': '0', 'writeLatency': '0', 'flushLatency': '0', 'writtenBytes': '0', 'writeOps': '0', 'readOps': '22', 'readBytes': '406', 'readRate': '0.0', 'writeRate': '0.0'}, 'vda': {'truesize': '76436357
12', 'apparentsize': '53689188352', 'readLatency': '0', 'writeLatency': '18963692.40625', 'flushLatency': '132326.26666666666', 'writtenBytes': '1946506752', 'writeOps': '165176', 'readOps': '31194', 'readBytes': '859866624', 'readRate': '0.0', 'writeRate': '14743.1258832571', 'imageID': 'bcd042b1-0978-43bd-bf24-eb1f554cd520'}}, 'balloonInfo': {'balloon_max': '13489152', 'balloon_min': '13489152', 'balloon_cur': '13489152', 'balloon_target': '13489152', 'ballooning_enabled': True}, 'vcpuCount': '4', 'memoryStats': {'mem_total': '12887304', 'mem_unused': '7100564', 'mem_free': '7994896', 'swap_in': 0, 'swap_out': 0, 'majflt': 0, 'minflt': 196, 'pageflt': 196}, 'displayInfo': [{'type': 'vnc', 'port': '5900', 'tlsPort': '', 'ipAddress': '10.0.1.47'}, {'type': 'spice', 'port': '5901', 'tlsPort': '5902', 'ipAddress': '10.0.1.47'}], 'hash': '-7465763413854713032', 'vmJobs': {}, 'vcpuQuota': '-1', 'vcpuPeriod': 100000, 'username': 'root', 'session': 'Unknown', 'memUsage': '40', 'guestCPUCo
unt': -1, 'appsList': ('kernel-4.18.0-486.el8.x86_64', 'qemu-guest-agent-6.2.0'), 'guestIPs': '', 'guestFQDN': 'manager-hosted-engine.com', 'netIfaces': [{'hw': '00:00:00:00:00:00', 'inet': ['127.0.0.1'], 'inet6': ['::1'], 'name': 'lo'}, {'hw': '00:16:3e:54:7f:23', 'inet': ['10.0.1.48'], 'inet6': ['fe80::216:3eff:fe54:7f23'], 'name': 'eth0'}], 'disksUsage': [{'path': '/', 'total': '7505707008', 'used': '4872785920', 'fs': 'xfs'}, {'path': '/var', 'total': '21464350720', 'used': '1688195072', 'fs': 'xfs'}, {'path': '/home', 'total': '1063256064', 'used': '41271296', 'fs': 'xfs'}, {'path': '/tmp', 'total': '2136997888', 'used': '48943104', 'fs': 'xfs'}, {'path': '/var/log', 'total': '10726932480', 'used': '132661248', 'fs': 'xfs'}, {'path': '/var/log/audit', 'total': '1063256064', 'used': '42233856', 'fs': 'xfs'}, {'path': '/boot', 'total': '1063256064', 'used': '352448512', 'fs': 'xfs'}], 'guestName': 'manager-hosted-engine.com', 'guestOs': '4.18.0-486.el8.x86_64', 'guestOsInfo': {'t
ype': 'linux', 'arch': 'x86_64', 'kernel': '4.18.0-486.el8.x86_64', 'distribution': 'CentOS Stream', 'version': '8', 'codename': ''}, 'guestTimezone': {'offset': 330, 'zone': 'IST'}}]} from=::1,57200, vmId=9ea50595-33cc-4fde-9cd2-ac35c4e97a2c (api:54)
2023-04-20 12:23:55,230+0530 INFO (jsonrpc/3) [vdsm.api] START repoStats(domains=['963b7fd2-c32f-400d-9abf-ade3d702cb4b']) from=::1,57200, task_id=0279b915-ac22-46d2-afbc-8466144d22d8 (api:48)
2023-04-20 12:23:55,231+0530 INFO (jsonrpc/3) [vdsm.api] FINISH repoStats return={'963b7fd2-c32f-400d-9abf-ade3d702cb4b': {'code': 0, 'lastCheck': '5.1', 'delay': '0.000864457', 'valid': True, 'version': 5, 'acquired': True, 'actual': True}} from=::1,57200, task_id=0279b915-ac22-46d2-afbc-8466144d22d8 (api:54)
2023-04-20 12:23:55,735+0530 ERROR (monitor/6ac48b4) [storage.Monitor] Error checking domain 6ac48b40-a9dc-4475-b17e-247b6018abc1 (monitor:453)
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/vdsm/storage/monitor.py", line 434, in _checkDomainStatus
self.domain.selftest()
File "/usr/lib/python3.6/site-packages/vdsm/storage/sdc.py", line 48, in __getattr__
return getattr(self.getRealDomain(), attrName)
File "/usr/lib/python3.6/site-packages/vdsm/storage/sdc.py", line 51, in getRealDomain
return self._cache._realProduce(self._sdUUID)
File "/usr/lib/python3.6/site-packages/vdsm/storage/sdc.py", line 139, in _realProduce
domain = self._findDomain(sdUUID)
File "/usr/lib/python3.6/site-packages/vdsm/storage/sdc.py", line 156, in _findDomain
return findMethod(sdUUID)
File "/usr/lib/python3.6/site-packages/vdsm/storage/nfsSD.py", line 146, in findDomain
return NfsStorageDomain(NfsStorageDomain.findDomainPath(sdUUID))
File "/usr/lib/python3.6/site-packages/vdsm/storage/nfsSD.py", line 136, in findDomainPath
raise se.StorageDomainDoesNotExist(sdUUID)
vdsm.storage.exception.StorageDomainDoesNotExist: Storage domain does not exist: ('6ac48b40-a9dc-4475-b17e-247b6018abc1',)
2023-04-20 12:23:56,307+0530 ERROR (check/loop) [storage.Monitor] Error checking path /rhev/data-center/mnt/[abcd:abcd:abcd::51]:6789:_volumes_xyz_conf_00593e1d-b674-4b00-a289-20bec06761c9/6ac48b40-a9dc-4475-b17e-247b6018abc1/dom_md/metadata (monitor:511)
====================engine.log================================
2023-04-20 12:23:55,064+05 INFO [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] Lock Acquired to object 'EngineLock:{exclusiveLocks='[ceph-node2.myhsc.com:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9=STORAGE_CONNECTION]', sharedLocks=''}'
2023-04-20 12:23:55,118+05 INFO [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] Running command: AddStorageServerConnectionCommand internal: false. Entities affected : ID: aaa00000-0000-0000-0000-123456789aaa Type: SystemAction group CREATE_STORAGE_DOMAIN with role type ADMIN
2023-04-20 12:23:55,119+05 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] START, ConnectStorageServerVDSCommand(HostName = deployment-host, StorageServerConnectionManagementVDSParameters:{hostId='745b7584-0a43-47d9-985f-af0a0155e787', storagePoolId='00000000-0000-0000-0000-000000000000', storageType='POSIXFS', connectionList='[StorageServerConnections:{id='null', connection='ceph-node2.myhsc.com:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9', iqn='null', vfsType='ceph', mountOptions='rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}]', sendNetworkEventOnFailure='true'}), log id: 7e34045e
2023-04-20 12:23:55,223+05 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] FINISH, ConnectStorageServerVDSCommand, return: {00000000-0000-0000-0000-000000000000=100}, log id: 7e34045e
2023-04-20 12:23:55,248+05 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] EVENT_ID: STORAGE_DOMAIN_ERROR(996), The error message for connection ceph-node2.myhsc.com:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9 returned by VDSM was: General Exception
2023-04-20 12:23:55,248+05 ERROR [org.ovirt.engine.core.bll.storage.connection.FileStorageHelper] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] The connection with details 'ceph-node2.myhsc.com:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9' failed because of error code '100' and error message is: general exception
2023-04-20 12:23:55,249+05 ERROR [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] Command 'org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand' failed: EngineException: GeneralException (Failed with error GeneralException and code 100)
2023-04-20 12:23:55,255+05 ERROR [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] Transaction rolled-back for command 'org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand'.
2023-04-20 12:23:55,278+05 INFO [org.ovirt.engine.core.bll.storage.connection.AddStorageServerConnectionCommand] (default task-37) [31e859ef-97d0-435c-81ba-9d29bf637527] Lock freed to object 'EngineLock:{exclusiveLocks='[ceph-node2.myhsc.com:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9=STORAGE_CONNECTION]', sharedLocks=''}'
This also doesn't work.
We tried mounting the ceph-mon node's cluster manually from the deployment host CLI, and we were able to mount it:
sudo mount -t ceph :/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9 /rhev/data-center/mnt/:_volumes_xyz_conf_00593e1d-b674-4b00-a289-20bec06761c9 -o rw,name=foo,secret=AQABDzRkTaJCEhAAC7rC6E68ofwULnx6qX/VDA==
[root@deployment-host mnt]# df -kh
df: /run/user/0/gvfs: Transport endpoint is not connected
Filesystem Size Used Avail Use% Mounted on
[abcd:abcd:abcd::51]:6789,[abcd:abcd:abcd::52]:6789,[abcd:abcd:abcd::53]:6789:/volumes/xyz/conf/00593e1d-b674-4b00-a289-20bec06761c9 19G 0 19G 0% /rhev/data-center/mnt/:_volumes_xyz_conf_00593e1d-b674-4b00-a289-20bec06761c9
Queries:
1. Could you help us create a storage domain for this from the UI?
2. Is there a process for doing the same from the CLI?
Thanks and Regards
Kushagra Gupta
oVirt installation, deployment failed
by John Bnet
Hi,
I'm new to oVirt. I'm trying to get it up and running, but I'm stuck at the deployment point.
Does anyone have any idea of what is going on?
This is what I did to install it on CentOS 8:
-->
hostnamectl set-hostname node1.local.net
vi /etc/hosts
dnf install -y centos-release-ovirt45
dnf module enable -y javapackages-tools pki-deps postgresql:12 389-ds mod_auth_openidc
dnf update
dnf install ovirt-engine
engine-setup
dnf install cockpit-ovirt-dashboard vdsm-gluster ovirt-host
systemctl enable cockpit.socket
systemctl start cockpit.socket
firewall-cmd --list-services
firewall-cmd --permanent --add-service=cockpit
hosted-engine --deploy
and here is part of the log output:
2023-04-19 09:32:36,628-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,628-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/internalPackageTransaction=Transaction:'[DNF Transaction]'
2023-04-19 09:32:36,628-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/mainTransaction=Transaction:'[DNF Transaction]'
2023-04-19 09:32:36,629-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,629-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.packagers.yumpackager.Plugin._setup
2023-04-19 09:32:36,629-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.otopi.packagers.yumpackager.Plugin._setup condition False
2023-04-19 09:32:36,630-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.gr_he_common.engine.fqdn.Plugin._setup
2023-04-19 09:32:36,631-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,631-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/dig=NoneType:'None'
2023-04-19 09:32:36,631-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ip=NoneType:'None'
2023-04-19 09:32:36,631-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,632-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.gr_he_common.network.bridge.Plugin._setup
2023-04-19 09:32:36,633-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.gr_he_common.network.gateway.Plugin._setup
2023-04-19 09:32:36,634-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.gr_he_common.network.network_check.Plugin._setup
2023-04-19 09:32:36,634-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,634-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/nc=NoneType:'None'
2023-04-19 09:32:36,634-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ping=NoneType:'None'
2023-04-19 09:32:36,635-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,635-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.gr_he_common.vm.cloud_init.Plugin._setup
2023-04-19 09:32:36,636-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,636-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ssh-keygen=NoneType:'None'
2023-04-19 09:32:36,636-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,637-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.network.firewalld.Plugin._setup
2023-04-19 09:32:36,637-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,637-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/firewall-cmd=NoneType:'None'
2023-04-19 09:32:36,637-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/python3=NoneType:'None'
2023-04-19 09:32:36,638-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,638-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.network.hostname.Plugin._setup
2023-04-19 09:32:36,639-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.services.openrc.Plugin._setup
2023-04-19 09:32:36,639-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,639-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/rc=NoneType:'None'
2023-04-19 09:32:36,639-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/rc-update=NoneType:'None'
2023-04-19 09:32:36,640-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,640-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.services.rhel.Plugin._setup
2023-04-19 09:32:36,641-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,641-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chkconfig=NoneType:'None'
2023-04-19 09:32:36,641-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/initctl=NoneType:'None'
2023-04-19 09:32:36,641-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/service=NoneType:'None'
2023-04-19 09:32:36,641-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/systemctl=NoneType:'None'
2023-04-19 09:32:36,642-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,642-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.services.systemd.Plugin._setup
2023-04-19 09:32:36,643-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.system.clock.Plugin._setup
2023-04-19 09:32:36,643-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,643-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chronyc=NoneType:'None'
2023-04-19 09:32:36,643-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/date=NoneType:'None'
2023-04-19 09:32:36,644-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/hwclock=NoneType:'None'
2023-04-19 09:32:36,644-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ntpq=NoneType:'None'
2023-04-19 09:32:36,644-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,645-0400 DEBUG otopi.context context._executeMethod:127 Stage setup METHOD otopi.plugins.otopi.system.reboot.Plugin._setup
2023-04-19 09:32:36,645-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:36,645-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/reboot=NoneType:'None'
2023-04-19 09:32:36,646-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:36,646-0400 INFO otopi.context context.runSequence:616 Stage: Environment packages setup
2023-04-19 09:32:36,646-0400 DEBUG otopi.context context.runSequence:620 STAGE internal_packages
2023-04-19 09:32:36,646-0400 DEBUG otopi.context context._executeMethod:127 Stage internal_packages METHOD otopi.plugins.otopi.core.transaction.Plugin._pre_prepare
2023-04-19 09:32:36,647-0400 DEBUG otopi.transaction transaction._prepare:61 preparing 'DNF Transaction'
2023-04-19 09:32:36,647-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Creating transaction
2023-04-19 09:32:38,583-0400 DEBUG otopi.context context._executeMethod:127 Stage internal_packages METHOD otopi.plugins.gr_he_common.vm.boot_disk.Plugin._internal_packages
2023-04-19 09:32:38,584-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.gr_he_common.vm.boot_disk.Plugin._internal_packages condition False
2023-04-19 09:32:38,585-0400 DEBUG otopi.context context._executeMethod:127 Stage internal_packages METHOD otopi.plugins.otopi.packagers.dnfpackager.Plugin._internal_packages_end
2023-04-19 09:32:38,585-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Building transaction
2023-04-19 09:32:38,870-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Transaction built
2023-04-19 09:32:38,871-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Empty transaction
2023-04-19 09:32:38,872-0400 DEBUG otopi.context context._executeMethod:127 Stage internal_packages METHOD otopi.plugins.otopi.packagers.yumpackager.Plugin._internal_packages_end
2023-04-19 09:32:38,872-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.otopi.packagers.yumpackager.Plugin._internal_packages_end condition False
2023-04-19 09:32:38,873-0400 DEBUG otopi.context context._executeMethod:127 Stage internal_packages METHOD otopi.plugins.otopi.core.transaction.Plugin._pre_end
2023-04-19 09:32:38,873-0400 DEBUG otopi.transaction transaction.commit:152 committing 'DNF Transaction'
2023-04-19 09:32:38,873-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Closing transaction with commit
2023-04-19 09:32:38,873-0400 DEBUG otopi.plugins.otopi.packagers.dnfpackager dnfpackager.verbose:75 DNF Calling _plugins._unload
2023-04-19 09:32:38,927-0400 INFO otopi.context context.runSequence:616 Stage: Programs detection
2023-04-19 09:32:38,927-0400 DEBUG otopi.context context.runSequence:620 STAGE programs
2023-04-19 09:32:38,927-0400 DEBUG otopi.context context._executeMethod:127 Stage programs METHOD otopi.plugins.gr_he_common.network.bridge.Plugin._check_NM
2023-04-19 09:32:38,928-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.gr_he_common.network.bridge.Plugin._check_NM condition False
2023-04-19 09:32:38,929-0400 DEBUG otopi.context context._executeMethod:127 Stage programs METHOD otopi.plugins.otopi.system.command.Plugin._programs
2023-04-19 09:32:38,930-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:38,930-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chkconfig=str:'/usr/sbin/chkconfig'
2023-04-19 09:32:38,930-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chronyc=str:'/usr/bin/chronyc'
2023-04-19 09:32:38,930-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/date=str:'/usr/bin/date'
2023-04-19 09:32:38,930-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/dig=str:'/usr/bin/dig'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/firewall-cmd=str:'/usr/bin/firewall-cmd'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/hwclock=str:'/usr/sbin/hwclock'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ip=str:'/usr/sbin/ip'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/nc=str:'/usr/bin/nc'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ping=str:'/usr/sbin/ping'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/python3=str:'/usr/bin/python3'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/reboot=str:'/usr/sbin/reboot'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/service=str:'/usr/sbin/service'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ssh-keygen=str:'/usr/bin/ssh-keygen'
2023-04-19 09:32:38,931-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/systemctl=str:'/usr/bin/systemctl'
2023-04-19 09:32:38,932-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:38,932-0400 DEBUG otopi.context context._executeMethod:127 Stage programs METHOD otopi.plugins.otopi.services.openrc.Plugin._programs
2023-04-19 09:32:38,933-0400 DEBUG otopi.context context._executeMethod:127 Stage programs METHOD otopi.plugins.otopi.services.rhel.Plugin._programs
2023-04-19 09:32:38,934-0400 DEBUG otopi.plugins.otopi.services.rhel plugin.executeRaw:813 execute: ('/usr/bin/systemctl', 'show-environment'), executable='None', cwd='None', env=None
2023-04-19 09:32:38,943-0400 DEBUG otopi.plugins.otopi.services.rhel plugin.executeRaw:863 execute-result: ('/usr/bin/systemctl', 'show-environment'), rc=0
2023-04-19 09:32:38,944-0400 DEBUG otopi.plugins.otopi.services.rhel plugin.execute:921 execute-output: ('/usr/bin/systemctl', 'show-environment') stdout:
LANG=en_US.UTF-8
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin
2023-04-19 09:32:38,944-0400 DEBUG otopi.plugins.otopi.services.rhel plugin.execute:926 execute-output: ('/usr/bin/systemctl', 'show-environment') stderr:
2023-04-19 09:32:38,945-0400 DEBUG otopi.context context._executeMethod:127 Stage programs METHOD otopi.plugins.otopi.services.systemd.Plugin._programs
2023-04-19 09:32:38,945-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.executeRaw:813 execute: ('/usr/bin/systemctl', 'show-environment'), executable='None', cwd='None', env=None
2023-04-19 09:32:38,954-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.executeRaw:863 execute-result: ('/usr/bin/systemctl', 'show-environment'), rc=0
2023-04-19 09:32:38,954-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.execute:921 execute-output: ('/usr/bin/systemctl', 'show-environment') stdout:
LANG=en_US.UTF-8
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin
2023-04-19 09:32:38,954-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.execute:926 execute-output: ('/usr/bin/systemctl', 'show-environment') stderr:
2023-04-19 09:32:38,954-0400 DEBUG otopi.plugins.otopi.services.systemd systemd._programs:49 registering systemd provider
2023-04-19 09:32:38,955-0400 INFO otopi.context context.runSequence:616 Stage: Environment setup (late)
2023-04-19 09:32:38,955-0400 DEBUG otopi.context context.runSequence:620 STAGE late_setup
2023-04-19 09:32:38,956-0400 DEBUG otopi.context context._executeMethod:127 Stage late_setup METHOD otopi.plugins.gr_he_common.vm.boot_disk.Plugin._late_setup
2023-04-19 09:32:38,956-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.gr_he_common.vm.boot_disk.Plugin._late_setup condition False
2023-04-19 09:32:38,957-0400 INFO otopi.context context.runSequence:616 Stage: Environment customization
2023-04-19 09:32:38,957-0400 DEBUG otopi.context context.runSequence:620 STAGE customization
2023-04-19 09:32:38,958-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.otopi.network.firewalld.Plugin._customization
2023-04-19 09:32:38,958-0400 DEBUG otopi.plugins.otopi.services.systemd systemd.exists:85 check if service firewalld exists
2023-04-19 09:32:38,958-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.executeRaw:813 execute: ('/usr/bin/systemctl', 'show', '-p', 'LoadState', 'firewalld.service'), executable='None', cwd='None', env=None
2023-04-19 09:32:38,968-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.executeRaw:863 execute-result: ('/usr/bin/systemctl', 'show', '-p', 'LoadState', 'firewalld.service'), rc=0
2023-04-19 09:32:38,968-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.execute:921 execute-output: ('/usr/bin/systemctl', 'show', '-p', 'LoadState', 'firewalld.service') stdout:
LoadState=loaded
2023-04-19 09:32:38,968-0400 DEBUG otopi.plugins.otopi.services.systemd plugin.execute:926 execute-output: ('/usr/bin/systemctl', 'show', '-p', 'LoadState', 'firewalld.service') stderr:
2023-04-19 09:32:38,969-0400 DEBUG otopi.plugins.otopi.network.firewalld firewalld._get_firewalld_cmd_version:116 firewalld version: 0.9.3
2023-04-19 09:32:38,970-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:38,970-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/firewalldAvailable=bool:'True'
2023-04-19 09:32:38,970-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:38,971-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.otopi.core.config.Plugin._customize1
2023-04-19 09:32:38,972-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.core.titles.Plugin._storage_start
2023-04-19 09:32:38,972-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND
2023-04-19 09:32:38,973-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND --== STORAGE CONFIGURATION ==--
2023-04-19 09:32:38,973-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND
2023-04-19 09:32:38,974-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.otopi.dialog.cli.Plugin._customize
2023-04-19 09:32:38,974-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.otopi.dialog.cli.Plugin._customize condition False
2023-04-19 09:32:38,975-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.core.titles.Plugin._storage_end
2023-04-19 09:32:38,976-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.core.titles.Plugin._network_start
2023-04-19 09:32:38,976-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND
2023-04-19 09:32:38,976-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND --== HOST NETWORK CONFIGURATION ==--
2023-04-19 09:32:38,977-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND
2023-04-19 09:32:38,978-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.network.bridge.Plugin._detect_bridges
2023-04-19 09:32:38,979-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.network.gateway.Plugin._customization
2023-04-19 09:32:38,979-0400 DEBUG otopi.plugins.otopi.dialog.human human.queryString:174 query OVEHOSTED_GATEWAY
2023-04-19 09:32:38,979-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND Please indicate the gateway IP address [10.3.3.1]:
2023-04-19 09:32:40,455-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:40,455-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/gateway=str:'10.3.3.1'
2023-04-19 09:32:40,455-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV QUESTION/1/OVEHOSTED_GATEWAY=str:'10.3.3.1'
2023-04-19 09:32:40,456-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:40,456-0400 DEBUG otopi.context context._executeMethod:127 Stage customization METHOD otopi.plugins.gr_he_common.network.bridge.Plugin._customization
2023-04-19 09:32:40,456-0400 INFO otopi.plugins.gr_he_common.network.bridge bridge._customization:143 Checking available network interfaces:
2023-04-19 09:32:40,457-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:198 ansible-playbook: cmd: ['/bin/ansible-playbook', '--module-path=/usr/share/ovirt-hosted-engine-setup/he_ansible', '--inventory=localhost,', '--extra-vars=@/tmp/tmppje659zg', '--tags=get_network_interfaces', '--skip-tags=always', '/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml']
2023-04-19 09:32:40,457-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:199 ansible-playbook: out_path: /tmp/tmp4gk4pm6s
2023-04-19 09:32:40,457-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:200 ansible-playbook: vars_path: /tmp/tmppje659zg
2023-04-19 09:32:40,457-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:201 ansible-playbook: env: {'LS_COLORS': 'rs=0:di=38;5;33:ln=38;5;51:mh=00:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=01;05;37;41:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;40:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.zst=38;5;9:*.tzst=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.wim=38;5;9:*.swm=38;5;9:*.dwm=38;5;9:*.esd=38;
5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.mjpg=38;5;13:*.mjpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.m4a=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.oga=38;5;45:*.opus=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:', 'SSH_CONNECTION': '10.3.3.95 33474 10.3.3.55 22', 'LANG': 'en_CA.UTF-8',
'HISTCONTROL': 'ignoredups', 'GUESTFISH_RESTORE': '\\e[0m', 'HOSTNAME': 'node1.local.net', 'GUESTFISH_INIT': '\\e[1;34m', 'S_COLORS': 'auto', 'which_declare': 'declare -f', 'XDG_SESSION_ID': '4', 'USER': 'root', 'GUESTFISH_PS1': '\\[\\e[1;32m\\]><fs>\\[\\e[0;31m\\] ', 'SELINUX_ROLE_REQUESTED': '', 'PWD': '/root', 'SSH_ASKPASS': '/usr/libexec/openssh/gnome-ssh-askpass', 'HOME': '/root', 'SSH_CLIENT': '10.3.3.95 33474 22', 'SELINUX_LEVEL_REQUESTED': '', 'XDG_DATA_DIRS': '/root/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'SSH_TTY': '/dev/pts/0', 'MAIL': '/var/spool/mail/root', 'SHELL': '/bin/bash', 'TERM': 'xterm-256color', 'SELINUX_USE_CURRENT_RANGE': '', 'SHLVL': '1', 'PYTHONPATH': '/usr/share/ovirt-hosted-engine-setup/scripts/..:', 'LOGNAME': 'root', 'DBUS_SESSION_BUS_ADDRESS': 'unix:path=/run/user/0/bus', 'XDG_RUNTIME_DIR': '/run/user/0', 'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin', 'GUESTFISH_OUTPUT': '
\\e[0m', 'DEBUGINFOD_URLS': 'https://debuginfod.centos.org/ ', 'HISTSIZE': '1000', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 'BASH_FUNC_which%%': '() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}', 'OTOPI_EXECDIR': '/root', 'OTOPI_LOGFILE': '/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230419093227-vu7nay.log', 'OTOPI_CALLBACK_OF': '/tmp/tmp4gk4pm6s', 'ANSIBLE_CALLBACKS_ENABLED': '1_otopi_json,2_ovirt_logger', 'ANSIBLE_STDOUT_CALLBACK': '1_otopi_json', 'HE_ANSIBLE_LOG_PATH': '/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-ansible-get_network_interfaces-20230419093240-wm9s2d.log'}
2023-04-19 09:32:41,363-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:226 ansible-playbook rc: 250
2023-04-19 09:32:41,363-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:233 ansible-playbook stdout:
2023-04-19 09:32:41,363-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:235 b'to see the full traceback, use -vvv\n'
2023-04-19 09:32:41,364-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:236 ansible-playbook stderr:
2023-04-19 09:32:41,364-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'[WARNING]: Skipping plugin (/usr/share/ovirt-hosted-engine-\n'
2023-04-19 09:32:41,364-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'setup/he_ansible/callback_plugins/2_ovirt_logger.py), cannot load: cannot\n'
2023-04-19 09:32:41,364-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b"import name 'Callable' from 'collections'\n"
2023-04-19 09:32:41,364-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'(/usr/lib64/python3.11/collections/__init__.py)\n'
2023-04-19 09:32:41,365-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b"ERROR! Unexpected Exception, this is probably a bug: cannot import name 'Callable' from 'collections' (/usr/lib64/python3.11/collections/__init__.py)\n"
2023-04-19 09:32:41,365-0400 DEBUG otopi.context context._executeMethod:145 method exception
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/otopi/context.py", line 132, in _executeMethod
method['method']()
File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-common/network/bridge.py", line 152, in _customization
r = ah.run()
File "/usr/lib/python3.6/site-packages/ovirt_hosted_engine_setup/ansible_utils.py", line 240, in run
raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2023-04-19 09:32:41,367-0400 ERROR otopi.context context._executeMethod:154 Failed to execute stage 'Environment customization': Failed executing ansible-playbook
2023-04-19 09:32:41,367-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:41,367-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/error=bool:'True'
2023-04-19 09:32:41,367-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exceptionInfo=list:'[(<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7fdef2a2b908>)]'
2023-04-19 09:32:41,368-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:41,368-0400 INFO otopi.context context.runSequence:616 Stage: Clean up
2023-04-19 09:32:41,368-0400 DEBUG otopi.context context.runSequence:620 STAGE cleanup
2023-04-19 09:32:41,369-0400 DEBUG otopi.context context._executeMethod:127 Stage cleanup METHOD otopi.plugins.gr_he_ansiblesetup.core.misc.Plugin._cleanup
2023-04-19 09:32:41,369-0400 INFO otopi.plugins.gr_he_ansiblesetup.core.misc misc._cleanup:546 Cleaning temporary resources
2023-04-19 09:32:41,370-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:198 ansible-playbook: cmd: ['/bin/ansible-playbook', '--module-path=/usr/share/ovirt-hosted-engine-setup/he_ansible', '--inventory=localhost,', '--extra-vars=@/tmp/tmptom_o0os', '--tags=final_clean', '--skip-tags=always', '/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml']
2023-04-19 09:32:41,370-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:199 ansible-playbook: out_path: /tmp/tmpjo8dbijt
2023-04-19 09:32:41,370-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:200 ansible-playbook: vars_path: /tmp/tmptom_o0os
2023-04-19 09:32:41,370-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:201 ansible-playbook: env: {'LS_COLORS': 'rs=0:di=38;5;33:ln=38;5;51:mh=00:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=01;05;37;41:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;40:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.zst=38;5;9:*.tzst=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.wim=38;5;9:*.swm=38;5;9:*.dwm=38;5;9:*.esd=38;
5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.mjpg=38;5;13:*.mjpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.m4a=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.oga=38;5;45:*.opus=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:', 'SSH_CONNECTION': '10.3.3.95 33474 10.3.3.55 22', 'LANG': 'en_CA.UTF-8',
'HISTCONTROL': 'ignoredups', 'GUESTFISH_RESTORE': '\\e[0m', 'HOSTNAME': 'node1.local.net', 'GUESTFISH_INIT': '\\e[1;34m', 'S_COLORS': 'auto', 'which_declare': 'declare -f', 'XDG_SESSION_ID': '4', 'USER': 'root', 'GUESTFISH_PS1': '\\[\\e[1;32m\\]><fs>\\[\\e[0;31m\\] ', 'SELINUX_ROLE_REQUESTED': '', 'PWD': '/root', 'SSH_ASKPASS': '/usr/libexec/openssh/gnome-ssh-askpass', 'HOME': '/root', 'SSH_CLIENT': '10.3.3.95 33474 22', 'SELINUX_LEVEL_REQUESTED': '', 'XDG_DATA_DIRS': '/root/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'SSH_TTY': '/dev/pts/0', 'MAIL': '/var/spool/mail/root', 'SHELL': '/bin/bash', 'TERM': 'xterm-256color', 'SELINUX_USE_CURRENT_RANGE': '', 'SHLVL': '1', 'PYTHONPATH': '/usr/share/ovirt-hosted-engine-setup/scripts/..:', 'LOGNAME': 'root', 'DBUS_SESSION_BUS_ADDRESS': 'unix:path=/run/user/0/bus', 'XDG_RUNTIME_DIR': '/run/user/0', 'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin', 'GUESTFISH_OUTPUT': '
\\e[0m', 'DEBUGINFOD_URLS': 'https://debuginfod.centos.org/ ', 'HISTSIZE': '1000', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 'BASH_FUNC_which%%': '() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}', 'OTOPI_EXECDIR': '/root', 'OTOPI_LOGFILE': '/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230419093227-vu7nay.log', 'OTOPI_CALLBACK_OF': '/tmp/tmpjo8dbijt', 'ANSIBLE_CALLBACKS_ENABLED': '1_otopi_json,2_ovirt_logger', 'ANSIBLE_STDOUT_CALLBACK': '1_otopi_json', 'HE_ANSIBLE_LOG_PATH': '/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-ansible-final_clean-20230419093241-7kk3yj.log'}
2023-04-19 09:32:42,276-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:226 ansible-playbook rc: 250
2023-04-19 09:32:42,276-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:233 ansible-playbook stdout:
2023-04-19 09:32:42,276-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:235 b'to see the full traceback, use -vvv\n'
2023-04-19 09:32:42,277-0400 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:236 ansible-playbook stderr:
2023-04-19 09:32:42,277-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'[WARNING]: Skipping plugin (/usr/share/ovirt-hosted-engine-\n'
2023-04-19 09:32:42,277-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'setup/he_ansible/callback_plugins/2_ovirt_logger.py), cannot load: cannot\n'
2023-04-19 09:32:42,277-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b"import name 'Callable' from 'collections'\n"
2023-04-19 09:32:42,277-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b'(/usr/lib64/python3.11/collections/__init__.py)\n'
2023-04-19 09:32:42,278-0400 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:238 b"ERROR! Unexpected Exception, this is probably a bug: cannot import name 'Callable' from 'collections' (/usr/lib64/python3.11/collections/__init__.py)\n"
2023-04-19 09:32:42,278-0400 DEBUG otopi.context context._executeMethod:145 method exception
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/otopi/context.py", line 132, in _executeMethod
method['method']()
File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/misc.py", line 547, in _cleanup
r = ah.run()
File "/usr/lib/python3.6/site-packages/ovirt_hosted_engine_setup/ansible_utils.py", line 240, in run
raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2023-04-19 09:32:42,279-0400 ERROR otopi.context context._executeMethod:154 Failed to execute stage 'Clean up': Failed executing ansible-playbook
2023-04-19 09:32:42,280-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:42,280-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exceptionInfo=list:'[(<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7fdef2a2b908>), (<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7fdef3bc4988>)]'
2023-04-19 09:32:42,280-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:42,281-0400 DEBUG otopi.context context._executeMethod:127 Stage cleanup METHOD otopi.plugins.otopi.dialog.answer_file.Plugin._generate_answer_file
2023-04-19 09:32:42,281-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:42,282-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/answerFileContent=str:'# OTOPI answer file, generated by human dialog
[environment:default]
QUESTION/1/DEPLOY_PROCEED=str:yes
QUESTION/1/FORCE_IP_PROCEED=str:yes
QUESTION/1/OVEHOSTED_GATEWAY=str:10.3.3.1
QUESTION/1/TMUX_PROCEED=str:yes
'
2023-04-19 09:32:42,282-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:42,282-0400 DEBUG otopi.context context._executeMethod:127 Stage cleanup METHOD otopi.plugins.gr_he_common.core.answerfile.Plugin._save_answers_at_cleanup
2023-04-19 09:32:42,283-0400 INFO otopi.plugins.gr_he_common.core.answerfile answerfile._save_answers:74 Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20230419093242.conf'
2023-04-19 09:32:42,284-0400 INFO otopi.context context.runSequence:616 Stage: Pre-termination
2023-04-19 09:32:42,284-0400 DEBUG otopi.context context.runSequence:620 STAGE pre-terminate
2023-04-19 09:32:42,285-0400 DEBUG otopi.context context._executeMethod:127 Stage pre-terminate METHOD otopi.plugins.otopi.core.misc.Plugin._preTerminate
2023-04-19 09:32:42,285-0400 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2023-04-19 09:32:42,285-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/aborted=bool:'False'
2023-04-19 09:32:42,285-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/debug=int:'0'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/error=bool:'True'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exceptionInfo=list:'[(<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7fdef2a2b908>), (<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7fdef3bc4988>)]'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/executionDirectory=str:'/root'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exitCode=list:'[{'priority': 90001, 'code': 0}]'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/log=bool:'True'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/pluginGroups=str:'otopi:gr-he-common:gr-he-ansiblesetup'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/pluginPath=str:'/usr/share/otopi/plugins:/usr/share/ovirt-hosted-engine-setup/scripts/../plugins'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/suppressEnvironmentKeys=list:'[]'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chkconfig=str:'/usr/sbin/chkconfig'
2023-04-19 09:32:42,286-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/chronyc=str:'/usr/bin/chronyc'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/date=str:'/usr/bin/date'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/dig=str:'/usr/bin/dig'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/firewall-cmd=str:'/usr/bin/firewall-cmd'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/hwclock=str:'/usr/sbin/hwclock'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/initctl=NoneType:'None'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ip=str:'/usr/sbin/ip'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/nc=str:'/usr/bin/nc'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ntpq=NoneType:'None'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ping=str:'/usr/sbin/ping'
2023-04-19 09:32:42,287-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/python3=str:'/usr/bin/python3'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/rc=NoneType:'None'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/rc-update=NoneType:'None'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/reboot=str:'/usr/sbin/reboot'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/service=str:'/usr/sbin/service'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/ssh-keygen=str:'/usr/bin/ssh-keygen'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV COMMAND/systemctl=str:'/usr/bin/systemctl'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/configFileAppend=NoneType:'None'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/configFileName=str:'/etc/ovirt-hosted-engine-setup.conf'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/failOnPrioOverride=bool:'True'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/ignoreMissingBeforeAfter=bool:'True'
2023-04-19 09:32:42,288-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/internalPackageTransaction=Transaction:'[DNF Transaction]'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logDir=str:'/var/log/ovirt-hosted-engine-setup'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFileHandle=TextIOWrapper:'<_io.TextIOWrapper name='/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230419093227-vu7nay.log' mode='a' encoding='UTF-8'>'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFileName=str:'/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230419093227-vu7nay.log'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFileNamePrefix=str:'ovirt-hosted-engine-setup'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFilter=_MyLoggerFilter:'filter'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFilterKeys=list:'['OVEHOSTED_ENGINE/adminPassword', 'OVEHOSTED_VM/cloudinitRootPwd']'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFilterQuestions=list:'[]'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFilterQuestionsKeys=set:'set()'
2023-04-19 09:32:42,289-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logFilterRe=list:'[re.compile('\n BEGIN\\ PRIVATE\\ KEY\n (?P<filter>.*)\n END\\ PRIVATE\\ KEY\n ', re.DOTALL|re.VERBOSE)]'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/logRemoveAtExit=bool:'False'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/mainTransaction=Transaction:'[DNF Transaction]'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/modifiedFiles=list:'[]'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV CORE/randomizeEvents=bool:'False'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/answerFile=NoneType:'None'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/answerFileContent=str:'# OTOPI answer file, generated by human dialog
[environment:default]
QUESTION/1/DEPLOY_PROCEED=str:yes
QUESTION/1/FORCE_IP_PROCEED=str:yes
QUESTION/1/OVEHOSTED_GATEWAY=str:10.3.3.1
QUESTION/1/TMUX_PROCEED=str:yes
'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/autoAcceptDefault=bool:'False'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/boundary=str:'--=451b80dc-996f-432e-9e4f-2b29ef6d1141=--'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/cliVersion=int:'1'
2023-04-19 09:32:42,290-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/customization=bool:'False'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV DIALOG/dialect=str:'human'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV INFO/PACKAGE_NAME=str:'otopi'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV INFO/PACKAGE_VERSION=str:'1.10.3'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/firewalldAvailable=bool:'True'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/firewalldDisableServices=list:'[]'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/firewalldEnable=bool:'False'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/iptablesEnable=bool:'False'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/iptablesRules=NoneType:'None'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/sshEnable=bool:'False'
2023-04-19 09:32:42,291-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/sshKey=NoneType:'None'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV NETWORK/sshUser=str:''
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/checkRequirements=bool:'True'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/deployProceed=bool:'True'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/enableKeycloak=NoneType:'None'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/etcAnswerFile=str:'/etc/ovirt-hosted-engine/answers.conf'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/forceIpProceed=str:'yes'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/localVMDir=NoneType:'None'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/memCheckRequirements=bool:'True'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/miscReached=bool:'False'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/nodeSetup=bool:'False'
2023-04-19 09:32:42,292-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/pauseonRestore=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/renewPKIonRestore=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/restoreFromFile=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/screenProceed=bool:'True'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/skipTTYCheck=bool:'False'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/tempDir=str:'/var/tmp'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/userAnswerFile=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/adminPassword=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/adminUsername=str:'admin@internal'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/appHostName=NoneType:'None'
2023-04-19 09:32:42,293-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/clusterName=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/datacenterName=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/enableHcGlusterService=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/enableLibgfapi=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/engineSetupTimeout=int:'1800'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/insecureSSL=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/interactiveAdminPassword=bool:'True'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_ENGINE/temporaryCertificate=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_FIRST_HOST/deployWithHE35Hosts=NoneType:'None'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_FIRST_HOST/skipSharedStorageAF=bool:'False'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/allowInvalidBondModes=bool:'False'
2023-04-19 09:32:42,294-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/bridgeIf=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/bridgeName=str:'ovirtmgmt'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/forceIPv4=bool:'False'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/forceIPv6=bool:'False'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/fqdn=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/fqdnReverseValidation=bool:'False'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/gateway=str:'10.3.3.1'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/host_name=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/network_test=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/network_test_tcp_address=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/network_test_tcp_port=NoneType:'None'
2023-04-19 09:32:42,295-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NETWORK/refuseDeployingWithNM=bool:'False'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NOTIF/destEmail=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NOTIF/smtpPort=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NOTIF/smtpServer=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_NOTIF/sourceEmail=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_SANLOCK/lockspaceName=str:'hosted-engine'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_SANLOCK/serviceName=str:'sanlock'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/LunID=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/blockDeviceSizeGB=NoneType:'None'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/discardSupport=bool:'False'
2023-04-19 09:32:42,296-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/domainType=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIDiscoverPassword=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIDiscoverUser=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIPortal=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIPortalIPAddress=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIPortalPassword=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIPortalPort=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSIPortalUser=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/iSCSITargetName=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/imgDesc=str:'Hosted Engine Image'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/imgSizeGB=NoneType:'None'
2023-04-19 09:32:42,297-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/imgUUID=str:'6259993f-beb7-4c29-91a9-9570dd12ac8a'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/lockspaceImageUUID=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/lockspaceVolumeUUID=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/metadataImageUUID=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/metadataVolumeUUID=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/mntOptions=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/nfsVersion=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/ovfSizeGB=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/storageDomainConnection=NoneType:'None'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/storageDomainName=str:'hosted_storage'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/volUUID=str:'1c68fc63-4512-418e-9949-075cd297f40e'
2023-04-19 09:32:42,298-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VDSM/kvmGid=int:'36'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VDSM/serviceName=str:'vdsmd'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VDSM/useSSL=bool:'True'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VDSM/vdscli=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VDSM/vdsmUid=int:'36'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/OpenScapProfileName=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/acceptDownloadEApplianceRPM=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/applianceMem=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/applianceVCpus=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/applianceVersion=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/applyOpenScapProfile=NoneType:'None'
2023-04-19 09:32:42,299-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/automateVMShutdown=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cdromUUID=str:'b941c4c0-7966-400c-a9fd-b2a85c9a4c33'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudInitISO=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitExecuteEngineSetup=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitInstanceDomainName=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitInstanceHostName=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitRootPwd=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitVMDNS=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitVMETCHOSTS=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitVMStaticCIDR=NoneType:'None'
2023-04-19 09:32:42,300-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/cloudinitVMTZ=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/consoleUUID=str:'4b2ee9e8-a64e-4288-8672-95d080193230'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/emulatedMachine=str:'pc'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/enableFips=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/localVmUUID=str:'a729bb55-b00c-4d2f-a6b1-d7a3008539ee'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/maxVCpus=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/nicUUID=str:'1a061d6c-8a0d-4a80-ab69-cf67d9558586'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/ovfArchive=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/rootSshAccess=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/rootSshPubkey=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/vmCDRom=NoneType:'None'
2023-04-19 09:32:42,301-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/vmMACAddr=NoneType:'None'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/vmMemSizeMB=NoneType:'None'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_VM/vmVCpus=NoneType:'None'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/dnfDisabledPlugins=list:'[]'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/dnfExpireCache=bool:'True'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/dnfRollback=bool:'True'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/dnfpackagerEnabled=bool:'True'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/keepAliveInterval=int:'30'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/yumDisabledPlugins=list:'[]'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/yumEnabledPlugins=list:'[]'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/yumExpireCache=bool:'True'
2023-04-19 09:32:42,302-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/yumRollback=bool:'True'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV PACKAGER/yumpackagerEnabled=bool:'False'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV QUESTION/1/DEPLOY_PROCEED=str:'yes'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV QUESTION/1/FORCE_IP_PROCEED=str:'yes'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV QUESTION/1/OVEHOSTED_GATEWAY=str:'10.3.3.1'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV QUESTION/1/TMUX_PROCEED=str:'yes'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/clockMaxGap=int:'5'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/clockSet=bool:'False'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/commandPath=str:'/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/reboot=bool:'False'
2023-04-19 09:32:42,303-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/rebootAllow=bool:'True'
2023-04-19 09:32:42,304-0400 DEBUG otopi.context context.dumpEnvironment:775 ENV SYSTEM/rebootDeferTime=int:'10'
2023-04-19 09:32:42,304-0400 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
2023-04-19 09:32:42,305-0400 DEBUG otopi.context context._executeMethod:127 Stage pre-terminate METHOD otopi.plugins.otopi.dialog.cli.Plugin._pre_terminate
2023-04-19 09:32:42,305-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.otopi.dialog.cli.Plugin._pre_terminate condition False
2023-04-19 09:32:42,305-0400 INFO otopi.context context.runSequence:616 Stage: Termination
2023-04-19 09:32:42,306-0400 DEBUG otopi.context context.runSequence:620 STAGE terminate
2023-04-19 09:32:42,306-0400 DEBUG otopi.context context._executeMethod:127 Stage terminate METHOD otopi.plugins.gr_he_common.core.misc.Plugin._terminate
2023-04-19 09:32:42,306-0400 ERROR otopi.plugins.gr_he_common.core.misc misc._terminate:167 Hosted Engine deployment failed
2023-04-19 09:32:42,307-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND Log file is located at
2023-04-19 09:32:42,307-0400 DEBUG otopi.plugins.otopi.dialog.human dialog.__logString:204 DIALOG:SEND /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230419093227-vu7nay.log
2023-04-19 09:32:42,308-0400 DEBUG otopi.context context._executeMethod:127 Stage terminate METHOD otopi.plugins.otopi.dialog.human.Plugin._terminate
2023-04-19 09:32:42,313-0400 DEBUG otopi.context context._executeMethod:127 Stage terminate METHOD otopi.plugins.otopi.dialog.machine.Plugin._terminate
2023-04-19 09:32:42,313-0400 DEBUG otopi.context context._executeMethod:136 otopi.plugins.otopi.dialog.machine.Plugin._terminate condition False
2023-04-19 09:32:42,314-0400 DEBUG otopi.context context._executeMethod:127 Stage terminate METHOD otopi.plugins.otopi.core.log.Plugin._terminate
Administration portal doesn't load.
by kushagra.gupta@hsc.com
I have an odd situation:
When I go to
https://ovengine/manager-engine.com/webadmin/?locale=en_US
after authentication passes, it shows the top banner
oVirt OPEN VIRTUALIZATION MANAGER
and the
Loading ...
indicator in the center, but it never gets past that.
I also followed the thread and tried opening the admin portal in private/incognito mode, but I still face the issue.
I rebooted the hosted engine as well, but the problem persists.
When I open the above URL, the following entries are written to engine.log:
2023-04-19 12:09:37,535+05 INFO [org.ovirt.engine.core.sso.service.AuthenticationService] (default task-14) [] User admin@internal-authz with profile [internal] successfully logged in with scopes: ovirt-app-admin ovirt-app-api ovirt-app-portal ovirt-ext=auth:sequence-priority=~ ovirt-ext=revoke:revoke-all ovirt-ext=token-info:authz-search ovirt-ext=token-info:public-authz-search ovirt-ext=token-info:validate ovirt-ext=token:password-access
2023-04-19 12:09:37,581+05 INFO [org.ovirt.engine.core.bll.aaa.CreateUserSessionCommand] (default task-14) [4593247b] Running command: CreateUserSessionCommand internal: false.
2023-04-19 12:09:37,655+05 INFO [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-14) [4593247b] EVENT_ID: USER_VDC_LOGIN(30), User admin@internal-authz connecting from '10.0.1.2' using session 'tmdEg3zM+kPrJudpmQr+ITRboP8CWKVDlKP4XmYiB2GDqgWfSRj2H/kpNd96JfEXOuB+CjBNz72FkLx5//77kg==' logged in.
2023-04-19 12:11:01,457+05 INFO [org.ovirt.engine.core.bll.provider.network.SyncNetworkProviderCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-60) [1e895f1c] Lock Acquired to object 'EngineLock:{exclusiveLocks='[7a49bb81-3c73-4119-8cb0-f3f6d88fafb9=PROVIDER]', sharedLocks=''}'
2023-04-19 12:11:01,463+05 INFO [org.ovirt.engine.core.bll.provider.network.SyncNetworkProviderCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-60) [1e895f1c] Running command: SyncNetworkProviderCommand internal: true.
2023-04-19 12:11:01,564+05 INFO [org.ovirt.engine.core.sso.service.AuthenticationService] (default task-6) [] User admin@internal-authz with profile [internal] successfully logged in with scopes: ovirt-app-api ovirt-ext=token-info:authz-search ovirt-ext=token-info:public-authz-search ovirt-ext=token-info:validate ovirt-ext=token:password-access
2023-04-19 12:11:01,647+05 INFO [org.ovirt.engine.core.bll.provider.network.SyncNetworkProviderCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-60) [1e895f1c] Lock freed to object 'EngineLock:{exclusiveLocks='[7a49bb81-3c73-4119-8cb0-f3f6d88fafb9=PROVIDER]', sharedLocks=''}'
As you can see, there are no error entries for this request.
At the same time, however, I can see some earlier errors in the engine.log file:
ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-96) [] Command 'HSMGetAllTasksStatusesVDSCommand(HostName = ovirt-host, VdsIdVDSCommandParametersBase:{hostId='b2137069-b4e7-4c16-91e1-7f889c62ab88'})' execution failed: IRSGenericException: IRSErrorException: IRSNonOperationalException: Not SPM: ()
ERROR [org.ovirt.engine.core.bll.pm.FenceProxyLocator] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-47) [1e63b6f1] Can not run fence action on host 'ovirt-host', no suitable proxy host was found.
WARN [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-14) [14a0ecc4] EVENT_ID: SYSTEM_CHANGE_STORAGE_POOL_STATUS_PROBLEMATIC_WITH_ERROR(987), Invalid status on Data Center Default. Setting Data Center status to Non Responsive (On host ovirt-host, Error: Network error during communication with the Host.).
2023-04-18 19:15:13,838+05 ERROR [org.ovirt.vdsm.jsonrpc.client.reactors.ReactorClient] (SSL Stomp Reactor) [] Connection timeout for host 'ovirt-host', last response arrived 22501 ms ago.
ERROR [org.ovirt.engine.core.vdsbroker.monitoring.HostMonitoring] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-45) [] Unable to GetStats: VDSNetworkException: VDSGenericException: VDSNetworkException: Broken pipe
Also I can see that there is a connection between the ovirt-host and the hosted-engine:
==========================Hosted Engine=====================================================
[root@manager-engine ovirt-engine]# netstat -anlp | grep 54321
tcp6 0 0 10.0.1.103:39944 10.0.1.101:54321 ESTABLISHED 2971/ovirt-engine
tcp6 0 0 10.0.1.103:44234 10.0.1.101:54321 ESTABLISHED 2971/ovirt-engine
[root@manager-engine ovirt-engine]#
===========================Ovirt-host========================================================
[root@ovirt-host ~]# netstat -anlp | grep 54321
tcp6 0 0 :::54321 :::* LISTEN 114608/python3
tcp6 0 0 ::1:54321 ::1:37428 ESTABLISHED 114608/python3
tcp6 0 0 ::1:54321 ::1:37414 ESTABLISHED 114608/python3
tcp6 0 0 10.0.1.101:54321 10.0.1.103:44234 ESTABLISHED 114608/python3
tcp6 0 0 ::1:37428 ::1:54321 ESTABLISHED 105702/platform-pyt
tcp6 0 0 ::1:54321 ::1:37416 ESTABLISHED 114608/python3
tcp6 0 0 ::1:37416 ::1:54321 ESTABLISHED 114861/platform-pyt
tcp6 0 0 10.0.1.101:54321 10.0.1.103:39944 ESTABLISHED 114608/python3
tcp6 0 0 ::1:37414 ::1:54321 ESTABLISHED 105491/platform-pyt
[root@ovirt-host ~]# netstat -anlp | grep vdsm
unix 2 [ ACC ] STREAM LISTENING 770058 96907/python3 /run/vdsm/svdsm.sock
unix 2 [ ACC ] STREAM LISTENING 1223139 114861/platform-pyt /run/vdsm/mom-vdsm.sock
unix 3 [ ] STREAM CONNECTED 1226522 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 6903654 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 1230271 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 1230065 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 6823534 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 6838191 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 1224904 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 1225672 96907/python3 /run/vdsm/svdsm.sock
unix 3 [ ] STREAM CONNECTED 1214103 96907/python3 /run/vdsm/svdsm.sock
[root@ovirt-host ~]#
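The reachability of the vdsm port can be re-checked from the engine VM with something like the following (a sketch; the IP and port are the ones from the netstat output above, and `nc` from nmap-ncat is assumed to be installed):

```shell
#!/bin/bash
# Sketch: confirm the vdsm port on the host is reachable from the engine VM.
# 10.0.1.101 / 54321 are taken from the netstat output above; adjust as needed.
HOST_IP=10.0.1.101
VDSM_PORT=54321

if nc -z -w 5 "$HOST_IP" "$VDSM_PORT" 2>/dev/null; then
    echo "vdsm port $VDSM_PORT on $HOST_IP is reachable"
else
    echo "vdsm port $VDSM_PORT on $HOST_IP is NOT reachable"
fi
```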
[root@ovirt-host ~]# systemctl status vdsmd.service
● vdsmd.service - Virtual Desktop Server Manager
Loaded: loaded (/usr/lib/systemd/system/vdsmd.service; enabled; vendor preset: disabled)
Active: active (running) since Tue 2023-04-18 19:15:08 IST; 16h ago
Process: 114535 ExecStartPre=/usr/libexec/vdsm/vdsmd_init_common.sh --pre-start (code=exited, status=0/SUCCESS)
Main PID: 114608 (vdsmd)
Tasks: 63 (limit: 48695)
Memory: 122.2M
CGroup: /system.slice/vdsmd.service
├─114608 /usr/bin/python3 /usr/share/vdsm/vdsmd
└─114826 /usr/libexec/ioprocess --read-pipe-fd 50 --write-pipe-fd 49 --max-threads 10 --max-queued-requests 10
Apr 19 09:03:35 ovirt-host vdsm[114608]: WARN Attempting to remove a non existing network: ovirtmgmt/dcaa480a-068c-4030-ba9b-2baddc6de8c0
Apr 19 09:03:35 ovirt-host vdsm[114608]: WARN Attempting to remove a non existing net user: ovirtmgmt/dcaa480a-068c-4030-ba9b-2baddc6de8c0
Apr 19 09:03:35 ovirt-host vdsm[114608]: WARN Attempting to remove a non existing network: ovirtmgmt/dcaa480a-068c-4030-ba9b-2baddc6de8c0
Apr 19 09:03:35 ovirt-host vdsm[114608]: WARN Attempting to remove a non existing net user: ovirtmgmt/dcaa480a-068c-4030-ba9b-2baddc6de8c0
Apr 19 09:13:34 ovirt-host vdsm[114608]: WARN Attempting to add an existing net user: ovirtmgmt/dcaa480a-068c-4030-ba9b-2baddc6de8c0
Apr 19 11:18:27 ovirt-host vdsm[114608]: ERROR ssl handshake: socket error, address: ::ffff:10.0.1.103
Warning: Journal has been rotated since unit was started. Log output is incomplete or unavailable.
[root@ovirt-host ~]#
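The `ssl handshake: socket error` line above involves 10.0.1.103 (the engine), so the TLS layer between engine and vdsm is also worth checking. A sketch, assuming the usual vdsm certificate paths (they may differ on your install):

```shell
#!/bin/bash
# Sketch: check the vdsm TLS setup. The certificate paths below are the
# typical vdsm defaults and are an assumption; adjust for your install.
#
# From the engine side, the TLS endpoint can be probed directly (left
# commented out here since it may block until the TCP timeout if the
# host is unreachable):
#   openssl s_client -connect 10.0.1.101:54321 \
#       -CAfile /etc/pki/vdsm/certs/cacert.pem </dev/null

# On the host, verify the vdsm certificate has not expired:
CERT=/etc/pki/vdsm/certs/vdsmcert.pem
if [ -f "$CERT" ]; then
    openssl x509 -in "$CERT" -noout -enddate
else
    echo "no vdsm certificate at $CERT (not run on a vdsm host?)"
fi
```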
Current networking:
[root@ovirt-host ~]# ip a
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: enp2s0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel state UP group default qlen 1000
link/ether 48:4d:7e:a8:d8:73 brd ff:ff:ff:ff:ff:ff
3: enp0s20f0u6: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel state UP group default qlen 1000
link/ether 80:3f:5d:08:4f:7e brd ff:ff:ff:ff:ff:ff
4: enp0s20f0u6.408@enp0s20f0u6: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue master ovirtmgmt state UP group default qlen 1000
link/ether 80:3f:5d:08:4f:7e brd ff:ff:ff:ff:ff:ff
5: enp0s20f0u6.400@enp0s20f0u6: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
link/ether 80:3f:5d:08:4f:7e brd ff:ff:ff:ff:ff:ff
inet6 abcd:abcd:abcd::17/64 scope global noprefixroute
valid_lft forever preferred_lft forever
inet6 fe80::1a7:ccb1:ae9b:9b8f/64 scope link noprefixroute
valid_lft forever preferred_lft forever
38: ;vdsmdummy;: <BROADCAST,MULTICAST> mtu 1500 qdisc noop state DOWN group default qlen 1000
link/ether de:b4:76:ba:04:82 brd ff:ff:ff:ff:ff:ff
39: ovs-system: <BROADCAST,MULTICAST> mtu 1500 qdisc noop state DOWN group default qlen 1000
link/ether 6e:94:55:f5:51:12 brd ff:ff:ff:ff:ff:ff
40: br-int: <BROADCAST,MULTICAST> mtu 1500 qdisc noop state DOWN group default qlen 1000
link/ether b6:96:23:82:d2:48 brd ff:ff:ff:ff:ff:ff
73: ovirtmgmt: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
link/ether 80:3f:5d:08:4f:7e brd ff:ff:ff:ff:ff:ff
inet 10.0.1.101/24 brd 10.0.1.255 scope global noprefixroute ovirtmgmt
valid_lft forever preferred_lft forever
90: vnet2: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue master ovirtmgmt state UNKNOWN group default qlen 1000
link/ether fe:16:3e:0d:0f:b9 brd ff:ff:ff:ff:ff:ff
inet6 fe80::fc16:3eff:fe0d:fb9/64 scope link
valid_lft forever preferred_lft forever
[root@ovirt-host ~]#
Could anyone please help me with this? I have tried re-installation as well but still no luck.
Thanks and Regards
Kushagra Gupta
vm has very slow write speed on posix compliant fs disk.
by arc@b4restore.com
Hi,
we have noticed that we get only around 30 MB/s of write speed per VM in oVirt on our datastore, which is a GPFS filesystem mounted as a POSIX-compliant FS. Read speeds are around 1.5-3.3 GB/s.
We tested directly on the mount from the host CLI with some benchmarking tools: from the host directly into the GPFS filesystem we get line speed, but from the VMs we don't.
Does anybody have some clues as to what is going on and what to try?
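The host-side benchmark was of roughly this shape, for a direct-I/O sequential write that can be repeated both on the host mount and inside a guest (a sketch; fio and the test-file path here are stand-ins for whatever tooling and paths you use):

```shell
#!/bin/bash
# Sketch: sequential write benchmark with direct I/O, to compare host
# vs. guest throughput on the same storage. TESTFILE is a placeholder
# path; point it at the GPFS mount (on the host) or at a disk backed
# by that storage (inside the guest).
TESTFILE=${TESTFILE:-/gpfs/fio-testfile}

if command -v fio >/dev/null; then
    fio --name=seqwrite --filename="$TESTFILE" --rw=write --bs=1M \
        --size=1G --direct=1 --ioengine=libaio --group_reporting \
        || echo "fio run failed (is $TESTFILE on a writable mount?)"
else
    echo "fio is not installed on this machine"
fi
```

Running the same job file in both places keeps block size, queue depth, and direct I/O identical, so any remaining gap points at the virtualization or storage-attachment layer rather than the benchmark itself.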
Br
Andi
ovirt-engine - developer mode installation
by yongshengmaa@126.com
Hello ,
I'm doing a developer-mode installation of ovirt-engine. I'd like to install it on CentOS 8 Stream, but no matching version of the postgresql packages is provided. Which CentOS version should I start this work on?
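For what it's worth, on CentOS Stream 8 the postgresql packages are shipped as dnf module streams rather than in the default repository, which may be why no matching version shows up. This is the kind of thing I have been checking (a sketch; that the engine setup expects the 12 stream is my assumption):

```shell
#!/bin/bash
# Sketch: on CentOS Stream 8, PostgreSQL comes as dnf module streams,
# so the stream the engine expects may need enabling before install.
PG_STREAM=postgresql:12   # stream assumed to match the engine's requirement

if command -v dnf >/dev/null; then
    dnf module list postgresql || true     # show the available streams
    # dnf module enable -y "$PG_STREAM"    # enable the stream, then:
    # dnf install -y postgresql-server
else
    echo "dnf not available; run this on the CentOS Stream 8 machine"
fi
```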
Sincerely,
Yongsheng