Unable to add more than 16 vCPUs to running VM
by David White
I have a fully patched / up-to-date engine:
Software Version: 4.5.4-1.el8
And a fully patched, up-to-date host.
[root@cha3-storage dwhite]# yum info ovirt-host
Last metadata expiration check: 1:33:40 ago on Sun 04 Jun 2023 09:28:39 AM EDT.
Installed Packages
Name : ovirt-host
Version : 4.5.0
Release : 3.el8
Architecture : x86_64
Size : 11 k
Source : ovirt-host-4.5.0-3.el8.src.rpm
Repository : @System
From repo : centos-ovirt45
The host has 32GB of RAM, and there's only 1 VM on this host.
When I try to add more CPUs to the VM from the manager UI, I get the following error:
- The requested number of vCPUs is not available on the host the VM is running on
What's going on here, and why can I not add more vCPUs to this VM?
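For comparison: the number of vCPUs that can be hot-plugged is bounded by what the host actually exposes, not by RAM. A quick way to check the host's logical CPU count - a sketch, exact output varies by hardware:

[root@host ~]# lscpu | grep -E '^(CPU\(s\)|Thread|Core|Socket)'
[root@host ~]# virsh -r nodeinfo

If the host reports 16 logical CPUs, that would match the error above; whether SMT threads count towards the limit depends on the cluster's "Count Threads As Cores" setting.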
1 year, 9 months
oVirt 4.4 - Integrate event notifications with System Center Operations Manager
by angus@ajct.uk
Hello
I am looking to integrate oVirt 4.4 event notifications with an incident management tool - M$ System Center Operations Manager (SCOM).
I need to give SCOM examples of what an unhealthy email looks like and what a recovered/healthy email looks like for each event I want it to manage.
Is there a way of triggering sample notifications (something failed/something recovered) for events, or perhaps somewhere I can see the email templates used? (I did try a grep -Hr on the engine VM for "alertMessage" and "resolveMessage" for a VM migration notification, but only found an entry in the notifier log.)
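One way to generate samples (a sketch, assuming admin credentials and an engine FQDN - both placeholders here): the engine's REST API accepts user-defined events, so an alert-severity event can be injected and, provided the notifier's subscriptions cover it, the exact email it produces can be captured:

curl -k -u 'admin@internal:PASSWORD' \
  -H 'Content-Type: application/xml' -H 'Accept: application/xml' \
  -d '<event><description>SCOM sample alert</description><severity>alert</severity><origin>scom-test</origin><custom_id>1001</custom_id></event>' \
  'https://engine.example.com/ovirt-engine/api/events'

Repeating the call with <severity>normal</severity> gives a healthy/recovered counterpart.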
Thanks
Angus
1 year, 9 months
Regarding oVirt IP Address Change
by Pro Cloudguru
Hello oVirt Community,
Presently, I have a healthy & running oVirt hosting setup that includes a single DC,
Cluster, SPM & Storage. The LAN IP address range is 10.151.65.0/24.
As we're in the process of migrating office premises, I'm worried about the
existing oVirt IP address scheme, as the new range is 10.101.160.0/20.
In this scenario, I'd like to know how to carry out the migration successfully
without disturbing the oVirt hosting & all hosted VMs.
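For context, oVirt components address each other by FQDN rather than by raw IP, so if the DNS records follow the machines into the new range, the re-IP is largely a DNS and host-network exercise. A rough sketch (the example.com names are placeholders):

# 1. update DNS (or /etc/hosts everywhere) so each FQDN resolves to its new 10.101.160.0/20 address
# 2. verify resolution from the engine and from every host:
getent hosts engine.example.com host1.example.com
# 3. only if the engine's own FQDN must also change, there is a dedicated tool:
/usr/share/ovirt-engine/setup/bin/ovirt-engine-rename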
Please Guide,
*Nishith N. Vyas*
1 year, 9 months
novnc console error: promise.js missing?
by karl.morgan@gmail.com
I'm seeing the following on the engine when attempting to start a noVNC console:
2023-03-31 11:57:34,988-07 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SetVmTicketVDSCommand] (default task-29) [307c22a9] START, SetVmTicketVDSCommand(HostName = ov2node02-mn, SetVmTicketVDSCommandParameters:{hostId='19a92de3-c6e4-4f4e-be31-1d5533a2b6b6', vmId='4481b0e9-a96c-4ee7-8bd2-572558eb9fda', protocol='VNC', ticket='LMl903dd', validTime='120', userName='admin@ovirt', userId='9e88a363-34cb-466a-8ba9-25e46819423b', disconnectAction='LOCK_SCREEN', consoleDisconnectActionDelay='0'}), log id: 1d6520d5
2023-03-31 11:57:35,031-07 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SetVmTicketVDSCommand] (default task-29) [307c22a9] FINISH, SetVmTicketVDSCommand, return: , log id: 1d6520d5
2023-03-31 11:57:35,044-07 INFO [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-29) [307c22a9] EVENT_ID: VM_SET_TICKET(164), User admin@ovirt@internalkeycloak-authz initiated console session for VM first
2023-03-31 11:57:35,158-07 INFO [org.ovirt.engine.core.utils.servlet.ServletUtils] (default task-29) [] Can't read file '/usr/share/ovirt-engine/files/novnc/vendor/promise.js' for request '/ovirt-engine/services/files/novnc/vendor/promise.js' -- 404
Curious if this is obvious to anyone - and how do I resolve it?
ovirt-engine-webadmin-portal-4.5.4-1.el9.noarch
ovirt-engine-4.5.4-1.el9.noarch
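A workaround that may apply (an assumption, not a verified fix): newer noVNC builds dropped the vendor/promise.js Promise polyfill while the engine's console page still requests it, and since current browsers ship a native Promise, an empty stub at the exact path from the log can clear the 404:

mkdir -p /usr/share/ovirt-engine/files/novnc/vendor
touch /usr/share/ovirt-engine/files/novnc/vendor/promise.js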
1 year, 9 months
troubleshooting SHE - Run engine-setup with answerfile - ?
by lejeczek
Hi guys.
I'm revisiting my failed attempt to make friends with oVirt
- I'm a novice, so go easy on me.
Deployment fails sooner than I can troubleshoot - at least
with what I've managed to find in the docs so far - and I
wanted to ask how to troubleshoot a failure such as this:
...
[ INFO ] TASK [ovirt.ovirt.engine_setup : Run engine-setup with answerfile]
[ ERROR ] fatal: [localhost -> 192.168.1.209]: FAILED! =>
{"changed": true, "cmd": ["engine-setup",
"--accept-defaults",
"--config-append=/root/ovirt-engine-answers"], "delta":
"0:08:05.997322", "end": "2023-06-01 13:07:15.932517",
"msg": "non-zero return code", "rc": 1, "start": "2023-06-01
12:59:09.935195", "stderr": "", "stderr_lines": [],
"stdout": "[ INFO ] Stage: Initializing\n[ INFO ] Stage:
Environment setup\n Configuration files:\n
...
Start with setting up Keycloak for Ovirt Engine", "[ ERROR ]
Failed to execute stage 'Closing up': Command
'/usr/share/ovirt-engine-keycloak/bin/kk_cli.sh' failed to
execute", "[ INFO ] Stage: Clean up", " Log file
is located at", "
/var/log/ovirt-engine/setup/ovirt-engine-setup-20230601125911-448xvj.log",
...
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false,
"msg": "There was a failure deploying the engine on the
local engine VM. The system may not be provisioned according
to the playbook results: please check the logs for the
issue, fix accordingly or re-deploy from scratch.\n"}
[ ERROR ] Failed to execute stage 'Closing up': Failed
executing ansible-playbook
[ INFO ] Stage: Clean up
...
Log file is located at
/var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20230601123237-rhwx8e.log
The log file to which 'stdout' points shows nothing - or I
fail to find anything in it and/or to understand it:
...
2023-06-01 13:07:17,716+0100 ERROR
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:113 fatal: [localhost ->
192.168.1.209]: FAILED! => {"changed": true, "cmd":
["engine-setup", "--accept-defaults",
"--config-append=/root/ovirt-engine-answers"], "delta":
"0:08:05.997322", "end": "2023-06-01 13:07:15.932517",
"msg": "non-zero return code", "rc": 1, "start": "2023-06-01
12:59:09.935195", "stderr": "", "stderr_lines": [], "stdout":
...
or Ovirt Engine\n[ ERROR ] Failed to execute stage 'Closing up':
Command '/usr/share/ovirt-engine-keycloak/bin/kk_cli.sh' failed to
execute\n[ INFO ] Stage: Clean up\n Log file is located at\n
/var/log/ovirt-engine/setup/ovirt-engine-setup-20230601125911-448xvj.log\n[ INFO ]
Generating answer file
'/var/lib/ovirt-engine/setup/answers/20230601130714-setup.conf'\n[ INFO ]
Stage: Pre-termination\n[ INFO ] Stage: Termination\n[ ERROR ]
Execution of setup failed", "stdout_lines": ["[ INFO ] Stage:
Initializing", "[ INFO ] Stage: Environment setup",
" Configuration files:",
" /etc/ovirt-engine-setup.conf.d/10-packaging-jboss.conf,",
" /etc/ovirt-engine-setup.conf.d/10-packaging.conf,"
...
...
2023-06-01 13:08:52,946+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:109 {'msg': "The task includes an
option with an undefined variable. The error was: 'local_vm_ip' is
undefined. 'local_vm_ip' is undefined\n\nThe error appears to be in
'/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/sync_on_engine_machine.yml':
line 2, column 3, but may\nbe elsewhere in the file depending on the
exact syntax problem.\n\nThe offending line appears to
be:\n\n---\n- name: Set the name for add_host\n ^
here\n", '_ansible_no_log': False}
2023-06-01 13:08:53,047+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:109 ignored: [localhost]: FAILED! =>
{"msg": "The task includes an option with an undefined variable. The
error was: 'local_vm_ip' is undefined. 'local_vm_ip' is
undefined\n\nThe error appears to be in
'/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/sync_on_engine_machine.yml':
line 2, column 3, but may\nbe elsewhere in the file depending on the
exact syntax problem.\n\nThe offending line appears to
be:\n\n---\n- name: Set the name for add_host\n ^ here\n"}
...
2023-06-01 13:09:16,818+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:109 {'changed': True,
'stdout': '', 'stderr': "error: Failed to destroy pool
9ab14cca-78e0-4a8e-8a1b-00e98c300208\nerror: Requested
operation is not valid: storage pool
'9ab14cca-78e0-4a8e-8a1b-00e98c300208' is not active", 'rc':
1, 'cmd': ['virsh', '-c',
'qemu:///system?authfile=/etc/ovirt-hosted-engine/virsh_auth.conf',
'pool-destroy', '9ab14cca-78e0-4a8e-8a1b-00e98c300208'],
'start': '2023-06-01 13:09:16.614978', 'end': '2023-06-01
13:09:16.672260', 'delta': '0:00:00.057282', 'msg':
'non-zero return code', 'invocation': {'module_args':
{'_raw_params': 'virsh -c
qemu:///system?authfile=/etc/ovirt-hosted-engine/virsh_auth.conf
pool-destroy 9ab14cca-78e0-4a8e-8a1b-00e98c300208',
'_uses_shell': False, 'stdin_add_newline': True,
'strip_empty_ends': True, 'argv': None, 'chdir': None,
'executable': None, 'creates': None, 'removes': None,
'stdin': None}}, 'stdout_lines': [], 'stderr_lines':
['error: Failed to destroy pool
9ab14cca-78e0-4a8e-8a1b-00e98c300208', "error: Requested
operation is not valid: storage pool
'9ab14cca-78e0-4a8e-8a1b-00e98c300208' is not active"],
'_ansible_no_log': None}
2023-06-01 13:09:16,918+0100 DEBUG
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:109 ignored: [localhost]:
FAILED! => {"changed": true, "cmd": ["virsh", "-c",
"qemu:///system?authfile=/etc/ovirt-hosted-engine/virsh_auth.conf",
"pool-destroy", "9ab14cca-78e0-4a8e-8a1b-00e98c300208"],
"delta": "0:00:00.057282", "end": "2023-06-01
13:09:16.672260", "msg": "non-zero return code", "rc": 1,
"start": "2023-06-01 13:09:16.614978", "stderr": "error:
Failed to destroy pool
9ab14cca-78e0-4a8e-8a1b-00e98c300208\nerror: Requested
operation is not valid: storage pool
'9ab14cca-78e0-4a8e-8a1b-00e98c300208' is not active",
"stderr_lines": ["error: Failed to destroy pool
9ab14cca-78e0-4a8e-8a1b-00e98c300208", "error: Requested
operation is not valid: storage pool
'9ab14cca-78e0-4a8e-8a1b-00e98c300208' is not active"],
"stdout": "", "stdout_lines": []}
2023-06-01 13:09:17,023+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 TASK
[ovirt.ovirt.hosted_engine_setup : Undefine local
storage-pool 9ab14cca-78e0-4a8e-8a1b-00e98c300208]
2023-06-01 13:09:17,224+0100 INFO
otopi.ovirt_hosted_engine_setup.ansible_utils
ansible_utils._process_output:115 changed: [localhost]
...
I think the last two snippet blocks might be the most useful
- but what do they say?
When the installer is finished, the aforementioned pool no
longer exists; however, there is:
-> $ virsh pool-list | grep
'9ab14cca-78e0-4a8e-8a1b-00e98c300208'
9ab14cca-78e0-4a8e-8a1b-00e98c300208-1 active yes
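If that leftover pool needs cleaning up, a sketch using the same connection URI as in the log (pool-destroy should work here because, unlike during setup, the pool is active; pool-undefine then removes its definition - note the -1 suffix):

virsh -c 'qemu:///system?authfile=/etc/ovirt-hosted-engine/virsh_auth.conf' \
  pool-destroy 9ab14cca-78e0-4a8e-8a1b-00e98c300208-1
virsh -c 'qemu:///system?authfile=/etc/ovirt-hosted-engine/virsh_auth.conf' \
  pool-undefine 9ab14cca-78e0-4a8e-8a1b-00e98c300208-1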
The engine VM remains up & running.
How far from a successful deployment is that, & what would
be the advised next step?
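One place to look next (a sketch - 192.168.1.209 is the engine VM address from the failed task above, and the log name is the one engine-setup printed): the actual Keycloak error should be recorded in the engine-setup log on the engine VM, around the kk_cli.sh invocation:

ssh root@192.168.1.209 \
  grep -B3 -A15 'kk_cli' /var/log/ovirt-engine/setup/ovirt-engine-setup-20230601125911-448xvj.log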
many thanks, L.
1 year, 9 months