Zanata access request
by Vladimir Talankin
Hello!
I'm writing on behalf of the HOSTVM support team. HOSTVM is a
virtualization platform based on oVirt. To help improve the UI
translations, could you please provide us with account details for Zanata
(https://zanata.ovirt.org/)? If any additional information is needed, just
let me know. Thank you.
Regards,
Vladimir
2 years
URGENT HELP NEEDED!! Hosts are in non-operational state.
by eugene@knorydev.com
Oct 25, 2022, 4:10:01 PM - Failed to connect Host <hostname> to Storage Pool <storage pool>
Oct 25, 2022, 4:10:01 PM - Host <hostname> cannot access the Storage Domain(s) VM_Data attached to the <data center>. Setting Host state to Non-Operational.
Oct 25, 2022, 4:10:01 PM - Host <hostname> reports about one of the Active Storage Domains as Problematic.
Oct 25, 2022, 4:09:52 PM - Activation of host <hostname> initiated by <user>.
Hi All,
This morning I noticed that my 1st and 2nd hosts are in a non-operational state and that my storage pool isn't being mounted; whenever I press Activate under Management, the errors above appear.
I have multiple VMs running and can't bring them back up if they go down, and this is my company's production server.
Any advice on how to investigate this, or which logs to check and provide, to resolve the issue?
The person who deployed this setup is no longer with my company, and I have no one else to ask for help with this issue.
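For reference, a minimal checklist of places to start, assuming a default oVirt installation (paths are the standard log locations; nothing here changes state):
~~~
# On the engine machine - look for the storage connection errors:
tail -n 200 /var/log/ovirt-engine/engine.log

# On each non-operational host - VDSM is what actually mounts the storage:
tail -n 200 /var/log/vdsm/vdsm.log

# Check whether the storage domain is actually mounted on the host;
# oVirt mounts storage domains under /rhev/data-center/:
mount | grep /rhev/data-center
~~~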
2 years
ovirt 4.5.3 and missing python3.9 packages
by Nathanaël Blanchet
Hi,
When running hosted-engine --deploy with the el8 ovirt-node, ansible 2.13 uses python3.9 by default.
But the installation fails twice because of the missing netaddr and jmespath modules for python3.9. I suppose the issue is not present on el9, where python 3.9 is the default.
The workaround is to install those two packages with pip:
dnf install python39-pip --enablerepo appstream
pip3.9 install netaddr jmespath
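A quick way to confirm the modules are now visible to the interpreter (a minimal check, assuming python3.9 is the interpreter Ansible picked, as described above):
~~~
# Should print "ok" if both modules import cleanly under python3.9
python3.9 -c 'import netaddr, jmespath; print("ok")'
~~~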
Why not install those two packages by default?
2 years
Re: admin user locked out (OVN invalid_grant)
by Mounir MOHELLEBI
Hi there,
Can you tell me exactly which command you executed to run /usr/share/ovirt-engine/bin/ovirt-register-sso-client-tool.sh with Client Id: ovirt-provider-ovn and Client CA Certificate File Location: /etc/pki/ovirt-engine/certs/engine.cer?
I executed the command "/usr/share/ovirt-engine/bin/ovirt-register-sso-client-tool.sh ovirt-provider-ovn", but I obtained this output:
Picked up JAVA_TOOL_OPTIONS: -Dcom.redhat.fips=false
Oct 25, 2022 8:17:39 AM org.ovirt.engine.ssoreg.core.SsoRegistrationToolExecutor main
INFO: =========================================================================
Oct 25, 2022 8:17:39 AM org.ovirt.engine.ssoreg.core.SsoRegistrationToolExecutor main
INFO: ================== oVirt Sso Client Registration Tool ===================
Oct 25, 2022 8:17:39 AM org.ovirt.engine.ssoreg.core.SsoRegistrationToolExecutor main
INFO: =========================================================================
Oct 25, 2022 8:17:39 AM org.ovirt.engine.ssoreg.core.SsoRegistrationToolExecutor main
SEVERE: Parameter required but not found: client-id
Can you tell me how to execute this command properly, and how I can find the correct client-id?
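Based on the SEVERE line, the tool seems to expect named parameters rather than a positional argument. A hedged guess at the invocation, not a verified one: the --client-id flag name is taken straight from the error message, while the certificate flag name is an assumption inferred from the documentation quoted above, so please check it against the tool's usage output:
~~~
/usr/share/ovirt-engine/bin/ovirt-register-sso-client-tool.sh \
    --client-id=ovirt-provider-ovn \
    --cert-file=/etc/pki/ovirt-engine/certs/engine.cer  # flag name assumed, not verified
~~~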
Best regards,
MOHELLEBI Mounir
Systems Engineer | CIAR Assurances Spa
Phone: +213 (0) 21 692 275 | 021 692 527 | 021 691 597 | 021 694 923
Mobile: +213 561 604 771
Site: https://www.laciar.com
Email: m.mohellebi@laciar.com
Address: 11, Chemin des Crêtes, Paradou, Hydra, Alger, Algérie
2 years
Local (Deployment) VM Can't Reach "centos-ceph-pacific" Repo
by Matthew J Black
Hi All,
OK, new issue: :-(
If I'm reading things right, the local (deployment) VM can't reach the centos-ceph-pacific repo.
The repo is installed on the host machine (along with all of the relevant dependent repos).
I thought the local 192.168.222.0/24 network was NATed out through the virbr0 virtual bridge - am I wrong about this (i.e., do we need to update/change our routing tables or something else)?
Here is the relevant part of the log:
~~~
2022-10-11 16:22:14,749+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.engine_setup : Install oVirt Engine package]
2022-10-11 16:26:03,643+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 {'results': [], 'rc': 1, 'msg': "Failed to download metadata for repo 'centos-ceph-pacific': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", 'invocation': {'module_args': {'name': ['ovirt-engine'], 'state': 'present', 'allow_downgrade': False, 'autoremove': False, 'bugfix': False, 'cacheonly': False, 'disable_gpg_check': False, 'disable_plugin': [], 'disablerepo': [], 'download_only': False, 'enable_plugin': [], 'enablerepo': [], 'exclude': [], 'installroot': '/', 'install_repoquery': True, 'install_weak_deps': True, 'security': False, 'skip_broken': False, 'update_cache': False, 'update_only': False, 'validate_certs': True, 'lock_timeout': 30, 'allowerasing': False, 'nobest': False, 'conf_file': None, 'disable_excludes': None, 'download_dir': None, 'list': None, 'releasever': None}}, '_ansible_no_log': False, 'changed': False, '_ansible_delegated_vars': {'ansible_host': '192.168.222.77', 'ansible_port': None, 'ansible_user': 'root', 'ansible_connection': 'smart'}}
2022-10-11 16:26:03,744+1100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:113 fatal: [localhost -> 192.168.222.77]: FAILED! => {"changed": false, "msg": "Failed to download metadata for repo 'centos-ceph-pacific': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "results": []}
2022-10-11 16:26:04,045+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Sync on engine machine]
2022-10-11 16:26:04,947+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> 192.168.222.77]
2022-10-11 16:26:05,449+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Set destination directory path]
2022-10-11 16:26:05,950+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:06,352+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Create destination directory]
2022-10-11 16:26:06,953+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> localhost]
2022-10-11 16:26:07,355+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : include_tasks]
2022-10-11 16:26:07,856+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost]
2022-10-11 16:26:08,357+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Find the local appliance image]
2022-10-11 16:26:08,959+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:09,460+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Set local_vm_disk_path]
2022-10-11 16:26:09,862+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:10,363+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Give the vm time to flush dirty buffers]
2022-10-11 16:26:20,986+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 ok: [localhost -> localhost]
2022-10-11 16:26:21,388+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Copy engine logs]
2022-10-11 16:26:27,901+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost]
2022-10-11 16:26:28,403+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Change ownership of copied engine logs]
2022-10-11 16:26:29,005+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 changed: [localhost -> localhost]
2022-10-11 16:26:29,506+1100 INFO otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:115 TASK [ovirt.ovirt.hosted_engine_setup : Notify the user about a failure]
2022-10-11 16:26:29,908+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 {'msg': 'There was a failure deploying the engine on the local engine VM. The system may not be provisioned according to the playbook results: please check the logs for the issue, fix accordingly or re-deploy from scratch.\n', '_ansible_no_log': False, 'changed': False}
2022-10-11 16:26:30,008+1100 ERROR otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:113 fatal: [localhost]: FAILED! => {"changed": false, "msg": "There was a failure deploying the engine on the local engine VM. The system may not be provisioned according to the playbook results: please check the logs for the issue, fix accordingly or re-deploy from scratch.\n"}
2022-10-11 16:26:30,410+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils._process_output:109 PLAY RECAP [localhost] : ok: 161 changed: 59 unreachable: 0 skipped: 80 failed: 1
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:226 ansible-playbook rc: 2
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:233 ansible-playbook stdout:
2022-10-11 16:26:30,510+1100 DEBUG otopi.ovirt_hosted_engine_setup.ansible_utils ansible_utils.run:236 ansible-playbook stderr:
2022-10-11 16:26:30,510+1100 DEBUG otopi.plugins.gr_he_ansiblesetup.core.misc misc._closeup:475 {'otopi_host_net': {'ansible_facts': {'otopi_host_net': ['bond0.40', 'ens1', 'bond0.20']}, '_ansible_no_log': False, 'changed': False}, 'otopi_localvm_dir': {'changed': True, 'path': '/var/tmp/localvmfq67djtv', 'uid': 0, 'gid': 0, 'owner': 'root', 'group': 'root', 'mode': '0700', 'state': 'directory', 'secontext': 'unconfined_u:object_r:user_tmp_t:s0', 'size': 6, 'invocation': {'module_args': {'state': 'directory', 'path': '/var/tmp', 'prefix': 'localvm', 'suffix': ''}}, '_ansible_no_log': False}, 'otopi_appliance_disk_size': {'ansible_facts': {'virtual_size': '53689188352'}, '_ansible_no_log': False, 'changed': False}, 'ansible-playbook_rc': 2}
2022-10-11 16:26:30,511+1100 DEBUG otopi.context context._executeMethod:145 method exception
Traceback (most recent call last):
File "/usr/lib/python3.6/site-packages/otopi/context.py", line 132, in _executeMethod
method['method']()
File "/usr/share/ovirt-hosted-engine-setup/scripts/../plugins/gr-he-ansiblesetup/core/misc.py", line 509, in _closeup
raise RuntimeError(_('Failed executing ansible-playbook'))
RuntimeError: Failed executing ansible-playbook
2022-10-11 16:26:30,511+1100 ERROR otopi.context context._executeMethod:154 Failed to execute stage 'Closing up': Failed executing ansible-playbook
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:765 ENVIRONMENT DUMP - BEGIN
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/error=bool:'True'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV BASE/exceptionInfo=list:'[(<class 'RuntimeError'>, RuntimeError('Failed executing ansible-playbook',), <traceback object at 0x7efe2a39fa08>)]'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_CORE/localVMDir=str:'/var/tmp/localvmfq67djtv'
2022-10-11 16:26:30,512+1100 DEBUG otopi.context context.dumpEnvironment:775 ENV OVEHOSTED_STORAGE/ovfSizeGB=int:'51'
2022-10-11 16:26:30,513+1100 DEBUG otopi.context context.dumpEnvironment:779 ENVIRONMENT DUMP - END
~~~
Anyone got any ideas? Thanks,
Dulux-Oz
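One way to narrow this down (a diagnostic sketch, not a verified fix: the VM address 192.168.222.77 and the repo id are taken from the log above, and the libvirt network name may differ on your system):
~~~
# On the host: list libvirt networks and confirm the deployment network
# is NAT-forwarded ("default" is an assumption - check virsh net-list)
virsh net-list --all
virsh net-dumpxml default | grep -A2 '<forward'

# From inside the local VM: confirm DNS resolution and outbound HTTP work at all
ssh root@192.168.222.77 "curl -sI https://www.centos.org | head -1"

# Ask dnf which mirror URL it is actually failing on
ssh root@192.168.222.77 "dnf -v makecache --disablerepo='*' --enablerepo=centos-ceph-pacific"
~~~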
2 years
Failed to delete snapshot
by Jirka Simon
Hello there.
We're having trouble with one VM and its snapshot.
Here is a short excerpt from engine.log:
2022-08-23 09:02:38,735+02 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (EE-ManagedThreadFactory-engine-Thread-1265784) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] CommandAsyncTask::HandleEndActionResult [within thread]: Removing CommandMultiAsyncTasks object for entity '5842cafc-60d4-4131-8da1-8a909123f08c'
2022-08-23 09:02:41,435+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-68) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command id: 'ead3fbba-7439-4337-8aaf-1d6cf65bbb15' failed child command status for step 'REDUCE_IMAGE'
2022-08-23 09:02:41,435+02 INFO [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommandCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-68) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command 'RemoveSnapshotSingleDiskLive' id: 'ead3fbba-7439-4337-8aaf-1d6cf65bbb15' child commands '[5b33d970-6ea0-4b8b-bc31-607afd8af1ca, 185d3fec-8c76-44d3-a0a1-fe8fff331354, 308baf9d-1fc5-43a1-a624-b5f08f7bec1b, f11f3550-4356-417e-af10-a3afb2051dec, 5842cafc-60d4-4131-8da1-8a909123f08c]' executions were completed, status 'FAILED'
2022-08-23 09:02:42,445+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Merging of snapshot '6634db4e-23d8-4849-a008-864f15d28c3d' images 'd24a9199-741b-49cb-96d2-0d76fcd21f48'..'48af3301-0cbb-4a6e-97d1-7299e7de883f' failed. Images have been marked illegal and can no longer be previewed or reverted to. Please retry Live Merge on the snapshot to complete the operation.
2022-08-23 09:02:42,451+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Ending command 'org.ovirt.engine.core.bll.snapshots.RemoveSnapshotSingleDiskLiveCommand' with failure.
2022-08-23 09:02:42,457+02 INFO [org.ovirt.engine.core.bll.ConcurrentChildCommandsExecutionCallback] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-79) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Command 'RemoveSnapshot' id: '73340ac8-07d7-4de2-bfeb-8062fa1e8cfd' child commands '[ead3fbba-7439-4337-8aaf-1d6cf65bbb15]' executions were completed, status 'FAILED'
2022-08-23 09:02:43,470+02 ERROR [org.ovirt.engine.core.bll.snapshots.RemoveSnapshotCommand] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-22) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] Ending command 'org.ovirt.engine.core.bll.snapshots.RemoveSnapshotCommand' with failure.
2022-08-23 09:02:43,490+02 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedScheduledExecutorService-engineScheduledThreadPool-Thread-22) [4d79272b-ffac-4b2b-a987-8c19577cd8c4] EVENT_ID: USER_REMOVE_SNAPSHOT_FINISHED_FAILURE(357), Failed to delete snapshot 'vProtect 2022-08-05 22:33:44.088143' for VM 'web1.wiki.prod.hq.sldev.cz-2'.
The snapshot can't be deleted, and the VM can't be cloned.
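The engine's own suggestion in the log is to retry the Live Merge, i.e., delete the same snapshot again. To see whether the images really are flagged illegal, their status can be read from the engine database (a diagnostic sketch: it assumes shell access on the engine machine and uses the image UUIDs from the log above; imagestatus 4 means ILLEGAL in the engine schema):
~~~
# Read-only check of the two images named in the failed merge
/usr/share/ovirt-engine/dbscripts/engine-psql.sh -c \
    "SELECT image_guid, imagestatus FROM images
     WHERE image_guid IN ('d24a9199-741b-49cb-96d2-0d76fcd21f48',
                          '48af3301-0cbb-4a6e-97d1-7299e7de883f');"
~~~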
Thank you for any help.
Jirka
2 years