oVirt 4.5 deploy failed
by Selçuk N
Hello,
I'm trying to install oVirt 4.5 and I get the following error:
FAILED! => {"attempts": 50, "changed": false, "msg": "Error during SSO
authentication access_denied : Cannot authenticate user Invalid user
credentials."}
The node and the hosted-engine VM use the same password, and it works without
any problem.
Here are the detailed logs:
What am I doing wrong? Thank you. Regards
2023-05-18 18:49:25,924+0000 DEBUG ansible on_any args localhost TASK:
ovirt.ovirt.hosted_engine_setup : include_tasks kwargs
2023-05-18 18:49:26,601+0000 INFO ansible ok {'status': 'OK',
'ansible_type': 'task', 'ansible_playbook':
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml',
'ansible_host': 'localhost', 'ansible_task': '', 'task_duration': 1}
2023-05-18 18:49:26,601+0000 DEBUG ansible on_any args
<ansible.executor.task_result.TaskResult object at 0x7f9707aa5580> kwargs
2023-05-18 18:49:26,644+0000 DEBUG ansible on_any args
/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/auth_sso.yml
(args={} vars={}): [localhost] kwargs
2023-05-18 18:49:27,317+0000 INFO ansible task start {'status': 'OK',
'ansible_type': 'task', 'ansible_playbook':
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml',
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Obtain SSO token using
username/password credentials'}
2023-05-18 18:49:27,317+0000 DEBUG ansible on_any args TASK:
ovirt.ovirt.hosted_engine_setup : Obtain SSO token using username/password
credentials kwargs is_conditional:False
2023-05-18 18:49:27,318+0000 DEBUG ansible on_any args localhost TASK:
ovirt.ovirt.hosted_engine_setup : Obtain SSO token using username/password
credentials kwargs
2023-05-18 18:49:29,447+0000 DEBUG ansible on_any args
<ansible.executor.task_result.TaskResult object at 0x7f970781b430> kwargs
...
2023-05-18 18:58:24,784+0000 DEBUG var changed: host "localhost" var
"ansible_failed_task" type "<class 'dict'>" value: "{
"action": "ovirt_auth",
"any_errors_fatal": false,
"args": {
"_ansible_check_mode": false,
"_ansible_debug": false,
"_ansible_diff": false,
"_ansible_keep_remote_files": false,
"_ansible_module_name": "ovirt_auth",
"_ansible_no_log": false,
"_ansible_remote_tmp": "~/.ansible/tmp",
"_ansible_selinux_special_fs": [
"fuse",
"nfs",
"vboxsf",
"ramfs",
"9p",
"vfat"
],
"_ansible_shell_executable": "/bin/sh",
"_ansible_socket": null,
"_ansible_string_conversion_action": "warn",
"_ansible_syslog_facility": "LOG_USER",
"_ansible_tmpdir":
"/root/.ansible/tmp/ansible-tmp-1684436303.0466125-36573-187227954552032/",
"_ansible_verbosity": 0,
"_ansible_version": "2.13.5",
"insecure": true
},
"async": 0,
"async_val": 0,
"become": false,
"become_exe": null,
"become_flags": null,
"become_method": "sudo",
"become_user": null,
"changed_when": [],
"check_mode": false,
"collections": [
"ovirt.ovirt",
"ansible.builtin"
],
"connection": "local",
"debugger": null,
"delay": 10,
"delegate_facts": null,
"delegate_to": null,
"diff": false,
"environment": [
{
"OVIRT_PASSWORD": "**FILTERED**",
"OVIRT_URL": "https://eng.xxxx.net/ovirt-engine/api",
"OVIRT_USERNAME": "admin@internal"
}
],
"failed_when": [],
"finalized": true,
"ignore_errors": null,
"ignore_unreachable": null,
"loop": null,
"loop_control": null,
"loop_with": null,
"module_defaults": [],
"name": "Obtain SSO token using username/password credentials",
"no_log": null,
"notify": null,
"poll": 15,
"port": null,
"register": "ovirt_sso_auth",
"remote_user": null,
"retries": 50,
"run_once": null,
"squashed": true,
"tags": [
"never",
"bootstrap_local_vm",
"never"
],
"throttle": 0,
"timeout": 0,
"until": [
"ovirt_sso_auth is succeeded"
],
"uuid": "1866daab-9d24-cf71-0985-00000000181c",
"vars": {},
"when": []
}"
2023-05-18 18:58:24,784+0000 DEBUG var changed: host "localhost" var
"ansible_failed_result" type "<class 'dict'>" value: "{
"_ansible_no_log": null,
"_ansible_parsed": true,
"attempts": 50,
"changed": false,
"exception": "Traceback (most recent call last):\n File
\"/tmp/ansible_ovirt_auth_payload_x_h__rn9/ansible_ovirt_auth_payload.zip/ansible_collections/ovirt/ovirt/plugins/modules/ovirt_auth.py\",
line 287, in main\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 382, in
authenticate\n self._sso_token = self._get_access_token()\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 627, in
_get_access_token\n sso_error[1]\novirtsdk4.AuthError: Error during SSO
authentication access_denied : Cannot authenticate user Invalid user
credentials.\n",
"failed": true,
"invocation": {
"module_args": {
"ca_file": null,
"compress": true,
"headers": null,
"hostname": null,
"insecure": true,
"kerberos": false,
"ovirt_auth": null,
"password": null,
"state": "present",
"timeout": 0,
"token": null,
"url": null,
"username": null
}
},
"msg": "Error during SSO authentication access_denied : Cannot
authenticate user Invalid user credentials."
}"
2023-05-18 18:58:24,784+0000 ERROR ansible failed {
"ansible_host": "localhost",
"ansible_playbook":
"/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml",
"ansible_result": {
"_ansible_no_log": null,
"attempts": 50,
"changed": false,
"exception": "Traceback (most recent call last):\n File
\"/tmp/ansible_ovirt_auth_payload_x_h__rn9/ansible_ovirt_auth_payload.zip/ansible_collections/ovirt/ovirt/plugins/modules/ovirt_auth.py\",
line 287, in main\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 382, in
authenticate\n self._sso_token = self._get_access_token()\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 627, in
_get_access_token\n sso_error[1]\novirtsdk4.AuthError: Error during SSO
authentication access_denied : Cannot authenticate user Invalid user
credentials.\n",
"invocation": {
"module_args": {
"ca_file": null,
"compress": true,
"headers": null,
"hostname": null,
"insecure": true,
"kerberos": false,
"ovirt_auth": null,
"password": null,
"state": "present",
"timeout": 0,
"token": null,
"url": null,
"username": null
}
},
"msg": "Error during SSO authentication access_denied : Cannot
authenticate user Invalid user credentials."
},
"ansible_task": "Obtain SSO token using username/password credentials",
"ansible_type": "task",
"status": "FAILED",
"task_duration": 538
}
2023-05-18 18:58:24,784+0000 DEBUG ansible on_any args
<ansible.executor.task_result.TaskResult object at 0x7f97073f1520> kwargs
ignore_errors:None
2023-05-18 18:58:25,487+0000 DEBUG var changed: host "localhost" var
"ovirt_sso_auth" type "<class 'dict'>" value: "{
"attempts": 50,
"changed": false,
"exception": "Traceback (most recent call last):\n File
\"/tmp/ansible_ovirt_auth_payload_x_h__rn9/ansible_ovirt_auth_payload.zip/ansible_collections/ovirt/ovirt/plugins/modules/ovirt_auth.py\",
line 287, in main\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 382, in
authenticate\n self._sso_token = self._get_access_token()\n File
\"/usr/lib64/python3.6/site-packages/ovirtsdk4/__init__.py\", line 627, in
_get_access_token\n sso_error[1]\novirtsdk4.AuthError: Error during SSO
authentication access_denied : Cannot authenticate user Invalid user
credentials.\n",
"failed": true,
"msg": "Error during SSO authentication access_denied : Cannot
authenticate user Invalid user credentials."
}"
2023-05-18 18:58:25,487+0000 INFO ansible task start {'status': 'OK',
'ansible_type': 'task', 'ansible_playbook':
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml',
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Sync on engine machine'}
2023-05-18 18:58:25,487+0000 DEBUG ansible on_any args TASK:
ovirt.ovirt.hosted_engine_setup : Sync on engine machine kwargs
is_conditional:False
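When the deploy loops on this error for all 50 attempts, it can help to rule out the credentials themselves by making the same SSO request the setup makes, directly from the host while the local engine VM is up. This is only a diagnostic sketch; the FQDN below is the placeholder from the log above, and the password is whatever was given to the setup:

```shell
# Ask the engine's SSO service for a token with the deploy credentials.
# A reply containing "access_token" means the credentials are accepted;
# "access_denied" reproduces the failure above.
sso_check() {  # usage: sso_check <engine-fqdn> <admin-password>
  curl -ks --data-urlencode "grant_type=password" \
       --data-urlencode "username=admin@internal" \
       --data-urlencode "password=$2" \
       --data-urlencode "scope=ovirt-app-api" \
       "https://$1/ovirt-engine/sso/oauth/token"
}
# e.g. sso_check eng.xxxx.net 'the-engine-admin-password'
```

If this also returns access_denied, the problem is the admin@internal password the appliance was provisioned with, not the deploy flow itself.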
12 months
oVirt Node
by skhurtsilava@cellfie.ge
Hello guys,
I installed oVirt Node 4.4 and I want to deploy the Hosted Engine, but I get this error:
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Ensure the resolved address resolves only on the selected interface]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "hostname 'ovirt.bee.moitel.local' doesn't uniquely match the interface 'ens192' selected for the management bridge; it matches also interface with IP ['fe80::9a5b:2039:fe49:5252', '192.168.222.1', 'fd00:1234:5678:900::1']. Please make sure that the hostname got from the interface for the management network resolves only there.\n"}
How can I fix this error?
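One way to see the conflict is to diff the addresses the hostname resolves to against the addresses actually configured on the interface. A sketch, using the hostname and interface from the error above:

```shell
# Print every address the hostname resolves to that is NOT configured on the
# given interface; the deploy requires this list to be empty.
resolves_only_on() {  # usage: resolves_only_on <hostname> <iface>
  comm -23 \
    <(getent ahosts "$1" | awk '{print $1}' | sort -u) \
    <(ip -o addr show dev "$2" | awk '{print $4}' | cut -d/ -f1 | sort -u)
}
# e.g. resolves_only_on ovirt.bee.moitel.local ens192
# Every address it prints (here 192.168.222.1 and the fe80:/fd00: ones) has to
# stop resolving for that hostname - fix DNS or /etc/hosts accordingly.
```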
12 months
Re: Basic authentication to REST API not working 4.5.4
by kishorekumar.goli@gmail.com
Thanks, Alexei, for the response.
I see the httpd configuration has been updated to use OAuth; the following is in /etc/httpd/conf.d/internalsso-openidc.conf:
<LocationMatch ^/ovirt-engine/api($|/)>
AuthType oauth20
Require valid-user
</LocationMatch>
I don't see any release notes about the removal of basic authentication in 4.5.x, so I wanted to know whether this is mentioned anywhere in the documentation.
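For completeness: with AuthType oauth20 on the API path, clients have to obtain a token from the engine's SSO endpoint and send it as a bearer header instead of `-u user:pass`. A sketch (the host name here is an assumption; substitute your own engine):

```shell
# Fetch an OAuth token from the engine SSO service, then call the API with a
# bearer header instead of HTTP basic auth.
get_token() {  # usage: get_token <engine-host> <password>
  curl -ks --data-urlencode "grant_type=password" \
       --data-urlencode "username=admin@internal" \
       --data-urlencode "password=$2" \
       --data-urlencode "scope=ovirt-app-api" \
       "https://$1/ovirt-engine/sso/oauth/token" \
    | sed -n 's/.*"access_token" *: *"\([^"]*\)".*/\1/p'
}
# Usage:
#   token=$(get_token engine.example.com 'admin-password')
#   curl -ks -H "Authorization: Bearer ${token}" -H "Accept: application/xml" \
#        "https://engine.example.com/ovirt-engine/api/hosts/"
```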
12 months
Basic authentication to REST API not working 4.5.4
by kishorekumar.goli@gmail.com
We are facing an issue while using basic authentication: we get a 401 Unauthorized error. It was working in previous versions.
Parameters used:
curl -vvk -u "admin:admin" -H "Content-type: application/xml" -X GET https://<ovirt_gui>/ovirt-engine/api/hosts/
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<title>401 Unauthorized</title>
</head>
<body>
<h1>Unauthorized</h1>
<p>This server could not verify that you
are authorized to access the document
requested. Either you supplied the wrong
credentials (e.g., bad password), or your
browser doesn't understand how to supply
the credentials required.</p>
</body>
</html>
1 year
Failed to read or parse '/etc/pki/ovirt-engine/keys/engine.p12'
by Frank Wall
Hi,
I was trying to restore an oVirt Engine backup into a new Hosted Engine
appliance (as part of an upgrade), but this failed with the following
error:
--== PKI CONFIGURATION ==--
[WARNING] Failed to read or parse
'/etc/pki/ovirt-engine/keys/engine.p12'
Perhaps it was changed since last Setup.
Error was:
Error outputting keys and certificates
80EBCC44677F0000:error:0308010C:digital envelope
routines:inner_evp_generic_fetch:unsupported:crypto/evp/evp_fetch.c:373:Global
default library context, Algorithm (RC2-40-CBC : 0)
It looks like this is related to OpenSSL requiring legacy mode
to read the old Engine cert/key.
Is there any way to work around this? Or would it be possible
to repackage the existing PKCS#12 file with new encryption (on
the old Engine)?
Regards
- Frank
1 year
[ansible] attach vdisk to VM
by Pietro Pesce
Hello everyone,
I created a playbook that creates vdisks (from direct LUNs) and attaches them to a VM; the first block works. Now I want to attach the created vdisks to a second VM. How can I do that?
---
# Add Fibre Channel disks
- name: Create disk
  ovirt.ovirt.ovirt_disk:
    auth: "{{ ovirt_auth }}"
    name: "{{ item.0 }}"
    host: "{{ host }}"
    shareable: true
    interface: virtio_scsi
    vm_name: "{{ hostname }}"
    scsi_passthrough: disabled
    logical_unit:
      id: "{{ item.1 }}"
      storage_type: fcp
  loop: "{{ disk_name | zip(lun) | list }}"

## Attach the already created, shareable disks to the second node by name;
## "state: attached" attaches an existing disk instead of creating a new one.
#- name: Attach disk to second node
#  ovirt.ovirt.ovirt_disk:
#    auth: "{{ ovirt_auth }}"
#    vm_name: "{{ hostname2 }}"
#    name: "{{ item }}"
#    interface: virtio_scsi
#    state: attached
#  loop: "{{ disk_name }}"
thanks
1 year
engine setup fails: error: The system may not be provisioned according to the playbook results
by neeldey427@gmail.com
I'm trying to set up the engine, but I keep getting the same error:
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Remove temporary entry in /etc/hosts for the local VM]
[ INFO ] changed: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : include_tasks]
[ INFO ] ok: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Destroy local storage-pool localvm3a2r5z0y]
[ INFO ] changed: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Undefine local storage-pool localvm3a2r5z0y]
[ INFO ] changed: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Destroy local storage-pool 9ef860a6-ee88-4aa6-94ac-a429a90ebec8]
[ INFO ] changed: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Undefine local storage-pool 9ef860a6-ee88-4aa6-94ac-a429a90ebec8]
[ INFO ] changed: [localhost]
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Notify the user about a failure]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "The system may not be provisioned according to the playbook results: please check the logs for the issue, fix accordingly or re-deploy from scratch.\n"}
Please let me know if you need more information in this regard or contents from any of the log files.
Any & all suggestions on how to fix/troubleshoot this are much appreciated.
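The message in that last task is generic; the actual failure is reported by an earlier task in the setup log. A small sketch for surfacing it (the path below is the default hosted-engine-setup log location):

```shell
# Print the last error/fatal task results, with a little surrounding context,
# from the hosted-engine-setup logs (or from a directory given as $1).
he_errors() {
  grep -hB2 -A4 -E 'ERROR|fatal:' \
    "${1:-/var/log/ovirt-hosted-engine-setup}"/ovirt-hosted-engine-setup-*.log \
    2>/dev/null | tail -n 60
}
# e.g. he_errors   # run on the failing host
```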
1 year
engine setup fails: error creating bridge interface virbr0: File exists - ?
by lejeczek
Hi guys.
I'm trying to set up the engine on the latest stable oVirt
node (in a VM), so a clean, vanilla-default system.
-> $ hosted-engine --deploy --4
...
[ INFO ] TASK [ovirt.ovirt.hosted_engine_setup : Activate
default libvirt network]
[ ERROR ] fatal: [localhost]: FAILED! => {"changed": false,
"cmd": ["virsh", "net-start", "default"], "delta":
"0:00:00.042134", "end": "2023-05-11 11:08:59.248405",
"msg": "non-zero return code", "rc": 1, "start": "2023-05-11
11:08:59.206271", "stderr": "error: Failed to start network
default\nerror: error creating bridge interface virbr0: File
exists", "stderr_lines": ["error: Failed to start network
default", "error: error creating bridge interface virbr0:
File exists"], "stdout": "", "stdout_lines": []}
[ ERROR ] Failed to execute stage 'Closing up': Failed
getting local_vm_dir
...
Any & all suggestions on how to fix/troubleshoot this are
much appreciated.
many thanks, L.
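For what it's worth, "error creating bridge interface virbr0: File exists" usually means a stale virbr0 link survived a previous deploy attempt, so libvirt cannot recreate it. A cleanup sketch (network and interface names taken from the error output above):

```shell
# Remove the leftover bridge so "virsh net-start default" can recreate it.
virbr0_cleanup() {
  virsh net-destroy default 2>/dev/null || true  # stop the net if half-started
  ip link delete virbr0 2>/dev/null || true      # drop the stale bridge device
  virsh net-start default                        # should now succeed
}
# Run virbr0_cleanup on the node, then retry: hosted-engine --deploy --4
```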
1 year