OK, after some research, I think that despite answers like this:
https://stackoverflow.com/questions/36668756/ansible-remote-user-vs-ansib...
ansible_user and remote_user may well behave differently:
https://docs.ansible.com/ansible/latest/user_guide/become.html#setting-en...
This would suggest that ansible_user is in fact overriding the task-local settings,
so I'm trying the same again, but using remote_user rather than ansible_user to see what
happens.
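For reference, a minimal sketch of what I'm about to test - the same play as below, but
with remote_user at play level instead of relying on ansible_user in the inventory:

- name: Install oVirt Hosted Engine
  hosts: virthyp04.virt.in.bmrc.ox.ac.uk
  remote_user: root
  roles:
    - ovirt.hosted_engine_setup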
Will follow up with results.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 15:56, Callum Smith
<callum@well.ox.ac.uk> wrote:
Re-running the same config sorted this error... though we're back here:
- Clean NFS
- Task run as normal user
- Playbook:
  - name: Install oVirt Hosted Engine
    hosts: virthyp04.virt.in.bmrc.ox.ac.uk
    roles:
      - ovirt.hosted_engine_setup
- No overrides in ansible.cfg
- ansible_user=root set inside /etc/ansible/hosts (inventory line sketched below)
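A minimal sketch of that inventory entry, assuming an ungrouped host (the real file may
group hosts differently):

# /etc/ansible/hosts
virthyp04.virt.in.bmrc.ox.ac.uk ansible_user=root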
I can't see the command actually attempting any sudo for the `dd` - but the playbook
clearly indicates it should be running the command as `vdsm` - is there an
obvious next step?
TASK [ovirt.hosted_engine_setup : Copy configuration archive to storage]
*************************************************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_target_vm/03_hosted_engine_final_tasks.yml:150
<virthyp04.virt.in.bmrc.ox.ac.uk> ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC ssh -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'echo ~root && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0, '/root\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk> ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC ssh -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495 `" && echo
ansible-tmp-1554216709.68-76799008668495="` echo
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495 `" ) && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'ansible-tmp-1554216709.68-76799008668495=/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495\n',
'')
Using module file /opt/ansible/lib/ansible/modules/commands/command.py
<virthyp04.virt.in.bmrc.ox.ac.uk> PUT
/etc/ansible/.ansible/tmp/ansible-local-17322dOuhD0/tmpSZOxcd TO
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/AnsiballZ_command.py
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
'[virthyp04.virt.in.bmrc.ox.ac.uk]'
<virthyp04.virt.in.bmrc.ox.ac.uk> (0, 'sftp> put
/etc/ansible/.ansible/tmp/ansible-local-17322dOuhD0/tmpSZOxcd
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/AnsiballZ_command.py\n',
'')
<virthyp04.virt.in.bmrc.ox.ac.uk> ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC ssh -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'chmod u+x
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/AnsiballZ_command.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0, '', '')
<virthyp04.virt.in.bmrc.ox.ac.uk> ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC ssh -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c -tt virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8
LC_MESSAGES=en_US.UTF-8 /usr/bin/python
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/AnsiballZ_command.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (1, '\r\n{"changed": true,
"end": "2019-04-02 14:51:52.103431", "stdout": "",
"cmd": ["dd", "bs=20480", "count=1",
"oflag=direct",
"if=/var/tmp/localvmeGi13y/aab12b28-0251-45e5-98d9-77df42d4a69c",
"of=/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c"],
"failed": true, "delta": "0:00:00.005616",
"stderr": "dd: failed to open
\\u2018/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c\\u2019:
Permission denied", "rc": 1, "invocation":
{"module_args": {"warn": false, "executable": null,
"_uses_shell": false, "_raw_params": "dd bs=20480 count=1
oflag=direct
if=\\"/var/tmp/localvmeGi13y/aab12b28-0251-45e5-98d9-77df42d4a69c\\"
of=\\"/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c\\"",
"removes": null, "argv": null, "creates": null,
"chdir": null, "stdin": null}}, "start": "2019-04-02
14:51:52.097815", "msg": "non-zero return code"}\r\n',
'Shared connection to virthyp04.virt.in.bmrc.ox.ac.uk closed.\r\n')
<virthyp04.virt.in.bmrc.ox.ac.uk> Failed to connect to the host via ssh: Shared
connection to virthyp04.virt.in.bmrc.ox.ac.uk closed.
<virthyp04.virt.in.bmrc.ox.ac.uk> ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk> SSH: EXEC ssh -C -o ControlMaster=auto -o
ControlPersist=60s -o KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'rm -f -r
/root/.ansible/tmp/ansible-tmp-1554216709.68-76799008668495/ > /dev/null 2>&1
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0, '', '')
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]: FAILED! => {
"changed": true,
"cmd": [
"dd",
"bs=20480",
"count=1",
"oflag=direct",
"if=/var/tmp/localvmeGi13y/aab12b28-0251-45e5-98d9-77df42d4a69c",
"of=/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c"
],
"delta": "0:00:00.005616",
"end": "2019-04-02 14:51:52.103431",
"invocation": {
"module_args": {
"_raw_params": "dd bs=20480 count=1 oflag=direct
if=\"/var/tmp/localvmeGi13y/aab12b28-0251-45e5-98d9-77df42d4a69c\"
of=\"/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c\"",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": false
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2019-04-02 14:51:52.097815",
"stderr": "dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c’:
Permission denied",
"stderr_lines": [
"dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/166033bb-3aa6-4cfa-8cb1-cc4cd64443eb/images/732fbdba-ee38-46b2-83a4-8513a2dfa6c8/aab12b28-0251-45e5-98d9-77df42d4a69c’:
Permission denied"
],
"stdout": "",
"stdout_lines": []
}
to retry, use: --limit @/etc/ansible/playbook/ovirt.retry
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 15:24, Simone Tiraboschi
<stirabos@redhat.com> wrote:
On Tue, Apr 2, 2019 at 4:22 PM Callum Smith
<callum@well.ox.ac.uk> wrote:
An earlier failure this time, but not one that makes sense to me, since the NFS was working
perfectly fine at creation:
TASK [ovirt.hosted_engine_setup : Add NFS storage domain]
****************************************************************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:41
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'echo ~root && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'/root\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264 `" && echo
ansible-tmp-1554214582.05-196228110119264="` echo
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264 `" ) && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'ansible-tmp-1554214582.05-196228110119264=/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264\n',
'')
Using module file /opt/ansible/lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py
<virthyp04.virt.in.bmrc.ox.ac.uk> PUT
/etc/ansible/.ansible/tmp/ansible-local-148123yQIHV/tmp08wuqq TO
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/AnsiballZ_ovirt_storage_domain.py
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
'[virthyp04.virt.in.bmrc.ox.ac.uk]'
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'sftp> put /etc/ansible/.ansible/tmp/ansible-local-148123yQIHV/tmp08wuqq
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/AnsiballZ_ovirt_storage_domain.py\n',
'')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'chmod u+x
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/AnsiballZ_ovirt_storage_domain.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c -tt
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'/usr/bin/python
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/AnsiballZ_ovirt_storage_domain.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (1,
'\r\n{"msg": "Fault reason is \\"Operation Failed\\". Fault
detail is \\"[Permission settings on the specified path do not allow access to the
storage.\\nVerify permission settings on the specified storage path.]\\". HTTP
response code is 400.", "failed": true, "exception":
"Traceback (most recent call last):\\n File
\\"/tmp/ansible_ovirt_storage_domain_payload_k6JDp0/__main__.py\\", line 682, in
main\\n ret = storage_domains_module.create()\\n File
\\"/tmp/ansible_ovirt_storage_domain_payload_k6JDp0/ansible_ovirt_storage_domain_payload.zip/ansible/module_utils/ovirt.py\\",
line 587, in create\\n **kwargs\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py\\", line 25077, in
add\\n return self._internal_add(storage_domain, headers, query, wait)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 232, in
_internal_add\\n return future.wait() if wait else future\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 55, in
wait\\n return self._code(response)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 229, in
callback\\n self._check_fault(response)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 132, in
_check_fault\\n self._raise_error(response, body)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.p---
y\\", line 118, in _raise_error\\n raise error\\nError: Fault reason is
\\"Operation Failed\\". Fault detail is \\"[Permission settings on the
specified path do not allow access to the storage.\\nVerify permission settings on the
specified storage path.]\\". HTTP response code is 400.\\n",
"invocation": {"module_args": {"comment": null,
"warning_low_space": null, "fetch_nested": false, "localfs":
null, "data_center": "Default", "id": null,
"iscsi": null, "state": "unattached",
"wipe_after_delete": null, "destroy": null, "fcp": null,
"description": null, "format": null, "nested_attributes":
[], "host":
"virthyp04.virt.in.bmrc.ox.ac.uk<http://virthyp04.virt.in.bmrc.ox.ac.uk/>",
"discard_after_delete": null, "wait": true,
"domain_function": "data", "name":
"hosted_storage", "critical_space_action_blocker": null,
"posixfs": null, "poll_interval": 3, "glusterfs": null,
"nfs": {"path": "/export/virtman/hosted_storage",
"version": "auto", "mount_options": "",
"address": "10.141.15.248"}, "timeout": 180,
"backup": null}}}\r\n', 'Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk
closed.\r\n')
<virthyp04.virt.in.bmrc.ox.ac.uk>
Failed to connect to the host via ssh: Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk closed.
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'rm -f -r
/root/.ansible/tmp/ansible-tmp-1554214582.05-196228110119264/ > /dev/null 2>&1
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
The full traceback is:
Traceback (most recent call last):
File "/tmp/ansible_ovirt_storage_domain_payload_k6JDp0/__main__.py", line 682,
in main
ret = storage_domains_module.create()
File
"/tmp/ansible_ovirt_storage_domain_payload_k6JDp0/ansible_ovirt_storage_domain_payload.zip/ansible/module_utils/ovirt.py",
line 587, in create
**kwargs
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 25077,
in add
return self._internal_add(storage_domain, headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 232, in
_internal_add
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in
wait
return self._code(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 229, in
callback
self._check_fault(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in
_check_fault
self._raise_error(response, body)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in
_raise_error
raise error
Error: Fault reason is "Operation Failed". Fault detail is "[Permission
settings on the specified path do not allow access to the storage.
Verify permission settings on the specified storage path.]". HTTP response code is
400.
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"backup": null,
"comment": null,
"critical_space_action_blocker": null,
"data_center": "Default",
"description": null,
"destroy": null,
"discard_after_delete": null,
"domain_function": "data",
"fcp": null,
"fetch_nested": false,
"format": null,
"glusterfs": null,
"host":
"virthyp04.virt.in.bmrc.ox.ac.uk<http://virthyp04.virt.in.bmrc.ox.ac.uk/>",
"id": null,
"iscsi": null,
"localfs": null,
"name": "hosted_storage",
"nested_attributes": [],
"nfs": {
"address": "10.141.15.248",
"mount_options": "",
"path": "/export/virtman/hosted_storage",
"version": "auto"
},
"poll_interval": 3,
"posixfs": null,
"state": "unattached",
"timeout": 180,
"wait": true,
"warning_low_space": null,
"wipe_after_delete": null
}
},
"msg": "Fault reason is \"Operation Failed\". Fault detail is
\"[Permission settings on the specified path do not allow access to the
storage.\nVerify permission settings on the specified storage path.]\". HTTP response
code is 400."
}
to retry, use: --limit @/etc/ansible/playbook/ovirt.retry
PLAY RECAP
***************************************************************************************************************************************************
virthyp04.virt.in.bmrc.ox.ac.uk : ok=205
changed=67 unreachable=0 failed=1
This works when run on the hypervisor as root:
mount -t nfs 10.141.15.248:/export/virtman/hosted_storage /mnt
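A quick way to check whether this is an ownership/squash problem rather than connectivity
(untested sketch; oVirt expects the export to be owned by vdsm:kvm, i.e. uid/gid 36:36):

ls -ldn /mnt                    # should report uid 36 gid 36
sudo -u vdsm touch /mnt/probe   # must succeed for VDSM to write here
sudo -u vdsm rm /mnt/probe
umount /mnt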
I can only suggest checking the vdsm logs; maybe you still have some leftovers.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 14:58, Callum Smith
<callum@well.ox.ac.uk> wrote:
Yep, it seems that at the top level it overrides it:
- name: Install oVirt Hosted Engine
  hosts: virthyp04.virt.in.bmrc.ox.ac.uk
  roles:
    - ovirt.hosted_engine_setup
  become: true
  become_user: root
Going to try and see if I can get it to work without them. I have ansible_user=root for the
host, so they shouldn't be needed; the problem is that it's in the context of a playbook
where there are global defaults for the infrastructure. Hopefully support for sudo will
come in, but this would imply that if you had become_user: <sudouser> and become_method:
sudo at play level, they would override the task-specific become settings - which might
make the sudo support harder to achieve. (A sketch of that scenario follows.)
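That is, a hypothetical play-level block like this (the become_user value is a placeholder):

- name: Install oVirt Hosted Engine
  hosts: virthyp04.virt.in.bmrc.ox.ac.uk
  become: true
  become_user: sudouser    # placeholder for a real sudo-capable account
  become_method: sudo
  roles:
    - ovirt.hosted_engine_setup

would, on this reading, clobber the role's task-level become_user: vdsm.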
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 14:47, Simone Tiraboschi
<stirabos@redhat.com> wrote:
On Tue, Apr 2, 2019 at 3:35 PM Callum Smith
<callum@well.ox.ac.uk> wrote:
OK, so on a clean NFS:
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'echo ~root && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'/root\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991 `" && echo
ansible-tmp-1554211873.65-5732618148991="` echo
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991 `" ) && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'ansible-tmp-1554211873.65-5732618148991=/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991\n',
'')
Using module file /opt/ansible/lib/ansible/modules/commands/command.py
<virthyp04.virt.in.bmrc.ox.ac.uk> PUT
/etc/ansible/.ansible/tmp/ansible-local-109444FfQz4/tmp0icLBP TO
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/AnsiballZ_command.py
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
'[virthyp04.virt.in.bmrc.ox.ac.uk]'
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'sftp> put /etc/ansible/.ansible/tmp/ansible-local-109444FfQz4/tmp0icLBP
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/AnsiballZ_command.py\n',
'')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'chmod u+x
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/AnsiballZ_command.py &&
sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c -tt
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8
LC_MESSAGES=en_US.UTF-8 /usr/bin/python
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/AnsiballZ_command.py &&
sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (1,
'\r\n{"changed": true, "end": "2019-04-02
13:31:16.103973", "stdout": "", "cmd": ["dd",
"bs=20480", "count=1", "oflag=direct",
"if=/var/tmp/localvmjOO3X6/1757281c-7c8e-407f-a18b-aecaf13e27a6",
"of=/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6"],
"failed": true, "delta": "0:00:00.005304",
"stderr": "dd: failed to open
\\u2018/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6\\u2019:
Permission denied", "rc": 1, "invocation":
{"module_args": {"warn": false, "executable": null,
"_uses_shell": false, "_raw_params": "dd bs=20480 count=1
oflag=direct
if=\\"/var/tmp/localvmjOO3X6/1757281c-7c8e-407f-a18b-aecaf13e27a6\\"
of=\\"/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6\\"",
"removes": null, "argv": null, "creates": null,
"chdir": null, "stdin": null}}, "start": "2019-04-02
13:31:16.098669", "msg": "non-zero return code"}\r\n',
'Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk
closed.\r\n')
<virthyp04.virt.in.bmrc.ox.ac.uk>
Failed to connect to the host via ssh: Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk closed.
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'rm -f -r
/root/.ansible/tmp/ansible-tmp-1554211873.65-5732618148991/ > /dev/null 2>&1
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {
"changed": true,
"cmd": [
"dd",
"bs=20480",
"count=1",
"oflag=direct",
"if=/var/tmp/localvmjOO3X6/1757281c-7c8e-407f-a18b-aecaf13e27a6",
"of=/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6"
],
"delta": "0:00:00.005304",
"end": "2019-04-02 13:31:16.103973",
"invocation": {
"module_args": {
"_raw_params": "dd bs=20480 count=1 oflag=direct
if=\"/var/tmp/localvmjOO3X6/1757281c-7c8e-407f-a18b-aecaf13e27a6\"
of=\"/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6\"",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": false
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2019-04-02 13:31:16.098669",
"stderr": "dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6’:
Permission denied",
"stderr_lines": [
"dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/1757281c-7c8e-407f-a18b-aecaf13e27a6’:
Permission denied"
],
"stdout": "",
"stdout_lines": []
}
ls -laZ
/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/d2d2255f-713c-4b32-9f63-680772435e00/images/01be5af0-c79a-402c-a98e-7da0714a23fe/
drwxrwxrwx. vdsm kvm system_u:object_r:nfs_t:s0 .
drwxrwxrwx. vdsm kvm system_u:object_r:nfs_t:s0 ..
-rw-rw----. vdsm kvm system_u:object_r:nfs_t:s0
1757281c-7c8e-407f-a18b-aecaf13e27a6
-rw-rw----. vdsm kvm system_u:object_r:nfs_t:s0
1757281c-7c8e-407f-a18b-aecaf13e27a6.lease
-rwxrwxrwx. vdsm kvm system_u:object_r:nfs_t:s0
1757281c-7c8e-407f-a18b-aecaf13e27a6.meta
Should this task be running as the `vdsm` user - I imagine adding become: true and
become_user: vdsm would work?
Obviously we already have it:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/master/ta...
Are you forcing
become: true
become_user: root
or something like that at playbook level?
I'd expect that the same directive at role level will win, being more specific, but
honestly I'm not 100% sure.
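For reference, the role applies become on the individual task, roughly like this sketch
(paraphrased from the linked tasks file, not copied verbatim; the if=/of= paths are elided):

- name: Copy configuration archive to storage
  command: dd bs=20480 count=1 oflag=direct if=... of=...
  become: true
  become_user: vdsm

so a play-level become: true / become_user: root is only a problem if it takes precedence
over these task-level directives.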
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 10:21, Simone Tiraboschi
<stirabos@redhat.com> wrote:
On Tue, Apr 2, 2019 at 11:18 AM Callum Smith
<callum@well.ox.ac.uk> wrote:
No, the NFS is full of artefacts - should I be rm -rf'ing the whole thing every time?
Yes, right.
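A between-attempts cleanup along these lines should do it (sketch only - the export path is
taken from this thread, and the ownership follows oVirt's NFS requirements, vdsm:kvm being
uid/gid 36:36; double-check the path before any rm -rf):

# on the NFS server
rm -rf /export/virtman/hosted_storage/*
chown 36:36 /export/virtman/hosted_storage
chmod 0755 /export/virtman/hosted_storage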
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 2 Apr 2019, at 10:09, Simone Tiraboschi
<stirabos@redhat.com> wrote:
TASK [ovirt.hosted_engine_setup : Activate storage domain]
******************************************
...
Error: Fault reason is "Operation Failed". Fault detail is "[]". HTTP
response code is 400.
usually means that the engine failed to activate that storage domain; unfortunately engine
error messages are not always that clear (see
https://bugzilla.redhat.com/1554922
) but this is often due to the fact that the NFS share or the iSCSI LUN or whatever you used
wasn't really clean.
Are you manually cleaning it between one attempt and the next?
On Tue, Apr 2, 2019 at 10:50 AM Callum Smith
<callum@well.ox.ac.uk> wrote:
Dear Simone,
With no changes, we're now seeing this baffling error:
TASK [ovirt.hosted_engine_setup : Parse OVF]
********************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:120
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'echo ~root && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'/root\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320 `" && echo
ansible-tmp-1553937522.31-129798476242320="` echo
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320 `" ) && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'ansible-tmp-1553937522.31-129798476242320=/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320\n',
'')
Using module file /opt/ansible/lib/ansible/modules/files/xml.py
<virthyp04.virt.in.bmrc.ox.ac.uk> PUT
/etc/ansible/.ansible/tmp/ansible-local-32213KmUe6/tmp8wMU8o TO
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/AnsiballZ_xml.py
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
'[virthyp04.virt.in.bmrc.ox.ac.uk]'
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'sftp> put /etc/ansible/.ansible/tmp/ansible-local-32213KmUe6/tmp8wMU8o
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/AnsiballZ_xml.py\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'chmod u+x
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/AnsiballZ_xml.py && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c -tt virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'/usr/bin/python
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/AnsiballZ_xml.py &&
sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'\r\n{"count": 1, "matches": [{"Disk":
{"{http://schemas.dmtf.org/ovf/envelope/1/}wipe-after-delete": "false",
"{http://schemas.dmtf.org/ovf/envelope/1/}format":
"http://www.vmware.com/specifications/vmdk.html#sparse",
"{http://schemas.dmtf.org/ovf/envelope/1/}vm_snapshot_id":
"5f2be758-82d7-4c07-a220-9060e782dc7a",
"{http://schemas.dmtf.org/ovf/envelope/1/}parentRef": "",
"{http://schemas.dmtf.org/ovf/envelope/1/}fileRef":
"6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5",
"{http://schemas.dmtf.org/ovf/envelope/1/}actual_size": "51",
"{http://schemas.dmtf.org/ovf/envelope/1/}volume-format": "COW",
"{http://schemas.dmtf.org/ovf/envelope/1/}boot": "true",
"{http://schemas.dmtf.org/ovf/envelope/1/}size": "51",
"{http://schemas.dmtf.org/ovf/envelope/1/}volume-type": "Sparse",
"{http://schemas.dmtf.org/ovf/envelope/1/}disk-type": "System",
"{http://schemas.dmtf.org/ovf/envelope/1/}diskId":
"72bc3948-5d8d-4877-bac8-7db4995045b5",
"{http://schemas.dmtf.org/ovf/envelope/1/}disk-interface": "VirtIO"}}],
"changed": false, "actions": {"xpath": "/ovf:Envelope/Section/Disk",
"state": "present", "namespaces": {"vssd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_VirtualSystemSettingData",
"rasd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData",
"xsi": "http://www.w3.org/2001/XMLSchema-instance", "ovf":
"http://schemas.dmtf.org/ovf/envelope/1/"}}, "msg": 1,
"invocation": {"module_args": {"xpath":
"/ovf:Envelope/Section/Disk", "count": false,
"set_children": null, "xmlstring": null, "strip_cdata_tags":
false, "attribute": "ovf:size", "pretty_print": false,
"add_children": null, "value": null, "content":
"attribute", "state": "present", "namespaces":
{"vssd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_VirtualSystemSettingData",
"rasd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData",
"xsi": "http://www.w3.org/2001/XMLSchema-instance", "ovf":
"http://schemas.dmtf.org/ovf/envelope/1/"}, "input_type":
"yaml", "print_match": false, "path":
"/var/tmp/localvmMNxnwL/master/vms/074a62d4-44f9-4ffe-a172-2702a9fe96df/074a62d4-44f9-4ffe-a172-2702a9fe96df.ovf",
"backup": false}}}\r\n', 'Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk closed.\r\n')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'rm -f -r
/root/.ansible/tmp/ansible-tmp-1553937522.31-129798476242320/ > /dev/null 2>&1
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
ok: [virthyp04.virt.in.bmrc.ox.ac.uk] =>
{
"actions": {
"namespaces": {
"ovf": "http://schemas.dmtf.org/ovf/envelope/1/",
"rasd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData",
"vssd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_VirtualSystemSettingData",
"xsi": "http://www.w3.org/2001/XMLSchema-instance"
},
"state": "present",
"xpath": "/ovf:Envelope/Section/Disk"
},
"changed": false,
"count": 1,
"invocation": {
"module_args": {
"add_children": null,
"attribute": "ovf:size",
"backup": false,
"content": "attribute",
"count": false,
"input_type": "yaml",
"namespaces": {
"ovf": "http://schemas.dmtf.org/ovf/envelope/1/",
"rasd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData",
"vssd":
"http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_VirtualSystemSettingData",
"xsi": "http://www.w3.org/2001/XMLSchema-instance"
},
"path":
"/var/tmp/localvmMNxnwL/master/vms/074a62d4-44f9-4ffe-a172-2702a9fe96df/074a62d4-44f9-4ffe-a172-2702a9fe96df.ovf",
"pretty_print": false,
"print_match": false,
"set_children": null,
"state": "present",
"strip_cdata_tags": false,
"value": null,
"xmlstring": null,
"xpath": "/ovf:Envelope/Section/Disk"
}
},
"matches": [
{
"Disk": {
"{http://schemas.dmtf.org/ovf/envelope/1/}actual_size":
"51",
"{http://schemas.dmtf.org/ovf/envelope/1/}boot":
"true",
"{http://schemas.dmtf.org/ovf/envelope/1/}disk-interface":
"VirtIO",
"{http://schemas.dmtf.org/ovf/envelope/1/}disk-type":
"System",
"{http://schemas.dmtf.org/ovf/envelope/1/}diskId":
"72bc3948-5d8d-4877-bac8-7db4995045b5",
"{http://schemas.dmtf.org/ovf/envelope/1/}fileRef":
"6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5",
"{http://schemas.dmtf.org/ovf/envelope/1/}format":
"http://www.vmware.com/specifications/vmdk.html#sparse",
"{http://schemas.dmtf.org/ovf/envelope/1/}parentRef":
"",
"{http://schemas.dmtf.org/ovf/envelope/1/}size":
"51",
"{http://schemas.dmtf.org/ovf/envelope/1/}vm_snapshot_id":
"5f2be758-82d7-4c07-a220-9060e782dc7a",
"{http://schemas.dmtf.org/ovf/envelope/1/}volume-format":
"COW",
"{http://schemas.dmtf.org/ovf/envelope/1/}volume-type":
"Sparse",
"{http://schemas.dmtf.org/ovf/envelope/1/}wipe-after-delete":
"false"
}
}
],
"msg": 1
}
TASK [ovirt.hosted_engine_setup : Get required size]
************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:132
ok: [virthyp04.virt.in.bmrc.ox.ac.uk] =>
{
"ansible_facts": {
"required_size": "65498251264"
},
"changed": false
}
TASK [ovirt.hosted_engine_setup : debug]
************************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:139
ok: [virthyp04.virt.in.bmrc.ox.ac.uk] =>
{
"required_size": "65498251264"
}
TASK [ovirt.hosted_engine_setup : Remove unsuitable storage domain]
*********************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:140
skipping: [virthyp04.virt.in.bmrc.ox.ac.uk]
=> {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [ovirt.hosted_engine_setup : debug]
************************************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:151
ok: [virthyp04.virt.in.bmrc.ox.ac.uk] =>
{
"remove_storage_domain_details": {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
}
TASK [ovirt.hosted_engine_setup : Check storage domain free space]
**********************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:152
skipping: [virthyp04.virt.in.bmrc.ox.ac.uk]
=> {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [ovirt.hosted_engine_setup : Activate storage domain]
******************************************
task path:
/etc/ansible/playbook/ovirt-ansible-hosted-engine-setup/tasks/create_storage_domain.yml:161
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'echo ~root && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'/root\n', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848 `" && echo
ansible-tmp-1553937525.89-247224387363848="` echo
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848 `" ) && sleep
0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'ansible-tmp-1553937525.89-247224387363848=/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848\n',
'')
Using module file /opt/ansible/lib/ansible/modules/cloud/ovirt/ovirt_storage_domain.py
<virthyp04.virt.in.bmrc.ox.ac.uk> PUT
/etc/ansible/.ansible/tmp/ansible-local-32213KmUe6/tmpQtVJtM TO
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/AnsiballZ_ovirt_storage_domain.py
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
'[virthyp04.virt.in.bmrc.ox.ac.uk]'
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'sftp> put /etc/ansible/.ansible/tmp/ansible-local-32213KmUe6/tmpQtVJtM
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/AnsiballZ_ovirt_storage_domain.py\n',
'')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'chmod u+x
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/AnsiballZ_ovirt_storage_domain.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c
-tt virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'/usr/bin/python
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/AnsiballZ_ovirt_storage_domain.py
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (1,
'\r\n{"msg": "Fault reason is \\"Operation Failed\\". Fault
detail is \\"[]\\". HTTP response code is 400.", "failed": true,
"exception": "Traceback (most recent call last):\\n File
\\"/tmp/ansible_ovirt_storage_domain_payload_w8oO0Y/__main__.py\\", line 664, in
main\\n storage_domains_module.post_create_check(sd_id)\\n File
\\"/tmp/ansible_ovirt_storage_domain_payload_w8oO0Y/__main__.py\\", line 526, in
post_create_check\\n id=storage_domain.id,\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py\\", line 3053, in
add\\n return self._internal_add(storage_domain, headers, query, wait)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 232, in
_internal_add\\n return future.wait() if wait else future\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 55, in
wait\\n return self._code(response)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 229, in
callback\\n self._check_fault(response)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 132, in
_check_fault\\n self._raise_error(response, body)\\n File
\\"/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py\\", line 118, in
_raise_error\\n raise error\\nError: Fault reason is \\"Operation Failed\\".
Fault detail is \\"[]\\". HTTP response code is 400.\\n",
"invocation": {"module_args": {"comment": null,
"warning_low_space": null, "fetch_nested": false, "localfs":
null, "data_center": "Default", "id": null,
"iscsi": null, "state": "present",
"wipe_after_delete": null, "destroy": null, "fcp": null,
"description": null, "format": null, "nested_attributes":
[], "host":
"virthyp04.virt.in.bmrc.ox.ac.uk<http://virthyp04.virt.in.bmrc.ox.ac.uk/>",
"discard_after_delete": null, "wait": true,
"domain_function": "data", "name":
"hosted_storage", "critical_space_action_blocker": null,
"posixfs": null, "poll_interval": 3, "glusterfs": null,
"nfs": null, "timeout": 180, "backup": null}}}\r\n',
'Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk closed.\r\n')
<virthyp04.virt.in.bmrc.ox.ac.uk>
Failed to connect to the host via ssh: Shared connection to
virthyp04.virt.in.bmrc.ox.ac.uk closed.
<virthyp04.virt.in.bmrc.ox.ac.uk>
ESTABLISH SSH CONNECTION FOR USER: root
<virthyp04.virt.in.bmrc.ox.ac.uk>
SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o
KbdInteractiveAuthentication=no -o
PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o
PasswordAuthentication=no -o User=root -o ConnectTimeout=10 -o
ControlPath=/etc/ansible/.ansible/cp/2c1e73363c virthyp04.virt.in.bmrc.ox.ac.uk
'/bin/sh -c '"'"'rm -f -r
/root/.ansible/tmp/ansible-tmp-1553937525.89-247224387363848/ > /dev/null 2>&1
&& sleep 0'"'"''
<virthyp04.virt.in.bmrc.ox.ac.uk> (0,
'', '')
The full traceback is:
Traceback (most recent call last):
File "/tmp/ansible_ovirt_storage_domain_payload_w8oO0Y/__main__.py", line 664,
in main
storage_domains_module.post_create_check(sd_id)
File "/tmp/ansible_ovirt_storage_domain_payload_w8oO0Y/__main__.py", line 526,
in post_create_check
id=storage_domain.id,
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 3053, in
add
return self._internal_add(storage_domain, headers, query, wait)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 232, in
_internal_add
return future.wait() if wait else future
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in
wait
return self._code(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 229, in
callback
self._check_fault(response)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in
_check_fault
self._raise_error(response, body)
File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in
_raise_error
raise error
Error: Fault reason is "Operation Failed". Fault detail is "[]". HTTP
response code is 400.
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"backup": null,
"comment": null,
"critical_space_action_blocker": null,
"data_center": "Default",
"description": null,
"destroy": null,
"discard_after_delete": null,
"domain_function": "data",
"fcp": null,
"fetch_nested": false,
"format": null,
"glusterfs": null,
"host":
"virthyp04.virt.in.bmrc.ox.ac.uk<http://virthyp04.virt.in.bmrc.ox.ac.uk/>",
"id": null,
"iscsi": null,
"localfs": null,
"name": "hosted_storage",
"nested_attributes": [],
"nfs": null,
"poll_interval": 3,
"posixfs": null,
"state": "present",
"timeout": 180,
"wait": true,
"warning_low_space": null,
"wipe_after_delete": null
}
},
"msg": "Fault reason is \"Operation Failed\". Fault detail is
\"[]\". HTTP response code is 400."
}
to retry, use: --limit @/etc/ansible/playbook/ovirt.retry
PLAY RECAP
******************************************************************************************
virthyp04.virt.in.bmrc.ox.ac.uk : ok=216
changed=69 unreachable=0 failed=1
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 17:48, Simone Tiraboschi
<stirabos@redhat.com> wrote:
On Fri, Mar 29, 2019 at 6:14 PM Callum Smith
<callum@well.ox.ac.uk> wrote:
So close now:
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {"changed": true, "cmd": ["dd",
"bs=20480", "count=1", "oflag=direct",
"if=/var/tmp/localvmDBMVgn/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6",
"of=/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/b4f93b28-1497-44b0-9eaf-5e5e2b71bce8/images/645c4286-71e4-4cce-9049-345903929e1b/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6"],
"delta": "0:00:00.005134", "end": "2019-03-29
17:04:27.952367", "msg": "non-zero return code", "rc":
1, "start": "2019-03-29 17:04:27.947233", "stderr":
"dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/b4f93b28-1497-44b0-9eaf-5e5e2b71bce8/images/645c4286-71e4-4cce-9049-345903929e1b/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6’:
Permission denied", "stderr_lines": ["dd: failed to open
‘/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/b4f93b28-1497-44b0-9eaf-5e5e2b71bce8/images/645c4286-71e4-4cce-9049-345903929e1b/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6’:
Permission denied"], "stdout": "", "stdout_lines":
[]}
to retry, use: --limit @/etc/ansible/playbook/ovirt.retry
ls -laZ
/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/b4f93b28-1497-44b0-9eaf-5e5e2b71bce8/images/645c4286-71e4-4cce-9049-345903929e1b/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6
-rw-rw----. vdsm kvm system_u:object_r:nfs_t:s0
/rhev/data-center/mnt/10.141.15.248:_export_virtman_hosted__storage/b4f93b28-1497-44b0-9eaf-5e5e2b71bce8/images/645c4286-71e4-4cce-9049-345903929e1b/e208e0f9-0f4d-4d0d-9104-10d8a26bfab6
Any ideas on this one? I can almost touch this deployment now... Looking at the command, it
should run as `vdsm` and so should work fine - could this be SELinux?
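(To help rule SELinux in or out, the standard checks on the hypervisor would be something
like this - untested sketch:)

getenforce                   # Enforcing or Permissive?
ausearch -m avc -ts recent   # any AVC denials logged around the dd?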
Yes, it should be executed as the vdsm user:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/master/ta...
Did you try executing it with -vvv?
Can you please share the playbook you are using to trigger that role?
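(e.g. something like the following - the playbook filename is inferred from the .retry file
name above and may differ:)

ansible-playbook -vvv /etc/ansible/playbook/ovirt.yml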
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 15:50, Callum Smith
<callum@well.ox.ac.uk> wrote:
Guilty, will roll back and try again!
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 15:35, Simone Tiraboschi
<stirabos@redhat.com> wrote:
The error comes from here:
TASK [ovirt.hosted_engine_setup : Parse OVF]
***************************************************************************************************************************************
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {"changed": false, "msg": "missing parameter(s)
required by 'attribute': value"}
But are you really using it with Ansible 2.8 alpha 1?
I'd strongly suggest switching back to a stable release of Ansible, which is currently 2.7.9.
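For example, if Ansible was installed from PyPI (just one possible setup; adjust to however you installed it):

pip install 'ansible==2.7.9'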
That one was due to:
https://github.com/ansible/ansible/issues/53459
In the next Ansible build it will be just a warning, as per:
https://github.com/ansible/ansible/pull/54336
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/pull/150/files already addresses this in ovirt-ansible-hosted-engine-setup to stay compatible with future Ansible releases.
On Fri, Mar 29, 2019 at 3:53 PM Callum Smith <callum@well.ox.ac.uk> wrote:
The OVF in question is here:
<ovf:Envelope ovf:version="0.9"
xmlns:ovf="http://schemas.dmtf.org/ovf/envelope/1/"
xmlns:rasd="http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_Re...
xmlns:vssd="http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_Vi...
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><Re...
ovf:description="074a62d4-44f9-4ffe-a172-2702a9fe96df"
ovf:href="6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5"
ovf:id="72bc3948-5d8d-4877-bac8-7db4995045b5" ovf:size="54760833024"
/></References><Section
xsi:type="ovf:NetworkSection_Type"><Info>List of
Networks</Info></Section><Section
xsi:type="ovf:DiskSection_Type"><Disk ovf:actual_size="51"
ovf:boot="true" ovf:disk-interface="VirtIO"
ovf:disk-type="System"
ovf:diskId="72bc3948-5d8d-4877-bac8-7db4995045b5"
ovf:fileRef="6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5"
ovf:format="http://www.vmware.com/specifications/vmdk.html#sparse"
ovf:parentRef="" ovf:size="51"
ovf:vm_snapshot_id="5f2be758-82d7-4c07-a220-9060e782dc7a"
ovf:volume-format="COW" ovf:volume-type="Sparse"
ovf:wipe-after-delete="false" /></Section><Content
ovf:id="out"
xsi:type="ovf:VirtualSystem_Type"><Name>074a62d4-44f9-4ffe-a172-2702a9fe96df</Name><TemplateId>074a62d4-44f9-4ffe-a172-2702a9fe96df</TemplateId><Description>Created
by OVABuilder</Description><Domain /><CreationDate>2019/03/19
08:35:09</CreationDate><TimeZone
/><IsAutoSuspend>false</IsAutoSuspend><VmType>1</VmType><default_display_type>0</default_display_type><default_boot_sequence>1</default_boot_sequence><Section
ovf:id="074a62d4-44f9-4ffe-a172-2702a9fe96df" ovf:required="false"
xsi:type="ovf:OperatingSystemSection_Type"><Info>Guest
OS</Info><Description>OtherLinux</Description></Section><Section
xsi:type="ovf:VirtualHardwareSection_Type"><Info>4 CPU, 16384
Memory</Info><System><vssd:VirtualSystemType>RHEVM
4.6.0.163</vssd:VirtualSystemType></System><Item><rasd:Caption>4
virtual CPU</rasd:Caption><rasd:Description>Number of virtual
CPU</rasd:Description><rasd:InstanceId>1</rasd:InstanceId><rasd:ResourceType>3</rasd:ResourceType><rasd:num_of_sockets>1</rasd:num_of_sockets><rasd:cpu_per_socket>4</rasd:cpu_per_socket></Item><Item><rasd:Caption>16384
MB of memory</rasd:Caption><rasd:Description>Memory
Size</rasd:Description><rasd:InstanceId>2</rasd:InstanceId><rasd:ResourceType>4</rasd:ResourceType><rasd:AllocationUnits>MegaBytes</rasd:AllocationUnits><rasd:VirtualQuantity>16384</rasd:VirtualQuantity></Item><Item><rasd:Caption>Drive
1</rasd:Caption><rasd:InstanceId>72bc3948-5d8d-4877-bac8-7db4995045b5</rasd:InstanceId><rasd:ResourceType>17</rasd:ResourceType><rasd:HostResource>6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5</rasd:HostResource><rasd:Parent>00000000-0000-0000-0000-000000000000</rasd:Parent><rasd:Template>00000000-0000-0000-0000-000000000000</rasd:Template><rasd:ApplicationList
/><rasd:StorageId>00000000-0000-0000-0000-000000000000</rasd:StorageId><rasd:StoragePoolId>00000000-0000-0000-0000-000000000000</rasd:StoragePoolId><rasd:CreationDate>2019/03/19
08:35:09</rasd:CreationDate><rasd:LastModified>2019/03/19
08:35:09</rasd:LastModified></Item><Item><rasd:Caption>Ethernet 0
rhevm</rasd:Caption><rasd:InstanceId>3</rasd:InstanceId><rasd:ResourceType>10</rasd:ResourceType><rasd:ResourceSubType>3</rasd:ResourceSubType><rasd:Connection>rhevm</rasd:Connection><rasd:Name>eth0</rasd:Name><rasd:speed>1000</rasd:speed></Item><Item><rasd:Caption>Graphics</rasd:Caption><rasd:InstanceId>5</rasd:InstanceId><rasd:ResourceType>20</rasd:ResourceType><rasd:VirtualQuantity>1</rasd:VirtualQuantity></Item></Section></Content></ovf:Envelope>
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 14:42, Callum Smith <callum@well.ox.ac.uk> wrote:
OK, so we're getting very close now; a weird OVF error.
Full Ansible log attached.
The only error in engine.log looks normal/expected to me:
2019-03-29 14:32:44,370Z ERROR [org.ovirt.engine.core.bll.pm.FenceProxyLocator] (EE-ManagedThreadFactory-engineScheduled-Thread-71) [4405d6db] Can not run fence action on host 'virthyp04.virt.in.bmrc.ox.ac.uk', no suitable proxy host was found.
<ovirt.20190329133000.ansible.log>
Feeling damn close to success here, but I have managed to replicate this issue twice when re-running the installer.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 11:50, Simone Tiraboschi <stirabos@redhat.com> wrote:
On Fri, Mar 29, 2019 at 12:36 PM Callum Smith <callum@well.ox.ac.uk> wrote:
`ip link del ovirtmgmt` has done the job...
Another issue, but this one is likely due to randomised MAC addresses:
fatal: [virthyp04.virt.in.bmrc.ox.ac.uk]:
FAILED! => {"changed": true, "cmd": ["virt-install",
"-n", "HostedEngineLocal", "--os-variant",
"rhel7", "--virt-type", "kvm", "--memory",
"4096", "--vcpus", "64", "--network",
"network=default,mac=fe:58:6c:da:1e:cc,model=virtio", "--disk",
"/var/tmp/localvmOCYiyF/images/6f76686b-199c-4cb3-bbbe-86fc34365745/72bc3948-5d8d-4877-bac8-7db4995045b5",
"--import", "--disk",
"path=/var/tmp/localvmOCYiyF/seed.iso,device=cdrom",
"--noautoconsole", "--rng", "/dev/random",
"--graphics", "vnc", "--video", "vga",
"--sound", "none", "--controller",
"usb,model=none", "--memballoon", "none",
"--boot", "hd,menu=off", "--clock",
"kvmclock_present=yes"], "delta": "0:00:01.355834",
"end": "2019-03-29 11:31:02.100143", "msg": "non-zero
return code", "rc": 1, "start": "2019-03-29
11:31:00.744309", "stderr": "ERROR unsupported configuration:
Unable to use MAC address starting with reserved value 0xFE - 'fe:58:6c:da:1e:cc'
- \nDomain installation does not appear to have been successful.\nIf it was, you can
restart your domain by running:\n virsh --connect qemu:///system start
HostedEngineLocal\notherwise, please restart your installation.",
"stderr_lines": ["ERROR unsupported configuration: Unable to use MAC
address starting with reserved value 0xFE - 'fe:58:6c:da:1e:cc' - ",
"Domain installation does not appear to have been successful.", "If it was,
you can restart your domain by running:", " virsh --connect qemu:///system
start HostedEngineLocal", "otherwise, please restart your installation."],
"stdout": "\nStarting install...", "stdout_lines":
["", "Starting install..."]}
It seems the generator doesn't take reserved values into account.
If not specified by the user,
a unicast MAC address is randomly generated here:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/master/ta...
For sure it's unicast, but maybe we should make it more robust against reserved values.
Simply try again for now; I'll open an issue to track it, thanks!
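As a workaround you could also pin the MAC yourself instead of relying on the generated one. A minimal sketch, assuming the role's he_vm_mac_addr variable is the input for a user-supplied MAC (built here from the well-known QEMU/KVM 52:54:00 prefix, whose first octet can never be a reserved value like 0xFE):

- name: Pre-generate a safe unicast MAC for the engine VM (illustrative)
  set_fact:
    he_vm_mac_addr: "{{ '52:54:00:%02x:%02x:%02x' | format(256 | random, 256 | random, 256 | random) }}"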
I hope this feedback is valuable; I have a good feeling about the current deploy otherwise.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 29 Mar 2019, at 11:01, Simone Tiraboschi <stirabos@redhat.com> wrote:
On Fri, Mar 29, 2019 at 11:56 AM Callum Smith <callum@well.ox.ac.uk> wrote:
Dear Simone,
It doesn't seem to want to work:
# Settings
he_fqdn: "he.virt.in.bmrc.ox.ac.uk<http://he.virt.in.bmrc.ox.ac.uk/>"
he_ansible_host_name: "virthyp04.virt.in.bmrc.ox.ac.uk"
he_admin_password: <snip>
he_appliance_password: <snip>
# Resources
he_mem_size_MB: "4096"
# Storage
he_domain_type: "nfs"
he_storage_domain_addr: <snip>
he_storage_domain_path: <snip>
# Network
he_vm_ip_addr: "10.141.31.240"
he_vm_ip_prefix: "20"
he_dns_addr: ["10.141.31.251","10.141.31.252","10.141.31.253"]
he_default_gateway_4: "10.141.31.254"
he_gateway: he_default_gateway_4
he_force_ip4: true
he_bridge_if: bond0.910
#he_just_collect_network_interfaces: true
# Email
he_smtp_port: 25
he_smtp_server: smtp.ox.ac.uk
he_dest_email: rescomp-ops@well.ox.ac.uk
he_source_email: ovirt@bmrc.ox.ac.uk
# Ansible Stuff
ansible_ssh_user: root
ansible_become: false
host_key_checking: false
I've attached the output of the ansible command as a log file; this is what happens when the interface bond0.910 is assigned the IP and `ovirtmgmt` is not defined on the host.
TASK [ovirt.hosted_engine_setup : debug]
*******************************************************************************************************************************************
ok: [virthyp04.virt.in.bmrc.ox.ac.uk] => {
    "target_address_v4": {
        "changed": true,
        "cmd": "ip addr show ovirtmgmt | grep 'inet ' | cut -d' ' -f6 | cut -d'/' -f1",
        "delta": "0:00:00.008744",
        "end": "2019-03-29 10:26:07.510481",
        "failed": false,
        "rc": 0,
        "start": "2019-03-29 10:26:07.501737",
        "stderr": "",
        "stderr_lines": [],
        "stdout": "",
        "stdout_lines": []
    }
}
According to the logs, ovirtmgmt is still there.
Can you please share the output of `ip a`?
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 28 Mar 2019, at 16:23, Simone Tiraboschi <stirabos@redhat.com> wrote:
On Thu, Mar 28, 2019 at 1:44 PM Callum Smith <callum@well.ox.ac.uk> wrote:
Dear Simone,
This is my experience too, but I'm now hitting this error on the hosted-engine install
at the part where it registers the hypervisor as the first host in the engine:
2019-03-28 12:40:50,025Z INFO [org.ovirt.engine.core.bll.host.HostConnectivityChecker] (EE-ManagedThreadFactory-engine-Thread-1) [49f371c1] Engine managed to communicate with VDSM agent on host 'virthyp04.virt.in.bmrc.ox.ac.uk' with address 'virthyp04.virt.in.bmrc.ox.ac.uk' ('db571f8a-fc85-40d3-b86f-c0038e3cd7e7')
2019-03-28 12:40:53,111Z WARN [org.ovirt.engine.core.bll.network.NetworkConfigurator] (EE-ManagedThreadFactory-engine-Thread-1) [49f371c1] Failed to find a valid interface for the management network of host virthyp04.virt.in.bmrc.ox.ac.uk. If the interface ovirtmgmt is a bridge, it should be torn-down manually.
2019-03-28 12:40:53,111Z ERROR [org.ovirt.engine.core.bll.hostdeploy.InstallVdsInternalCommand] (EE-ManagedThreadFactory-engine-Thread-1) [49f371c1] Exception: org.ovirt.engine.core.bll.network.NetworkConfigurator$NetworkConfiguratorException: Interface ovirtmgmt is invalid for management network
The host's ovirtmgmt network connection is a statically assigned IP on a VLAN on a bond; how should I be configuring this, if not manually?
If you need to deploy over VLAN 123 on bond0, simply configure a device named exactly bond0.123 and statically configure your IP address there.
Choose it at hosted-engine deployment time, nothing more: ovirtmgmt will be created automatically over that device, and the VLAN ID will be set at engine level for the whole management network.
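If you drive the host configuration with Ansible anyway, the VLAN device itself can be described the same way. A minimal sketch using the nmcli module (connection name, VLAN ID and addresses are placeholders for your environment):

- name: Create VLAN device bond0.123 with a static address (illustrative)
  nmcli:
    conn_name: bond0.123   # the device name must be exactly <bond>.<vlan id>
    type: vlan
    vlandev: bond0
    vlanid: 123
    ip4: 192.0.2.10/24     # placeholder host address
    gw4: 192.0.2.254       # placeholder gateway
    state: present

Then choose that device at deployment time and let the role create ovirtmgmt on top of it.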
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 27 Mar 2019, at 17:09, Simone Tiraboschi <stirabos@redhat.com> wrote:
On Wed, Mar 27, 2019 at 4:27 PM Callum Smith <callum@well.ox.ac.uk> wrote:
It's OK: migrating the oVirt node to 4.3.2 (from 4.3.0) did the job of fixing it.
It is a bug if you intend to use the ovirtmgmt network to deploy your Ansible from
This is a bit tricky: when the engine brings up the host it also creates the management bridge, and this could lead to a temporary network outage on the selected interface for the duration of the bridge creation (a couple of seconds?).
I tried it on a LAN and the Ansible SSH connection always survived, but I'm not sure that's always true.
, and you need it to already have an IP address on that range! But it works as expected with the ovirtmgmt bridge setup, so nothing to worry about.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
On 27 Mar 2019, at 14:57, Simone Tiraboschi <stirabos@redhat.com> wrote:
On Wed, Mar 27, 2019 at 3:24 PM Callum Smith <callum@well.ox.ac.uk> wrote:
Dear All,
We're trying to deploy our hosted engine remotely using the Ansible hosted-engine playbook, which has been a rocky road, but we're now at the point where it's installing, and failing. We've got a pre-defined bond/VLAN setup for our interface, which has the correct bond0, bond0.123, and ovirtmgmt bridge on top, but we're hitting the classic error:
Failed to find a valid interface for the management network of host virthyp04.virt.in.bmrc.ox.ac.uk. If the interface ovirtmgmt is a bridge, it should be torn-down manually.
Does this bug still exist in the latest (4.3) version, and is installing using ansible
with this network configuration impossible?
I don't think it's a bug; please avoid manually creating ovirtmgmt and simply set:
he_bridge_if: "bond0.123"
in the Ansible variables file, and the management bridge will be created for you at host-deploy time.
Regards,
Callum
--
Callum Smith
Research Computing Core
Wellcome Trust Centre for Human Genetics
University of Oxford
e. callum@well.ox.ac.uk
_______________________________________________
Users mailing list -- users@ovirt.org
To unsubscribe send an email to users-leave@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/users@ovirt.org/message/SBOZ6FRBRQK...