[ovirt-users] hosted-engine --deploy Failed

Joe DiTommasso jdito at domeyard.com
Tue May 1 16:08:05 UTC 2018


I ran through this again to regenerate the logs. It's been 100% repeatable
for me on a fresh 7.4 install, running 'hosted-engine --deploy' or the
preconfigured storage option in the cockpit oVirt UI. Deploying
hyperconverged from the cockpit UI worked, however. Attaching contents of
/var/log from hosted-engine and the physical host.

This is what I got from 'journalctl -u vdsmd'; it wasn't reflected in the
VDSM logs.

May 01 11:01:27 sum-glovirt-05.dy.gl systemd[1]: Starting Virtual Desktop
Server Manager...
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running mkdirs
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running configure_coredump
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running configure_vdsm_logs
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running wait_for_network
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running run_init_hooks
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running check_is_configured
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: abrt is
already configured for vdsm
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: lvm is
configured for vdsm
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: libvirt
is already configured for vdsm
May 01 11:01:27 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: Current
revision of multipath.conf detected, preserving
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: schema is
already configuredvdsm: Running validate_configuration
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: SUCCESS:
ssl configured to true. No conflicts
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running prepare_transient_repository
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running syslog_available
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: vdsm:
Running nwfilter
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: libvirt:
Network Filter Driver error : this function is not supported by the
connection driver: virNWFilterLookupByName
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: libvirt:
Network Filter Driver error : this function is not supported by the
connection driver: virNWFilterDefineXML
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: Traceback
(most recent call last):
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/bin/vdsm-tool", line 219, in main
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: return
tool_command[cmd]["command"](*args)
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/lib/python2.7/site-packages/vdsm/tool/nwfilter.py", line 40, in main
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]:
NoMacSpoofingFilter().defineNwFilter(conn)
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/lib/python2.7/site-packages/vdsm/tool/nwfilter.py", line 76, in
defineNwFilter
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: nwFilter
= conn.nwfilterDefineXML(self.buildFilterXml())
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/lib/python2.7/site-packages/vdsm/common/libvirtconnection.py", line
130, in wrapper
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: ret =
f(*args, **kwargs)
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/lib/python2.7/site-packages/vdsm/common/function.py", line 92, in
wrapper
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: return
func(inst, *args, **kwargs)
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: File
"/usr/lib64/python2.7/site-packages/libvirt.py", line 4279, in
nwfilterDefineXML
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]: if ret is
None:raise libvirtError('virNWFilterDefineXML() failed', conn=self)
May 01 11:01:28 sum-glovirt-05.dy.gl vdsmd_init_common.sh[15467]:
libvirtError: this function is not supported by the connection driver:
virNWFilterDefineXML
May 01 11:01:28 sum-glovirt-05.dy.gl systemd[1]: vdsmd.service: control
process exited, code=exited status=1
May 01 11:01:28 sum-glovirt-05.dy.gl systemd[1]: Failed to start Virtual
Desktop Server Manager.
May 01 11:01:28 sum-glovirt-05.dy.gl systemd[1]: Unit vdsmd.service entered
failed state.
May 01 11:01:28 sum-glovirt-05.dy.gl systemd[1]: vdsmd.service failed.
May 01 11:01:29 sum-glovirt-05.dy.gl systemd[1]: vdsmd.service holdoff time
over, scheduling restart.
May 01 11:01:29 sum-glovirt-05.dy.gl systemd[1]: start request repeated too
quickly for vdsmd.service
May 01 11:01:29 sum-glovirt-05.dy.gl systemd[1]: Failed to start Virtual
Desktop Server Manager.
May 01 11:01:29 sum-glovirt-05.dy.gl systemd[1]: Unit vdsmd.service entered
failed state.
May 01 11:01:29 sum-glovirt-05.dy.gl systemd[1]: vdsmd.service failed.
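
The step that fails here is 'vdsm-tool nwfilter', which vdsmd runs at startup
to define vdsm's no-MAC-spoofing network filter. The "this function is not
supported by the connection driver" errors mean libvirt's nwfilter driver is
not available on the connection vdsm uses, so the problem sits on the libvirt
side rather than in vdsm itself. A minimal sketch of the same check with the
libvirt Python bindings (my addition, not from the thread; it assumes
python-libvirt is installed, libvirtd is running, and uses the no-MAC-spoofing
filter name vdsm defines in its nwfilter.py) might look like this:

import libvirt

conn = libvirt.open('qemu:///system')
try:
    # vdsm's NoMacSpoofingFilter (seen in the traceback above) first looks the
    # filter up and then redefines it; both calls raise VIR_ERR_NO_SUPPORT when
    # the nwfilter driver is missing, which is what the journal shows.
    flt = conn.nwfilterLookupByName('vdsm-no-mac-spoofing')
    print('filter already defined: %s' % flt.name())
except libvirt.libvirtError as e:
    print('nwfilter lookup failed: %s' % e.get_error_message())
finally:
    conn.close()

If this also fails with the same "not supported by the connection driver"
error, the libvirt installation on the host is the thing to investigate before
retrying the deploy, since vdsmd refuses to start while this step fails.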


On Tue, May 1, 2018 at 10:18 AM, Yaniv Kaul <ykaul at redhat.com> wrote:

> Can you provide other logs, vdsm log, for example?
> The installation of the Engine seems to have succeeded; it then tried to
> add the host and was waiting for quite some time for the host to reach the
> 'Up' state.
> If, for example, it was installing packages, and the yum repo was very
> slow, that may be a reason.
> But there may be many other reasons as well.
> Y.
>
> On Tue, May 1, 2018 at 2:43 PM, Paul.LKW <paul.lkw at gmail.com> wrote:
>
>> Dear All:
>> Recently I tried to create a Self-Hosted Engine oVirt setup, but
>> unfortunately the deployment failed on both of my boxes. First of all,
>> the online documentation is wrong in the "oVirt Self-Hosted Engine
>> Guide" section: it says the deployment script "hosted-engine --deploy"
>> will ask for the storage configuration immediately, but that is no
>> longer true. Of my two boxes, one is configured with a bonded interface
>> and one is not, and both failed; to narrow the issue down I am posting
>> the log from the bonded-interface box first for your reference. The
>> script sits at "TASK [Wait for the host to be up]" for a very long time
>> and then gives me this error:
>>
>> [ ERROR ] fatal: [localhost]: FAILED! => {"ansible_facts":
>> {"ovirt_hosts": []}, "attempts": 120, "changed": false}
>> [ INFO  ] TASK [include_tasks]
>> [ INFO  ] ok: [localhost]
>> [ INFO  ] TASK [Remove local vm dir]
>> [ INFO  ] changed: [localhost]
>> [ INFO  ] TASK [Notify the user about a failure]
>> [ ERROR ] fatal: [localhost]: FAILED! => {"changed": false, "msg": "The
>> system may not be provisioned according to the playbook results: please
>> check the logs for the issue, fix accordingly or re-deploy from scratch.\n"}
>> [ ERROR ] [DEPRECATION WARNING]: Using tests as filters is deprecated.
>> Instead of using
>> [ ERROR ] `result|succeeded` instead use `result is succeeded`. This
>> feature will be
>> [ ERROR ] removed in version 2.9. Deprecation warnings can be disabled by
>> setting
>> [ ERROR ] deprecation_warnings=False in ansible.cfg.
>> [ ERROR ] [DEPRECATION WARNING]: Using tests as filters is deprecated.
>> Instead of using
>> [ ERROR ] `result|succeeded` instead use `result is succeeded`. This
>> feature will be
>> [ ERROR ] removed in version 2.9. Deprecation warnings can be disabled by
>> setting
>> [ ERROR ] deprecation_warnings=False in ansible.cfg.
>> [ ERROR ] Failed to execute stage 'Closing up': Failed executing
>> ansible-playbook
>> [ INFO  ] Stage: Clean up
>> [ INFO  ] Cleaning temporary resources
>> [ INFO  ] TASK [Gathering Facts]
>> [ INFO  ] ok: [localhost]
>> [ INFO  ] TASK [include_tasks]
>> [ INFO  ] ok: [localhost]
>> [ INFO  ] TASK [Remove local vm dir]
>> [ INFO  ] ok: [localhost]
>> [ INFO  ] Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20180501190540.conf'
>> [ INFO  ] Stage: Pre-termination
>> [ INFO  ] Stage: Termination
>> [ ERROR ] Hosted Engine deployment failed: please check the logs for the
>> issue, fix accordingly or re-deploy from scratch.
>>           Log file is located at /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20180501184459-4v6ctw.log
>>
>> The full (rather long) log is attached.
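
For context, the failing task, "TASK [Wait for the host to be up]", polls the
freshly installed engine's API and retries until a host reports the 'Up'
status; the result '"ovirt_hosts": [], "attempts": 120' above means that after
120 attempts the engine never saw the host come up, or never saw the host at
all. A rough sketch of what that check amounts to, using the oVirt Python SDK
(ovirtsdk4) with a placeholder engine URL and credentials (my addition, not
from the thread), might look like this:

import time
import ovirtsdk4 as sdk
import ovirtsdk4.types as types

# Placeholders: substitute your engine FQDN, admin password and CA file.
connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',
    username='admin@internal',
    password='secret',
    ca_file='/etc/pki/ovirt-engine/ca.pem',
)
hosts_service = connection.system_service().hosts_service()

for attempt in range(120):
    hosts = hosts_service.list()
    if any(h.status == types.HostStatus.UP for h in hosts):
        print('host is up')
        break
    # The intermediate statuses (installing, install_failed, non_operational,
    # ...) usually point at the real problem; an empty list means the host
    # was never added at all.
    print('attempt %d: %s' % (attempt, [(h.name, h.status) for h in hosts]))
    time.sleep(10)

connection.close()

An empty result can mean either that the host was never added or that it never
reached 'Up'; either way, the host-deploy and vdsm logs on the host itself are
the place to look next.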