On Thu, Mar 14, 2019 at 12:51 PM Simone Tiraboschi <stirabos(a)redhat.com> wrote:
On Thu, Mar 14, 2019 at 12:06 PM Jagi Sarcilla <jagi.sarcilla(a)cevalogistics.com> wrote:
> Hardware Specs
> Hypervisor:
> Supermicro A2SDi-16C-HLN4F
> Intel(R) Atom(TM) CPU C3955 @ 2.10GHz, 16 cores
>
I fear it could come from this. Ryan, do we support the Intel Atom C3000
family?
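If it helps narrow that down: the CPU model and flags libvirt detects on the
host can be dumped with a read-only virsh call (a suggested check, not
something already run in this thread):

    # host CPU model and feature flags as libvirt sees them
    virsh -r capabilities | grep -A10 '<cpu>'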
> Samsung NVMe 1TB
> SanDisk SSHD 1TB
> 256GB RAM
> 4 x 1G NICs
>
> Storage:
> FreeNAS-11.2-RELEASE-U1
> - NFS
> - iSCSI
>
>
> * Issue #1
> Using iSCSI, I am unable to discover the target from the Cockpit dashboard.
>
> * Issue #2
> Using NFS, I can connect successfully, but the host won't come up after
> being shut down by the installation.
>
> * Error message:
>
> [ INFO ] TASK [oVirt.hosted-engine-setup : Check engine VM health]
>
But in order to reach that point the host came up for the engine, and the
engine was then unable to start a VM on it.
So maybe we have an issue here: the host should have been refused by the
engine with a clear error message.
Jagi, can you please share the logs you have in
/var/log/ovirt-hosted-engine-setup?
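Assuming the default log locations, something like this should bundle
everything relevant in one go:

    # collect the setup logs (plus vdsm's log) to attach to the thread
    tar czf he-setup-logs.tar.gz /var/log/ovirt-hosted-engine-setup /var/log/vdsm/vdsm.log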
> [ ERROR ] fatal: [localhost]: FAILED! => {"attempts": 120, "changed": true,
> "cmd": ["hosted-engine", "--vm-status", "--json"], "delta": "0:00:00.358141",
> "end": "2019-03-14 06:33:35.499429", "rc": 0, "start": "2019-03-14 06:33:35.141288",
> "stderr": "", "stderr_lines": [], "stdout": "{\"1\": {\"conf_on_shared_storage\": true,
> \"live-data\": true, \"extra\": \"metadata_parse_version=1\\nmetadata_feature_version=1\\ntimestamp=43479 (Thu Mar 14 06:33:34 2019)\\nhost-id=1\\nscore=0\\nvm_conf_refresh_time=43480 (Thu Mar 14 06:33:34 2019)\\nconf_on_shared_storage=True\\nmaintenance=False\\nstate=EngineUnexpectedlyDown\\nstopped=False\\ntimeout=Thu Jan 1 07:04:46 1970\\n\",
> \"hostname\": \"dreamlevel-1.logistics.corp\", \"host-id\": 1,
> \"engine-status\": {\"reason\": \"bad vm status\", \"health\": \"bad\",
> \"vm\": \"down_unexpected\", \"detail\": \"Down\"}, \"score\": 0, \"stopped\": false,
> \"maintenance\": false, \"crc32\": \"de300a81\", \"local_conf_timestamp\": 43480,
> \"host-ts\": 43479}, \"global_maintenance\": false}",
> "stdout_lines": [...the same JSON as "stdout", repeated verbatim...]}
> [ INFO ] TASK [oVirt.hosted-engine-setup : Check VM status at virt level]
> [ INFO ] TASK [oVirt.hosted-engine-setup : debug]
> [ INFO ] ok: [localhost]
> [ INFO ] TASK [oVirt.hosted-engine-setup : Fail if engine VM is not running]
> [ ERROR ] fatal: [localhost]: FAILED! => {"changed": false,
> "msg": "Engine VM is not running, please check vdsm logs"}
>
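As that message suggests, the reason the VM went down should be visible in
vdsm's log; a quick scan, assuming the default path:

    # show the most recent errors around the engine VM start attempt
    grep -i error /var/log/vdsm/vdsm.log | tail -n 50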
> 2019-03-14 06:33:41,168-0400 ERROR ansible failed {'status': 'FAILED',
> 'ansible_type': 'task', 'ansible_task': u'Check VM status at virt level',
> 'ansible_result': u"type: <type 'dict'>\nstr: {'_ansible_parsed': True,
> 'stderr_lines': [], u'changed': True, u'end': u'2019-03-14 06:33:39.429283',
> '_ansible_no_log': False, u'stdout': u'',
> u'cmd': u'virsh -r list | grep HostedEngine | grep running',
> u'rc': 1, u'stderr': u'', u'delta': u'0:00:00.118398',
> u'invocation': {u'module_args': {u'crea",
> 'ansible_host': u'localhost', 'ansible_playbook':
> u'/usr/share/ovirt-hosted-engine-setup/ansible/create_target_vm.yml'}
> 2019-03-14 06:33:41,168-0400 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7fe8948c2fd0>
> kwargs ignore_errors:True
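The failed task is just a shell pipeline, so it can be re-run by hand to
inspect the VM state directly:

    # read-only libvirt connection; --all also lists domains that are shut off
    virsh -r list --all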
>
>
> 2019-03-14 06:30:45,000-0400 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7fe8944dd810> kwargs
> [...26 similar DEBUG entries, one roughly every 5.7 seconds, trimmed...]
> 2019-03-14 06:33:18,470-0400 DEBUG ansible on_any args
> <ansible.executor.task_result.TaskResult object at 0x7fe8950ff750> kwargs
>
> Engine VM status: EngineUnexpectedlyDown-EngineDown.
>
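That state is reported by the HA agent; its view (and the broker's) can be
checked via the standard service names:

    systemctl status ovirt-ha-agent ovirt-ha-broker
    journalctl -u ovirt-ha-agent --since "10 minutes ago"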
> * Manually starting the Engine VM via virsh gives:
> error: Failed to start domain HostedEngine
> error: the CPU is incompatible with host CPU: Host CPU does not provide
> required features: pcid
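That matches Simone's suspicion: the error itself says the host CPU lacks the
pcid feature, while the engine VM was defined with a CPU model that requires
it. Both sides can be double-checked on the host:

    # prints "pcid" once if the host CPU exposes the flag, nothing otherwise
    grep -m1 -o '\bpcid\b' /proc/cpuinfo
    # show which CPU model/features the HostedEngine domain requires
    virsh -r dumpxml HostedEngine | grep -A5 '<cpu'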