Hi,

Thanks for the quick reply. My ovirt-engine is running on CentOS 6.7, and the hypervisor I am trying to add is also CentOS 6.7.
engine.log
]# tail /var/log/ovirt-engine/engine.log
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [rt.jar:1.7.0_95]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_95]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_95]
        at java.lang.Thread.run(Thread.java:745) [rt.jar:1.7.0_95]

2016-04-21 09:28:09,600 ERROR [org.ovirt.engine.core.bll.hostdeploy.InstallVdsInternalCommand] (org.ovirt.thread.pool-8-thread-4) [59328744] Host installation failed for host '92380584-6896-42f5-b2ca-727840db7645', 'vcih3.pagasa-vci.lan': Command returned failure code 1 during SSH session 'root@10.11.41.7'
2016-04-21 09:28:09,602 INFO  [org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand] (org.ovirt.thread.pool-8-thread-4) [59328744] START, SetVdsStatusVDSCommand(HostName = vcih3.pagasa-vci.lan, SetVdsStatusVDSCommandParameters:{runAsync='true', hostId='92380584-6896-42f5-b2ca-727840db7645', status='InstallFailed', nonOperationalReason='NONE', stopSpmFailureLogged='false', maintenanceReason='null'}), log id: 17f30fc4
2016-04-21 09:28:09,604 INFO  [org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand] (org.ovirt.thread.pool-8-thread-4) [59328744] FINISH, SetVdsStatusVDSCommand, log id: 17f30fc4
2016-04-21 09:28:09,609 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (org.ovirt.thread.pool-8-thread-4) [59328744] Correlation ID: 59328744, Call Stack: null, Custom Event ID: -1, Message: Host vcih3.pagasa-vci.lan installation failed. Command returned failure code 1 during SSH session 'root@10.11.41.7'.
2016-04-21 09:28:09,609 INFO  [org.ovirt.engine.core.bll.hostdeploy.InstallVdsInternalCommand] (org.ovirt.thread.pool-8-thread-4) [59328744] Lock freed to object 'EngineLock:{exclusiveLocks='[92380584-6896-42f5-b2ca-727840db7645=<VDS, ACTION_TYPE_FAILED_OBJECT_LOCKED>]', sharedLocks='null'}'

server.log
]# tail /var/log/ovirt-engine/server.log
        at io.undertow.server.Connectors.executeRootHandler(Connectors.java:199) [undertow-core-1.1.8.Final.jar:1.1.8.Final]
        at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:761) [undertow-core-1.1.8.Final.jar:1.1.8.Final]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [rt.jar:1.7.0_95]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [rt.jar:1.7.0_95]
        at java.lang.Thread.run(Thread.java:745) [rt.jar:1.7.0_95]

2016-04-21 09:27:33,289 INFO  [org.apache.sshd.client.session.ClientSessionImpl] (pool-18-thread-1) Client session created
2016-04-21 09:27:33,298 INFO  [org.apache.sshd.client.session.ClientSessionImpl] (pool-18-thread-1) Server version string: SSH-2.0-OpenSSH_5.3
2016-04-21 09:27:33,301 INFO  [org.apache.sshd.client.session.ClientSessionImpl] (pool-18-thread-2) Kex: server->client aes128-ctr hmac-sha2-256 none
2016-04-21 09:27:33,302 INFO  [org.apache.sshd.client.session.ClientSessionImpl] (pool-18-thread-2) Kex: client->server aes128-ctr hmac-sha2-256 none

host-deploy
# tail ovirt-host-deploy-20160421092809-10.11.41.7-59328744.log
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ***Q:STRING TERMINATION_COMMAND
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ###
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ### Processing ended, use 'quit' to quit
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ### COMMAND>
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:RECEIVE    noop
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ***Q:STRING TERMINATION_COMMAND
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ###
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ### Processing ended, use 'quit' to quit
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:SEND       ### COMMAND>
2016-04-21 09:28:14 DEBUG otopi.plugins.otopi.dialog.machine dialog.__logString:219 DIALOG:RECEIVE    log

I hope these logs help.
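
If the tails above are not detailed enough, I can also grep the full host-deploy log for the actual failure, e.g.:

# grep -iE 'error|fail' ovirt-host-deploy-20160421092809-10.11.41.7-59328744.log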

TIA

On Wed, Apr 20, 2016 at 2:06 PM, Yedidyah Bar David <didi@redhat.com> wrote:
On Wed, Apr 20, 2016 at 4:10 AM, Sandvik Agustin
<agustinsandvik@gmail.com> wrote:
> Hi Users,
>
> Good day. I'm having a problem adding hosts/hypervisors in ovirt-engine 3.6.
> A month ago I successfully added 2 hypervisors while the engine was still
> version 3.5. A week ago I upgraded the engine from 3.5 to 3.6, and yesterday
> I tried to add another hypervisor. I tried the ovirt 3.6 repo for the new
> hypervisor and it failed to add; I then tried the ovirt 3.5 repo and it
> still failed. The ovirt-engine Events tab shows "error code 1 during ssh
> session". I've googled it, and (correct me if I'm wrong) the hypervisor
> seems to have trouble resolving the server names listed inside the ovirt
> 3.5 or 3.6 repo files. When I took the URLs from "repolist" and tried to
> open them in my web browser, it shows
>
> "This site can’t be reached"
>
> Some of the sites in the repos are unreachable, or the directory path does
> not exist.
>
> Is there a problem with those repos, or is it just me?
>
> Is there any other way to install or add hosts when the repos or sites
> that hold the packages are having problems?

Not sure I fully understand the issue.

Naturally, if you want to be independent of the Internet, you can maintain
your own local mirror.
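
For example, a rough sketch with reposync/createrepo (the repo id 'ovirt-3.6' and the /var/www/html path below are just placeholders; adjust them to whatever 'yum repolist' reports on your mirror machine):

# yum install -y yum-utils createrepo
# reposync --repoid=ovirt-3.6 --download_path=/var/www/html/mirror
# createrepo /var/www/html/mirror/ovirt-3.6

Then point the new host at the mirror with a small .repo file, e.g.:

[ovirt-3.6-local]
name=Local oVirt 3.6 mirror
baseurl=http://your-mirror-host/mirror/ovirt-3.6
enabled=1
gpgcheck=0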

Perhaps you are trying to add an EL6 host? Please note that el6 is not
supported as a new host in 3.6. 3.6 can manage existing el6 hosts at the 3.5
compatibility level, but new hosts must be el7 (or Fedora).
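
If it helps, a quick way to check what a candidate host is actually running:

# cat /etc/redhat-release

For a host added to a 3.6 cluster it should report an EL7 (or Fedora) release.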

If it's something else, and/or the above is not helpful, please post links
to relevant logs - engine.log, server.log, host-deploy, etc. Thanks.
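
In case it is useful, these usually live on the engine machine, e.g.:

# tail -n 500 /var/log/ovirt-engine/engine.log
# tail -n 500 /var/log/ovirt-engine/server.log
# ls /var/log/ovirt-engine/host-deploy/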

Best,
--
Didi