[ovirt-users] Failed to check for available updates on host * with message 'Command returned failure code 1 during SSH session 'root@*

Edward Haas ehaas at redhat.com
Tue Sep 13 18:57:24 UTC 2016


On Tue, Sep 13, 2016 at 12:55 PM, Martin Perina <mperina at redhat.com> wrote:

>
>
> On Tue, Sep 13, 2016 at 10:59 AM, <aleksey.maksimov at it-kb.ru> wrote:
>
>> nslookup resolves names on the engine and on the hosts without returning an IPv6 address:
>>
>> # nslookup mirrorlist.centos.org
>>
>> Server:         10.1.0.10
>> Address:        10.1.0.10#53
>>
>> Non-authoritative answer:
>> Name:   mirrorlist.centos.org
>> Address: 67.219.148.138
>> Name:   mirrorlist.centos.org
>> Address: 85.236.43.108
>> Name:   mirrorlist.centos.org
>> Address: 212.69.166.138
>> Name:   mirrorlist.centos.org
>> Address: 216.176.179.218
>>
>> Why is the oVirt update-checking process trying to use IPv6?
>>
>>
>> 13.09.2016, 08:26, "aleksey.maksimov at it-kb.ru" <aleksey.maksimov at it-kb.ru>:
>> > Yesterday I changed the host settings to:
>> >
>> > net.ipv6.conf.all.disable_ipv6 = 1
>> > net.ipv6.conf.default.disable_ipv6 = 1
>> >
>> > I deleted the last two lines in /etc/sysctl.conf and rebooted the host.
>> >
>> > Tonight the problem repeated:
>> >
>> > 2016-09-13 01:11:13 ERROR otopi.context context._executeMethod:151 Failed to execute stage 'Environment packages setup': Cannot find a valid baseurl for repo: base/7/x86_64
>> > 2016-09-13 01:11:13 DEBUG otopi.transaction transaction.abort:119 aborting 'Yum Transaction'
>> > 2016-09-13 01:11:13 INFO otopi.plugins.otopi.packagers.yumpackager yumpackager.info:80 Yum Performing yum transaction rollback
>> > Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was
>> > 14: curl#7 - "Failed to connect to 2604:1580:fe02:2::10: Network is unreachable"
>> >
>> > Why is that?
>>
>
> Didi, any idea why otopi (or python underneath, not sure) is using IPv6
> addresses when IPv6 is disabled on the host?
>

Could you please try to resolve it with "getent ahosts <name>"?
Please also add the output of "ip addr"; if you see an IPv6 address there,
something is enabling IPv6 on that interface.
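For reference, "getent ahosts" resolves through getaddrinfo(), so the same check can be sketched in Python. This is only an illustrative sketch (not oVirt/otopi code); the "localhost" argument at the bottom is just an example, substitute the mirror hostname being debugged:

```python
# Sketch: show which address families the resolver offers for a name,
# similar to what "getent ahosts <name>" reports.
import socket

def resolved_families(hostname):
    """Return the set of address families ('IPv4'/'IPv6') the resolver offers."""
    families = set()
    for family, _type, _proto, _canon, _sockaddr in socket.getaddrinfo(hostname, None):
        if family == socket.AF_INET:
            families.add("IPv4")
        elif family == socket.AF_INET6:
            families.add("IPv6")
    return families

if __name__ == "__main__":
    # "localhost" is used only so the sketch runs anywhere; replace it with
    # e.g. the mirrorlist hostname when debugging.
    print(resolved_families("localhost"))
```

If the set includes 'IPv6' even though IPv6 is supposedly disabled, then the resolver path (DNS server, /etc/hosts, or nsswitch) is still handing out IPv6 addresses, which clients such as curl may then try first.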


>
>
>> >
>> > 12.09.2016, 17:34, "Martin Perina" <mperina at redhat.com>:
>> >> On Mon, Sep 12, 2016 at 3:34 PM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>> My /etc/sysctl.conf is:
>> >>>
>> >>> net.ipv6.conf.all.disable_ipv6 = 1
>> >>> net.ipv6.conf.default.disable_ipv6 = 1
>> >>> net.ipv6.conf.lo.disable_ipv6 = 0
>> >>> net.ipv6.conf.ovirtmgmt.disable_ipv6 = 0
>> >>
>> >> I'm not a networking expert, but the last two lines mean that you have IPv6 enabled on the loopback and ovirtmgmt network interfaces (but disabled on all others). So you shouldn't be surprised to get an IPv6 address from the DNS resolver if ovirtmgmt supports IPv6. Dan/Edward, am I right?
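To make the per-interface semantics concrete, here is an illustrative Python sketch (not part of oVirt or otopi) that parses sysctl.conf-style lines and shows which interfaces end up with IPv6 disabled:

```python
# Illustrative helper: map each interface named in
# net.ipv6.conf.<iface>.disable_ipv6 lines to whether IPv6 is disabled for it.
import re

def ipv6_disabled_map(sysctl_text):
    """Return {interface: True if disable_ipv6 = 1, False if = 0}."""
    result = {}
    pattern = re.compile(r"^net\.ipv6\.conf\.([^.]+)\.disable_ipv6\s*=\s*(\d+)$")
    for line in sysctl_text.splitlines():
        m = pattern.match(line.strip())
        if m:
            result[m.group(1)] = m.group(2) == "1"
    return result

# The exact four lines quoted above:
conf = """\
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 0
net.ipv6.conf.ovirtmgmt.disable_ipv6 = 0
"""

# 'lo' and 'ovirtmgmt' end up with disable_ipv6 = 0, i.e. IPv6 enabled there.
print(ipv6_disabled_map(conf))
```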
>> >>> The last two lines were added for http://lists.ovirt.org/pipermail/users/2016-July/041443.html
>> >>>
>> >>> Is my configuration file bad?
>> >>>
>> >>> 12.09.2016, 16:22, "Martin Perina" <mperina at redhat.com>:
>> >>>> On Mon, Sep 12, 2016 at 2:49 PM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>>>> OK. I found the log file /var/log/ovirt-engine/host-deploy/ovirt-host-mgmt-20160912020013-kom-ad01-vm31.holding.com-null.log with this:
>> >>>>>
>> >>>>> ...
>> >>>>> 2016-09-12 02:00:13 ERROR otopi.plugins.otopi.packagers.yumpackager yumpackager.error:85 Yum Cannot queue package iproute: Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.context context._executeMethod:142 method exception
>> >>>>> Traceback (most recent call last):
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/pythonlib/otopi/context.py", line 132, in _executeMethod
>> >>>>>     method['method']()
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/otopi-plugins/otopi/network/hostname.py", line 54, in _internal_packages
>> >>>>>     self.packager.install(packages=('iproute',))
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/otopi-plugins/otopi/packagers/yumpackager.py", line 295, in install
>> >>>>>     ignoreErrors=ignoreErrors
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/pythonlib/otopi/miniyum.py", line 851, in install
>> >>>>>     **kwargs
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/pythonlib/otopi/miniyum.py", line 495, in _queue
>> >>>>>     provides = self._queryProvides(packages=(package,))
>> >>>>>   File "/tmp/ovirt-B4lcSm14u9/pythonlib/otopi/miniyum.py", line 433, in _queryProvides
>> >>>>>     for po in self._yb.searchPackageProvides(args=packages):
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 3429, in searchPackageProvides
>> >>>>>     where = self.returnPackagesByDep(arg)
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 4255, in returnPackagesByDep
>> >>>>>     return self.pkgSack.searchProvides(depstring)
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 1079, in <lambda>
>> >>>>>     pkgSack = property(fget=lambda self: self._getSacks(),
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 784, in _getSacks
>> >>>>>     self.repos.populateSack(which=repos)
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/repos.py", line 344, in populateSack
>> >>>>>     self.doSetup()
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/repos.py", line 158, in doSetup
>> >>>>>     self.ayum.plugins.run('postreposetup')
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/plugins.py", line 188, in run
>> >>>>>     func(conduitcls(self, self.base, conf, **kwargs))
>> >>>>>   File "/usr/lib/yum-plugins/fastestmirror.py", line 197, in postreposetup_hook
>> >>>>>     if downgrade_ftp and _len_non_ftp(repo.urls) == 1:
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 871, in <lambda>
>> >>>>>     urls = property(fget=lambda self: self._geturls(),
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 868, in _geturls
>> >>>>>     self._baseurlSetup()
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 834, in _baseurlSetup
>> >>>>>     self.check()
>> >>>>>   File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 554, in check
>> >>>>>     'Cannot find a valid baseurl for repo: %s' % self.ui_id
>> >>>>> RepoError: Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>> 2016-09-12 02:00:13 ERROR otopi.context context._executeMethod:151 Failed to execute stage 'Environment packages setup': Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.transaction transaction.abort:119 aborting 'Yum Transaction'
>> >>>>> 2016-09-12 02:00:13 INFO otopi.plugins.otopi.packagers.yumpackager yumpackager.info:80 Yum Performing yum transaction rollback
>> >>>>> Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was
>> >>>>> 14: curl#7 - "Failed to connect to 2604:1580:fe02:2::10: Network is unreachable"
>> >>>>> Loaded plugins: fastestmirror
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.context context.dumpEnvironment:760 ENVIRONMENT DUMP - BEGIN
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.context context.dumpEnvironment:770 ENV BASE/error=bool:'True'
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.context context.dumpEnvironment:770 ENV BASE/exceptionInfo=list:'[(<class 'yum.Errors.RepoError'>, RepoError(), <traceback object at 0x7f7002252f80>)]'
>> >>>>> 2016-09-12 02:00:13 DEBUG otopi.context context.dumpEnvironment:774 ENVIRONMENT DUMP - END
>> >>>>> ...
>> >>>>>
>> >>>>> Is it trying to use an IPv6 connection?
>> >>>>> But I have no IPv6 networks, and my hosts/engine use only IPv4 settings.
>> >>>>
>> >>>> Hmm, it seems to me that you don't have IPv6 disabled properly, as your DNS resolver returns IPv6 addresses. Please take a look at:
>> >>>>
>> >>>> https://wiki.centos.org/FAQ/CentOS7#head-8984faf811faccca74c7bcdd74de7467f2fcd8ee
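For reference, the sysctl part of the linked FAQ amounts to a fragment like the following (an assumption of intent here: IPv6 off on every interface, with no later per-interface `disable_ipv6 = 0` exceptions, applied with `sysctl -p` or a reboot):

```
# /etc/sysctl.conf -- disable IPv6 everywhere; any subsequent
# net.ipv6.conf.<iface>.disable_ipv6 = 0 line would re-enable it
# for that interface only.
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
```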
>> >>>>
>> >>>>> 12.09.2016, 15:45, "Martin Perina" <mperina at redhat.com>:
>> >>>>>>> 2016-09-12 02:00:13,388 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Retrieving installation logs to: '/var/log/ovirt-engine/host-deploy/ovirt-host-mgmt-20160912020013-kom-ad01-vm31.holding.com-null.l$
>> >>>>>>
>> >>>>>> This is most probably the log file where details of the issue can be found
>> >>>>>>
>> >>>>>> On Mon, Sep 12, 2016 at 2:35 PM, Yedidyah Bar David <didi at redhat.com> wrote:
>> >>>>>>> On Mon, Sep 12, 2016 at 3:08 PM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>>>>>>>
>> >>>>>>>> Excuse me, I do not understand. What should I do?
>> >>>>>>>> The directory /var/log/ovirt-engine/host-deploy contains many different log files.
>> >>>>>>>
>> >>>>>>> Please check the one from the time when you see this error.
>> >>>>>>>
>> >>>>>>> Every run of host-deploy creates its own log file.
>> >>>>>>>
>> >>>>>>> Best,
>> >>>>>>>
>> >>>>>>>>
>> >>>>>>>> 12.09.2016, 14:59, "Yedidyah Bar David" <didi at redhat.com>:
>> >>>>>>>>> On Mon, Sep 12, 2016 at 11:17 AM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>>>>>>>>>  # yum install iproute
>> >>>>>>>>>>
>> >>>>>>>>>>  Loaded plugins: fastestmirror
>> >>>>>>>>>>  Loading mirror speeds from cached hostfile
>> >>>>>>>>>>   * base: mirror.awanti.com
>> >>>>>>>>>>   * elrepo: dfw.mirror.rackspace.com
>> >>>>>>>>>>   * epel: epel.besthosting.ua
>> >>>>>>>>>>   * extras: centos-mirror.rbc.ru
>> >>>>>>>>>>   * ovirt-4.0: ftp.nluug.nl
>> >>>>>>>>>>   * ovirt-4.0-epel: epel.besthosting.ua
>> >>>>>>>>>>   * updates: centos-mirror.rbc.ru
>> >>>>>>>>>>  Package iproute-3.10.0-54.el7_2.1.x86_64 already installed and latest version
>> >>>>>>>>>>  Nothing to do
>> >>>>>>>>>>
>> >>>>>>>>>>  # ls -la /etc/yum.repos.d/
>> >>>>>>>>>>
>> >>>>>>>>>>  total 76
>> >>>>>>>>>>  drwxr-xr-x. 2 root root 4096 Jul 26 14:08 .
>> >>>>>>>>>>  drwxr-xr-x. 102 root root 12288 Sep 12 11:00 ..
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1664 Dec 9 2015 CentOS-Base.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1309 Dec 9 2015 CentOS-CR.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 649 Dec 9 2015 CentOS-Debuginfo.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 290 Dec 9 2015 CentOS-fasttrack.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 630 Dec 9 2015 CentOS-Media.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1331 Dec 9 2015 CentOS-Sources.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1952 Dec 9 2015 CentOS-Vault.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 2150 May 21 2014 elrepo.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 957 Jun 3 16:52 epel.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1056 Jun 3 16:52 epel-testing.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 383 Jul 21 10:44 HP-MCP.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 446 Jul 21 10:45 HP-SPP-2014-06.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 417 Jul 21 10:44 HP-SPP-Current.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 1678 Jul 26 14:08 ovirt-4.0-dependencies.repo
>> >>>>>>>>>>  -rw-r--r--. 1 root root 289 Jul 26 14:08 ovirt-4.0.repo
>> >>>>>>>>>>
>> >>>>>>>>>>  Which repo file are you interested in?
>> >>>>>>>>>
>> >>>>>>>>> The one(s) that contain '[base]'. Also, you might find more details in the host-deploy log, in /var/log/ovirt-engine/host-deploy .
>> >>>>>>>>>
>> >>>>>>>>>>  12.09.2016, 11:14, "Yedidyah Bar David" <didi at redhat.com>:
>> >>>>>>>>>>>  On Mon, Sep 12, 2016 at 10:59 AM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>>>>>>>>>>>   Hi, Martin.
>> >>>>>>>>>>>>
>> >>>>>>>>>>>>   The command is executed successfully:
>> >>>>>>>>>>>>
>> >>>>>>>>>>>>   Updated:
>> >>>>>>>>>>>>     vdsm.x86_64 0:4.18.12-1.el7 vdsm-cli.noarch 0:4.18.12-1.el7
>> >>>>>>>>>>>>   Dependency Updated:
>> >>>>>>>>>>>>     vdsm-api.noarch 0:4.18.12-1.el7 vdsm-hook-vmfex-dev.noarch 0:4.18.12-1.el7 vdsm-infra.noarch 0:4.18.12-1.el7
>> >>>>>>>>>>>>     vdsm-jsonrpc.noarch 0:4.18.12-1.el7 vdsm-python.noarch 0:4.18.12-1.el7 vdsm-xmlrpc.noarch 0:4.18.12-1.el7
>> >>>>>>>>>>>>     vdsm-yajsonrpc.noarch 0:4.18.12-1.el7
>> >>>>>>>>>>>>   Complete!
>> >>>>>>>>>>>>
>> >>>>>>>>>>>>   12.09.2016, 10:47, "Martin Perina" <mperina at redhat.com>:
>> >>>>>>>>>>>>>   Hi Aleksey,
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   the error message is raised by the Host upgrade manager (more info at [1]), which checks for available package upgrades on the host.
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   It seems that the repository configuration of the host 'kom-ad01-vm31.holding.com' is somehow corrupted:
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   2016-09-12 02:00:13,300 ERROR [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Yum Cannot queue package iproute: Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>>>>>>>>>>   2016-09-12 02:00:13,304 ERROR [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Failed to execute stage 'Environment packages setup': Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   Could you please verify that you are able to successfully execute the following commands on the host?
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>     yum check-update
>> >>>>>>>>>>>>>     yum update vdsm vdsm-cli
>> >>>>>>>>>>>
>> >>>>>>>>>>>  Can you please try also:
>> >>>>>>>>>>>
>> >>>>>>>>>>>  yum install iproute
>> >>>>>>>>>>>
>> >>>>>>>>>>>  And also share your /etc/yum.repos.d/* ?
>> >>>>>>>>>>>
>> >>>>>>>>>>>  Thanks.
>> >>>>>>>>>>>
>> >>>>>>>>>>>>>   Thanks
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   Martin Perina
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   [1] http://www.ovirt.org/develop/release-management/features/engine/upgrademanager/
>> >>>>>>>>>>>>>
>> >>>>>>>>>>>>>   On Mon, Sep 12, 2016 at 9:20 AM, <aleksey.maksimov at it-kb.ru> wrote:
>> >>>>>>>>>>>>>>   Hello oVirt gurus!
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   oVirt Engine Version: 4.0.1.1-1.el7.centos
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   Every day there is a message in the web UI for one of my two hosts:
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   "Failed to check for available updates on host KOM-AD01-VM31 with message 'Command returned failure code 1 during SSH session 'root@kom-ad01-vm31.holding.com''.
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   In /var/log/ovirt-engine/engine.log at this time:
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   ...
>> >>>>>>>>>>>>>>   2016-09-12 01:59:45,045 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Stage: Initializing
>> >>>>>>>>>>>>>>   2016-09-12 01:59:45,055 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Stage: Environment setup
>> >>>>>>>>>>>>>>   2016-09-12 01:59:45,070 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Stage: Environment packages setup
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,300 ERROR [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Yum Cannot queue package iproute: Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,304 ERROR [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Failed to execute stage 'Environment packages setup': Cannot find a valid baseurl for repo: base/7/x86_64
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,304 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Yum Performing yum transaction rollback
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,368 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Stage: Pre-termination
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,388 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Retrieving installation logs to: '/var/log/ovirt-engine/host-deploy/ovirt-host-mgmt-20160912020013-kom-ad01-vm31.holding.com-null.l$
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,459 INFO [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (VdsDeploy) [] Stage: Termination
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,506 ERROR [org.ovirt.engine.core.uutils.ssh.SSHDialog] (DefaultQuartzScheduler5) [] SSH error running command root at kom-ad01-vm31.holding.com:'umask 0077; MYTMP="$(TMPDIR="${OVIRT_TMPDIR}" mktemp -d -t ovirt-XXXXXXXXXX)"; trap "chmod -R u+rwX \"${MYTMP}\" > /dev/null 2>&1; rm -fr \"${MYTMP}\" > /dev/null 2>&1" 0; tar --warning=no-timestamp -C "${MYTMP}" -x && "${MYTMP}"/ovirt-host-mgmt DIALOG/dialect=str:machine DIALOG/customization=bool:True': Command returned failure code 1 during SSH session 'root at kom-ad01-vm31.holding.com'
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,506 ERROR [org.ovirt.engine.core.uutils.ssh.SSHDialog] (DefaultQuartzScheduler5) [] Exception: java.io.IOException: Command returned failure code 1 during SSH session 'root at kom-ad01-vm31.holding.com'
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHClient.executeCommand(SSHClient.java:526) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHDialog.executeCommand(SSHDialog.java:317) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase.execute(VdsDeployBase.java:563) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.HostUpgradeManager.isUpdateAvailable(HostUpgradeManager.java:44) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.AvailableUpdatesFinder.isUpdateAvailable(AvailableUpdatesFinder.java:26) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.VdsEventListener.isUpdateAvailable(VdsEventListener.java:553) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.ResourceManager.isUpdateAvailable(ResourceManager.java:510) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.VdsManager.availableUpdates(VdsManager.java:306) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.invokeMethod(JobWrapper.java:77) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.execute(JobWrapper.java:51) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.quartz.core.JobRunShell.run(JobRunShell.java:213) [quartz.jar:]
>> >>>>>>>>>>>>>>           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.Thread.run(Thread.java:745) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,507 ERROR [org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase] (DefaultQuartzScheduler5) [] Error during host kom-ad01-vm31.holding.com install: java.io.IOException: Command returned failure code 1 during SSH session 'root at kom-ad01-vm31.holding.com'
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHClient.executeCommand(SSHClient.java:526) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHDialog.executeCommand(SSHDialog.java:317) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase.execute(VdsDeployBase.java:563) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.HostUpgradeManager.isUpdateAvailable(HostUpgradeManager.java:44) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.AvailableUpdatesFinder.isUpdateAvailable(AvailableUpdatesFinder.java:26) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.VdsEventListener.isUpdateAvailable(VdsEventListener.java:553) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.ResourceManager.isUpdateAvailable(ResourceManager.java:510) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.VdsManager.availableUpdates(VdsManager.java:306) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.invokeMethod(JobWrapper.java:77) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.execute(JobWrapper.java:51) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.quartz.core.JobRunShell.run(JobRunShell.java:213) [quartz.jar:]
>> >>>>>>>>>>>>>>           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.Thread.run(Thread.java:745) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,512 ERROR [org.ovirt.engine.core.bll.host.HostUpgradeManager] (DefaultQuartzScheduler5) [] Failed to refresh host 'KOM-AD01-VM31' packages 'vdsm-cli, vdsm' availability.
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,512 ERROR [org.ovirt.engine.core.bll.host.HostUpgradeManager] (DefaultQuartzScheduler5) [] Exception: java.io.IOException: Command returned failure code 1 during SSH session 'root at kom-ad01-vm31.holding.com'
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHClient.executeCommand(SSHClient.java:526) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.uutils.ssh.SSHDialog.executeCommand(SSHDialog.java:317) [uutils.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.hostdeploy.VdsDeployBase.execute(VdsDeployBase.java:563) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.HostUpgradeManager.isUpdateAvailable(HostUpgradeManager.java:44) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.host.AvailableUpdatesFinder.isUpdateAvailable(AvailableUpdatesFinder.java:26) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.bll.VdsEventListener.isUpdateAvailable(VdsEventListener.java:553) [bll.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.ResourceManager.isUpdateAvailable(ResourceManager.java:510) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.vdsbroker.VdsManager.availableUpdates(VdsManager.java:306) [vdsbroker.jar:]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.invokeMethod(JobWrapper.java:77) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.ovirt.engine.core.utils.timer.JobWrapper.execute(JobWrapper.java:51) [scheduler.jar:]
>> >>>>>>>>>>>>>>           at org.quartz.core.JobRunShell.run(JobRunShell.java:213) [quartz.jar:]
>> >>>>>>>>>>>>>>           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>           at java.lang.Thread.run(Thread.java:745) [rt.jar:1.8.0_101]
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,513 ERROR [org.ovirt.engine.core.vdsbroker.VdsManager] (DefaultQuartzScheduler5) [] Failed to check if updates are available for host 'KOM-AD01-VM31'
>> >>>>>>>>>>>>>>   2016-09-12 02:00:13,518 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (DefaultQuartzScheduler5) [] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: Failed to check for available updates on host KOM-AD01-VM31 with message 'Command returned failure code 1 during SSH session 'root at kom-ad01-vm31.holding.com''.
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   ...
>> >>>>>>>>>>>>>>
>> >>>>>>>>>>>>>>   What could be the problem?
>> >>>>>>>>>>>>>>   _______________________________________________
>> >>>>>>>>>>>>>>   Users mailing list
>> >>>>>>>>>>>>>>   Users at ovirt.org
>> >>>>>>>>>>>>>>   http://lists.ovirt.org/mailman/listinfo/users
>> >>>>>>>>>>>
>> >>>>>>>>>>>  --
>> >>>>>>>>>>>  Didi
>> >>>>>>>>>
>> >>>>>>>>> --
>> >>>>>>>>> Didi
>> >>>>>>>
>> >>>>>>> --
>> >>>>>>> Didi
>>
>
>

