Build failed in Jenkins: deploy-to_ovirt-master_tested #4157
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/4157/display/r...>
------------------------------------------
Started by upstream project "ovirt-appliance_master_build-artifacts-el7-x86_64" build number 843
originally caused by:
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on vm0061.workers-phx.ovirt.org (libvirt phx fc28 nested) in workspace <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/>
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-hASfehYV6P5u/agent.11952
SSH_AGENT_PID=11958
[ssh-agent] Started.
$ ssh-add <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/privat...>
Identity added: <http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/privat...> (<http://jenkins.ovirt.org/job/deploy-to_ovirt-master_tested/ws/@tmp/privat...)>
[ssh-agent] Using credentials deploy-ovirt-experimental (SSH key for deploying to the tested repo)
[deploy-to_ovirt-master_tested] $ /bin/bash -xe /tmp/jenkins8161833860203097377.sh
+ [[ http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x... == '' ]]
+ queue_name=ovirt-master
+ echo repo-extra-dir:master
+ ssh -o StrictHostKeyChecking=no deploy-ovirt-experimental@resources.ovirt.org
+ echo http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
Pseudo-terminal will not be allocated because stdin is not a terminal.
+ BASE_DIR=/srv/resources/repos/ovirt/tested
+ PUBLISH_MD_COPIES=50
+ main
+ local tmp_dir
+ mkdir -p /srv/resources/repos/ovirt/tested
++ mktemp -d /srv/resources/repos/ovirt/tested/.deploy.XXXXXXXXXX
Collecting packages
+ tmp_dir=/srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ trap 'rm -rf '\''/srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3'\''' EXIT HUP
+ echo 'Collecting packages'
+ collect_packages /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ local repoman_dst=/srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ repoman --temp-dir generate-in-repo --option main.allowed_repo_paths=/srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3 --option main.on_empty_source=warn /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3 add conf:stdin
2018-07-01 14:41:47,686::INFO::repoman.cmd::
2018-07-01 14:41:47,686::INFO::repoman.cmd::Adding artifacts to the repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
2018-07-01 14:41:47,687::INFO::repoman.common.repo::Adding repo extra dir master
2018-07-01 14:41:47,691::INFO::repoman.common.stores.RPM::Loading repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:41:47,691::INFO::repoman.common.stores.RPM::Repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master loaded
2018-07-01 14:41:47,694::INFO::repoman.common.stores.iso::Loading repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:41:47,695::INFO::repoman.common.stores.iso::Repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master loaded
2018-07-01 14:41:47,712::INFO::repoman.common.repo::Resolving artifact source http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
2018-07-01 14:41:47,885::INFO::repoman.common.sources.jenkins::Parsing jenkins URL: http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
2018-07-01 14:41:47,888::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
2018-07-01 14:41:47,888::INFO::repoman.common.sources.jenkins:: Got URL: http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x...
2018-07-01 14:41:47,892::INFO::root:: Done
2018-07-01 14:41:47,941::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x..., length 842M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-01 14:42:16,107::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/.lago_tmp/tmp9TI_N1/tmpop8PCF/ovirt-engine-appliance-4.3-20180701.1.el7.noarch.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:42:16,142::INFO::root::Downloading http://jenkins.ovirt.org/job/ovirt-appliance_master_build-artifacts-el7-x..., length 842M ...
%[-----------------------25------------------------50-----------------------75------------------------] %[====================================================================================================]
2018-07-01 14:42:45,484::INFO::repoman.common.stores.RPM::Adding package /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/.lago_tmp/tmp9TI_N1/tmpop8PCF/ovirt-engine-appliance-4.3-20180701.1.el7.src.rpm to repo /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:42:45,485::INFO::repoman.cmd::
2018-07-01 14:42:45,486::INFO::repoman.common.stores.RPM::Saving new added rpms into /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:42:45,486::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master/rpm/el7/noarch/ovirt-engine-appliance-4.3-20180701.1.el7.noarch.rpm
2018-07-01 14:42:45,488::INFO::root::Saving /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master/rpm/el7/SRPMS/ovirt-engine-appliance-4.3-20180701.1.el7.src.rpm
2018-07-01 14:42:45,489::INFO::repoman.common.stores.RPM::
2018-07-01 14:42:45,489::INFO::repoman.common.stores.RPM::Updating metadata
2018-07-01 14:42:45,489::INFO::repoman.common.stores.RPM:: Creating metadata for el7
2018-07-01 14:43:04,726::INFO::repoman.common.stores.RPM::
2018-07-01 14:43:04,728::INFO::repoman.common.stores.RPM::Creating symlinks
2018-07-01 14:43:04,729::INFO::repoman.common.stores.RPM::
2018-07-01 14:43:04,730::INFO::repoman.common.stores.RPM::Saved /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:43:04,730::INFO::repoman.common.stores.iso::Saving new added isos into /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:43:04,730::INFO::repoman.common.stores.iso::
2018-07-01 14:43:04,731::INFO::repoman.common.stores.iso::Saved /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/master
2018-07-01 14:43:04,733::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/.lago_tmp/tmp9TI_N1/tmpop8PCF
2018-07-01 14:43:04,734::INFO::repoman.common.repo::Cleaning up temporary dir /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3/.lago_tmp/tmp9TI_N1
Publishing to repo
+ echo 'Publishing to repo'
+ push_to_tested /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3 /srv/resources/repos/ovirt/tested
+ local pkg_src=/srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ local pkg_dst=/srv/resources/repos/ovirt/tested
+ cd /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ find . -type d '!' -name repodata
+ tac
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./.lago_tmp
+ find ./.lago_tmp -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./.lago_tmp
+ [[ -d ./.lago_tmp/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ find ./master/rpm/el7/noarch -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/noarch
+ [[ -d ./master/rpm/el7/noarch/repodata ]]
+ read dir
+ install -o deploy-ovirt-experimental -m 755 -d /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ find ./master/rpm/el7/SRPMS -maxdepth 1 '!' -type d -print0
+ xargs -0 -r cp -RPplf -t /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
+ [[ -d ./master/rpm/el7/SRPMS/repodata ]]
+ xargs -P 8 -r printf '%s\n' /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
+ comm -23 /dev/fd/63 /dev/fd/62
++ find /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS -name '*.rpm' -type f -mtime +14
++ sort
++ repomanage -k1 --new -c /srv/resources/repos/ovirt/tested/./master/rpm/el7/SRPMS
++ date +%Y-%m-%d-%H:%M
++ sort
++ date +%Y-%m-%d-%H:%M
/home/deploy-ovirt-experimental/deploy-to-tested.sh: line 54: /home/deploy-ovirt-experimental/logfiles/2018-07-01-14:43-/srv/resources/repos/ovirt/tested.log: No such file or directory
+ rm -rf /srv/resources/repos/ovirt/tested/.deploy.CrswqIXGt3
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 11958 killed;
[ssh-agent] Stopped.
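The actual failure is the "No such file or directory" line before the cleanup: deploy-to-tested.sh composes a logfile name from a timestamp plus the repo path, but the repo path still contains slashes, so the result (`.../logfiles/2018-07-01-14:43-/srv/resources/repos/ovirt/tested.log`) points into directories that do not exist. The script itself is not shown in this log, so the following is only a guess at the fix; `log_name` is a hypothetical helper, not a function from the real script:

```shell
# Hypothetical sketch of the suspected fix for deploy-to-tested.sh:
# embed the repo path in the logfile *name* with its slashes replaced,
# instead of letting them create bogus subdirectory components.
log_name() {
    local logdir=$1 repo=$2
    # ${repo//\//_} flattens /srv/foo/bar into _srv_foo_bar
    printf '%s/%s-%s.log' "$logdir" "$(date +%Y-%m-%d-%H:%M)" "${repo//\//_}"
}
```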
6 years, 5 months
Lack of resources on the ML server
by Marc Dequènes (Duck)
Quack,
Sandro noticed that some posts were not appearing promptly on the interface.
There's quite a lot of traffic and the server was not keeping up, so I added
a vCPU (2->3); that required restarting the VM, but the restart was quick.
I also noticed that some Mailman crontab maintenance jobs triggered the OOM
killer, so I added a bit more RAM (4GB->5GB).
Another thing to consider: we also filter outgoing mail, which is nice to
avoid being blacklisted if some bad actor subscribes and posts spammy
content. But it seems Postfix applies a recipient concurrency of one when
sending to the filter, which means that if a list has x subscribers, each
message is checked x times, which is silly. There is no such setting in the
configuration, so I'm looking into it. This probably affects other
installations too, but this one is especially heavily loaded…
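If the filter is wired in as a dedicated master.cf transport, Postfix's per-transport recipient limit is the usual knob for this. A hypothetical main.cf sketch, assuming the filter service is named `filter` (the actual service name and limits on the ML server are not shown in this thread):

```
# main.cf -- hypothetical tuning, not the server's actual config.
# By default Postfix may hand messages to a content filter one
# recipient at a time; raising the per-transport limit lets it batch
# recipients, so a post to a large list is filtered once, not x times.
filter_destination_recipient_limit = 100
```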
\_o<
Gerrit trying to set 3rd party cookies
by Nir Soffer
After watching Sarah Bird's great talk about the terrifying web [1], I found
that for some reason third-party cookies were enabled in my browser.
After disabling them, I found that Gerrit is using third-party cookies from
gravatar.com (see the attached screenshot).
Why do we allow 3rd parties like gravatar to set cookies?
Can we use gravatar without setting cookies?
[image: Screenshot from 2018-07-01 15-31-37.png]
[1] https://il.pycon.org/2018/schedule/presentation/18/
Nir
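For what it's worth, a Gravatar image URL is derived purely from an MD5 hash of the normalized email address, so avatars could be computed (or proxied) server-side without the browser ever contacting gravatar.com. A minimal sketch of the documented hashing scheme; the helper name and example address are made up:

```shell
gravatar_url() {
    # Gravatar's documented scheme: MD5 of the trimmed, lowercased address
    local email hash
    email=$(printf '%s' "$1" | tr -d '[:space:]' | tr '[:upper:]' '[:lower:]')
    hash=$(printf '%s' "$email" | md5sum | awk '{print $1}')
    printf 'https://www.gravatar.com/avatar/%s\n' "$hash"
}
```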
[JIRA] (OVIRT-1878) GitHub support for pusher.py
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1878?page=com.atlassian.jir... ]
Barak Korren commented on OVIRT-1878:
-------------------------------------
[~eedri] probably a similar flow, but likely a different API. And should be a different ticket.
> GitHub support for pusher.py
> ----------------------------
>
> Key: OVIRT-1878
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1878
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: pusher.py
> Reporter: Barak Korren
> Assignee: infra
> Labels: first_time_task, github
>
> '{{pusher.py}}' currently only supports Gerrit; in practice this means that automated polling for upstream source changes is only available for repos that are hosted in Gerrit.
> We need to support this in GitHub as well.
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100088)
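One note for whoever picks this up: the polling half need not be host-specific at all, since plain `git ls-remote` answers "what is the current head of this ref" for Gerrit, GitHub, and GitLab alike; only the push/webhook side differs per API. A sketch under that assumption — the function name is illustrative, not part of pusher.py:

```shell
# Host-agnostic change polling: ask the remote for the current sha of a
# ref and compare it with the last one seen. Works against any git
# server, so the GitHub/GitLab work reduces to the API-specific parts.
poll_head() {
    local repo_url=$1 ref=$2
    # ls-remote prints "<sha>\t<ref>"; keep only the sha
    git ls-remote "$repo_url" "$ref" | cut -f1
}
```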
[JIRA] (OVIRT-1878) GitHub support for pusher.py
by Eyal Edri (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1878?page=com.atlassian.jir... ]
Eyal Edri commented on OVIRT-1878:
----------------------------------
We need to support GitLab as well, as that is the standard for downstream projects that don't use Gerrit.
Is it the same as GitHub?
> GitHub support for pusher.py
> ----------------------------
>
> Key: OVIRT-1878
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1878
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: pusher.py
> Reporter: Barak Korren
> Assignee: infra
> Labels: first_time_task, github
>
> '{{pusher.py}}' currently only supports Gerrit; in practice this means that automated polling for upstream source changes is only available for repos that are hosted in Gerrit.
> We need to support this in GitHub as well.
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100088)
Build failed in Jenkins: system-sync_mirrors-centos-updates-el7-x86_64 #1710
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b3783cc795c2fdf2a3acf7aa33a80feab30134be (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f b3783cc795c2fdf2a3acf7aa33a80feab30134be
Commit message: "Add stdci v2 jobs for nsis-simple-service-plugin"
> git rev-list --no-walk b3783cc795c2fdf2a3acf7aa33a80feab30134be # timeout=10
[system-sync_mirrors-centos-updates-el7-x86_64] $ /bin/bash -xe /tmp/jenkins9084118679452512808.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror centos-updates-el7 x86_64 jenkins/data/mirrors-reposync.conf
Checking if mirror needs a resync
Traceback (most recent call last):
File "/usr/bin/reposync", line 343, in <module>
main()
File "/usr/bin/reposync", line 175, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1474, in _commonLoadRepoXML
if self._latestRepoXML(local):
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1452, in _latestRepoXML
repomd = self.metalink_data.repomd
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 918, in <lambda>
metalink_data = property(fget=lambda self: self._getMetalink(),
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 914, in _getMetalink
self._metalink = metalink.MetaLinkRepoMD(self.metalink_filename)
File "/usr/lib/python2.7/site-packages/yum/metalink.py", line 189, in __init__
raise MetaLinkRepoErrorParseFail, "File %s is not XML" % filename
yum.metalink.MetaLinkRepoErrorParseFail: File /home/jenkins/mirrors_cache/fedora-updates-fc26/metalink.xml is not XML
Build step 'Execute shell' marked build as failure
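The interesting detail above is that the job syncs centos-updates-el7 but trips over the cached metalink of fedora-updates-fc26, so the reposync cache under /home/jenkins/mirrors_cache is evidently shared across mirror jobs. A hedged recovery sketch — the helper is hypothetical and the paths come from the traceback: deleting the corrupt cache entry forces the next run to fetch a fresh metalink.

```shell
# Wipe a repo's reposync cache entry when its metalink is corrupt, so the
# next sync re-downloads it. Assumes the layout seen in the traceback:
# <cache_root>/<repo>/metalink.xml
clear_repo_cache() {
    local cache_root=$1 repo=$2
    if [ -e "$cache_root/$repo/metalink.xml" ]; then
        rm -rf "${cache_root:?}/${repo:?}"
    fi
}

# e.g.: clear_repo_cache /home/jenkins/mirrors_cache fedora-updates-fc26
```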
Re: [ OST Failure Report ] [ oVirt Master (ovirt-engine) ] [ 28-06-2018 ] [ 098_ovirt_provider_ovn.use_ovn_provider.]
by Dafna Ron
Thanks Alona,
Can you please update me once you have a fix?
Thanks,
Dafna
On Thu, Jun 28, 2018 at 10:28 AM, Alona Kaplan <alkaplan@redhat.com> wrote:
> Hi,
> I'm aware of the error. Francesco and I are working on it.
>
> Thanks,
> Alona.
>
> On Thu, Jun 28, 2018, 12:23 Dafna Ron <dron@redhat.com> wrote:
>
>> ovirt-hosted-engine-ha failed on the same issue as well.
>>
>> On Thu, Jun 28, 2018 at 10:07 AM, Dafna Ron <dron@redhat.com> wrote:
>>
>>> Hi,
>>>
>>> We had a failure in test 098_ovirt_provider_ovn.use_ovn_provider.
>>>
>>> Although CQ is pointing to this change: https://gerrit.ovirt.org/#/c/92567/
>>> - packaging: Add python-netaddr requirement - I actually think, judging
>>> from the error, that it was caused by the changes made to multi-queues:
>>>
>>> https://gerrit.ovirt.org/#/c/92009/ - engine: Update libvirtVmXml to consider vmBase.multiQueuesEnabled attribute
>>> https://gerrit.ovirt.org/#/c/92008/ - engine: Introduce algorithm for calculating how many queues asign per vnic
>>> https://gerrit.ovirt.org/#/c/92007/ - engine: Add multiQueuesEnabled to VmBase
>>> https://gerrit.ovirt.org/#/c/92318/ - restapi: Add 'Multi Queues Enabled' to the relevant mappers
>>> https://gerrit.ovirt.org/#/c/92149/ - webadmin: Add 'Multi Queues Enabled' to vm dialog
>>>
>>> Alona, can you please take a look?
>>>
>>>
>>> *Link to Job:*
>>> http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/8375/
>>>
>>> *Link to all logs:*
>>> https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/8375/artif...
>>>
>>> *(Relevant) error snippet from the log: <error>*
>>>
>>> *engine:*
>>>
>>> 2018-06-27 13:59:25,976-04 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.GetAllVmStatsVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-80) [] Command 'GetAllVmStatsVDSCommand(HostName = lago-basic-suite-master-host-1, VdsIdVDSCommandParametersBase:{hostId='d9094c95-3275-4616-b4c2-815e753bcfed'})' execution failed: VDSGenericException: VDSNetworkException: Broken pipe
>>> 2018-06-27 13:59:25,977-04 DEBUG [org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil] (EE-ManagedThreadFactory-engine-Thread-442) [] Executing task: EE-ManagedThreadFactory-engine-Thread-442
>>> 2018-06-27 13:59:25,977-04 DEBUG [org.ovirt.engine.core.common.di.interceptor.DebugLoggingInterceptor] (EE-ManagedThreadFactory-engine-Thread-442) [] method: getVdsManager, params: [d9094c95-3275-4616-b4c2-815e753bcfed], timeElapsed: 0ms
>>> 2018-06-27 13:59:25,977-04 WARN [org.ovirt.engine.core.vdsbroker.VdsManager] (EE-ManagedThreadFactory-engine-Thread-442) [] Host 'lago-basic-suite-master-host-1' is not responding.
>>> 2018-06-27 13:59:25,979-04 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engineScheduled-Thread-63) [] EVENT_ID: VDS_BROKER_COMMAND_FAILURE(10,802), VDSM lago-basic-suite-master-host-1 command GetStatsAsyncVDS failed: Broken pipe
>>> 2018-06-27 13:59:25,976-04 DEBUG [org.ovirt.engine.core.vdsbroker.vdsbroker.GetAllVmStatsVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-80) [] Exception: org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException: VDSGenericException: VDSNetworkException: Broken pipe
>>>     at org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase.proceedProxyReturnValue(BrokerCommandBase.java:189) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.vdsbroker.GetAllVmStatsVDSCommand.executeVdsBrokerCommand(GetAllVmStatsVDSCommand.java:23) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.vdsbroker.VdsBrokerCommand.executeVdsCommandWithNetworkEvent(VdsBrokerCommand.java:123) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.vdsbroker.VdsBrokerCommand.executeVDSCommand(VdsBrokerCommand.java:111) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.VDSCommandBase.executeCommand(VDSCommandBase.java:65) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.dal.VdcCommandBase.execute(VdcCommandBase.java:31) [dal.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.vdsbroker.DefaultVdsCommandExecutor.execute(DefaultVdsCommandExecutor.java:14) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.ResourceManager.runVdsCommand(ResourceManager.java:399) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.ResourceManager$Proxy$_$$_WeldSubclass.runVdsCommand$$super(Unknown Source) [vdsbroker.jar:]
>>>     at sun.reflect.GeneratedMethodAccessor270.invoke(Unknown Source) [:1.8.0_171]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_171]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_171]
>>>     at org.jboss.weld.interceptor.proxy.TerminalAroundInvokeInvocationContext.proceedInternal(TerminalAroundInvokeInvocationContext.java:49) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.interceptor.proxy.AroundInvokeInvocationContext.proceed(AroundInvokeInvocationContext.java:77) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.ovirt.engine.core.common.di.interceptor.LoggingInterceptor.apply(LoggingInterceptor.java:12) [common.jar:]
>>>     at sun.reflect.GeneratedMethodAccessor68.invoke(Unknown Source) [:1.8.0_171]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_171]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_171]
>>>     at org.jboss.weld.interceptor.reader.SimpleInterceptorInvocation$SimpleMethodInvocation.invoke(SimpleInterceptorInvocation.java:73) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.interceptor.proxy.InterceptorMethodHandler.executeAroundInvoke(InterceptorMethodHandler.java:84) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.interceptor.proxy.InterceptorMethodHandler.executeInterception(InterceptorMethodHandler.java:72) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.interceptor.proxy.InterceptorMethodHandler.invoke(InterceptorMethodHandler.java:56) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.bean.proxy.CombinedInterceptorAndDecoratorStackMethodHandler.invoke(CombinedInterceptorAndDecoratorStackMethodHandler.java:79) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.jboss.weld.bean.proxy.CombinedInterceptorAndDecoratorStackMethodHandler.invoke(CombinedInterceptorAndDecoratorStackMethodHandler.java:68) [weld-core-impl-2.4.3.Final.jar:2.4.3.Final]
>>>     at org.ovirt.engine.core.vdsbroker.ResourceManager$Proxy$_$$_WeldSubclass.runVdsCommand(Unknown Source) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.monitoring.VmsStatisticsFetcher.poll(VmsStatisticsFetcher.java:29) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.monitoring.VmsListFetcher.fetch(VmsListFetcher.java:49) [vdsbroker.jar:]
>>>     at org.ovirt.engine.core.vdsbroker.monitoring.PollVmStatsRefresher.poll(PollVmStatsRefresher.java:44) [vdsbroker.jar:]
>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_171]
>>>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [rt.jar:1.8.0_171]
>>>     at org.glassfish.enterprise.concurrent.internal.ManagedScheduledThreadPoolExecutor$ManagedScheduledFutureTask.access$201(ManagedScheduledThreadPoolExecutor.java:383) [javax.enterprise.concurrent-1.0.jar:]
>>>     at org.glassfish.enterprise.concurrent.internal.ManagedScheduledThreadPoolExecutor$ManagedScheduledFutureTask.run(ManagedScheduledThreadPoolExecutor.java:534) [javax.enterprise.concurrent-1.0.jar:]
>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [rt.jar:1.8.0_171]
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [rt.jar:1.8.0_171]
>>>     at java.lang.Thread.run(Thread.java:748) [rt.jar:1.8.0_171]
>>>     at org.glassfish.enterprise.concurrent.ManagedThreadFactoryImpl$ManagedThread.run(ManagedThreadFactoryImpl.java:250) [javax.enterprise.concurrent-1.0.jar:]
>>>     at org.jboss.as.ee.concurrent.service.ElytronManagedThreadFactory$ElytronManagedThread.run(ElytronManagedThreadFactory.java:78)
>>>
>>> 2018-06-27 13:59:25,984-04 DEBUG [org.ovirt.engine.core.vdsbroker.vdsbroker.GetAllVmStatsVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-80) [] FINISH, GetAllVmStatsVDSCommand, return: , log id: 56d99e77
>>> 2018-06-27 13:59:25,984-04 DEBUG [org.ovirt.engine.core.common.di.interceptor.DebugLoggingInterceptor] (EE-ManagedThreadFactory-engineScheduled-Thread-80) [] method: runVdsCommand, params: [GetAllVmStats, VdsIdVDSCommandParametersBase:{hostId='d9094c95-3275-4616-b4c2-815e753bcfed'}], timeElapsed: 1497ms
>>> 2018-06-27 13:59:25,984-04 INFO [org.ovirt.engine.core.vdsbroker.monitoring.PollVmStatsRefresher] (EE-ManagedThreadFactory-engineScheduled-Thread-80) [] Failed to fetch vms info for host 'lago-basic-suite-master-host-1' - skipping VMs monitoring.
>>>
>>> *vdsm:*
>>>
>>> 2018-06-27 14:10:17,314-0400 INFO (jsonrpc/7) [virt.vm] (vmId='b8a11304-07e3-4e64-af35-7421be780d5b') Hotunplug NIC xml: <?xml version='1.0' encoding='utf-8'?><interface type="bridge"> <address bus="0x00" domain="0x0000" function="0x0" slot="0x0b" type="pci" /> <mac address="00:1a:4a:16:01:0e" /> <model type="virtio" /> <source bridge="network_1" /> <link state="up" /> <driver name="vhost" queues="" /> <alias name="ua-3c77476f-f194-476a-8412-d76a9e58d1f9" /></interface> (vm:3321)
>>> 2018-06-27 14:10:17,328-0400 ERROR (jsonrpc/7) [virt.vm] (vmId='b8a11304-07e3-4e64-af35-7421be780d5b') Hotunplug failed (vm:3353)
>>> Traceback (most recent call last):
>>>   File "/usr/lib/python2.7/site-packages/vdsm/virt/vm.py", line 3343, in hotunplugNic
>>>     self._dom.detachDevice(nicXml)
>>>   File "/usr/lib/python2.7/site-packages/vdsm/virt/virdomain.py", line 99, in f
>>>     ret = attr(*args, **kwargs)
>>>   File "/usr/lib/python2.7/site-packages/vdsm/common/libvirtconnection.py", line 131, in wrapper
>>>     ret = f(*args, **kwargs)
>>>   File "/usr/lib/python2.7/site-packages/vdsm/common/function.py", line 93, in wrapper
>>>     return func(inst, *args, **kwargs)
>>>   File "/usr/lib64/python2.7/site-packages/libvirt.py", line 1177, in detachDevice
>>>     if ret == -1: raise libvirtError ('virDomainDetachDevice() failed', dom=self)
>>> libvirtError: 'queues' attribute must be positive number:
>>> 2018-06-27 14:10:17,345-0400 DEBUG (jsonrpc/7) [api] FINISH hotunplugNic response={'status': {'message': "'queues' attribute must be positive number: ", 'code': 50}} (api:136)
>>> 2018-06-27 14:10:17,346-0400 INFO (jsonrpc/7) [api.virt] FINISH hotunplugNic return={'status': {'message': "'queues' attribute must be positive number: ", 'code': 50}} from=::ffff:192.168.201.4,32976, flow_id=ecb6652, vmId=b8a11304-07e3-4e64-af35-7421be780d5b (api:53)
>>> 2018-06-27 14:10:17,346-0400 INFO (jsonrpc/7) [jsonrpc.JsonRpcServer] RPC call VM.hotunplugNic failed (error 50) in 0.07 seconds (__init__:311)
>>> 2018-06-27 14:10:19,244-0400 DEBUG (qgapoller/2) [vds] Not sending QEMU-GA command 'guest-get-users' to vm_id='b8a11304-07e3-4e64-af35-7421be780d5b', command is not supported (qemuguestagent:192)
>>> 2018-06-27 14:10:20,038-0400 DEBUG (jsonrpc/1) [jsonrpc.JsonRpcServer] Calling 'Host.getAllVmStats' in bridge with {} (__init__:328)
>>> 2018-06-27 14:10:20,038-0400 INFO (jsonrpc/1) [api.host] START getAllVmStats() from=::1,48032 (api:47)
>>> 2018-06-27 14:10:20,041-0400 INFO (jsonrpc/1) [api.host] FINISH getAllVmStats return={'status': {'message': 'Done', 'code': 0}, 'statsList': (suppressed)} from=::1,48032 (api:53)
>>> 2018-06-27 14:10:20,043-0400 DEBUG (jsonrpc/1) [jsonrpc.JsonRpcServer] Return 'Host.getAllVmStats' in bridge with (suppressed) (__init__:355)
>>> 2018-06-27 14:10:20,043-0400 INFO (jsonrpc/1) [jsonrpc.JsonRpcServer] RPC call Host.getAllVmStats succeeded in 0.00 seconds (__init__:311)
>>> 2018-06-27 14:10:20,057-0400 DEBUG (jsonrpc/6) [jsonrpc.JsonRpcServer] Calling 'Host.getAllVmIoTunePolicies' in bridge with {} (__init__:328)
>>> 2018-06-27 14:10:20,058-0400 INFO (jsonrpc/6) [api.host] START getAllVmIoTunePolicies() from=::1,48032 (api:47)
>>> 2018-06-27 14:10:20,058-0400 INFO (jsonrpc/6) [api.host] FINISH getAllVmIoTunePolicies return={'status': {'message': 'Done', 'code': 0}, 'io_tune_policies_dict': {'b8a11304-07e3-4e64-af35-7421be780d5b': {'policy': [], 'current_values': [{'ioTune': {'write_bytes_sec': 0L, 'total_iops_sec': 0L, 'read_iops_sec': 0L, 'read_bytes_sec': 0L, 'write_iops_sec': 0L, 'total_bytes_sec': 0L}, 'path': '/rhev/data-center/mnt/blockSD/cf23ceeb-81a3-4714-85a0-c6ddd1e024da/images/650fe4ae-47a1-4f2d-9cba-1617a8c868c3/03e75c3c-24e7-4e68-a6f1-21728aaaa73e', 'name': 'vda'}]}}} from=::1,48032 (api:53)
>>> 2018-06-27 14:10:20,059-0400 DEBUG (jsonrpc/6) [jsonrpc.JsonRpcServer] Return 'Host.getAllVmIoTunePolicies' in bridge with {'b8a11304-07e3-4e64-af35-7421be780d5b': {'policy': [], 'current_values': [{'ioTune': {'write_bytes_sec': 0L, 'total_iops_sec': 0L, 'read_iops_sec': 0L, 'read_bytes_sec': 0L, 'write_iops_sec': 0L, 'total_bytes_sec': 0L}, 'path': '/rhev/data-center/mnt/blockSD/cf23ceeb-81a3-4714-85a0-c6ddd1e024da/images/650fe4ae-47a1-4f2d-9cba-1617a8c868c3/03e75c3c-24e7-4e68-a6f1-21728aaaa73e', 'name': 'vda'}]}} (__init__:355)
>>> </error>
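The libvirt error pins the root cause: the hotunplug XML carries `queues=""` in the `<driver>` element, and libvirt requires a positive number there. Below is a sketch of the kind of guard that would avoid the failing call; `sanitize_queues` is hypothetical shell for illustration, not vdsm or engine code (the real fix belongs where the device XML is generated):

```shell
# Drop an empty or non-positive queues attribute from NIC XML before
# handing it to libvirt, which rejects <driver ... queues=""/>.
sanitize_queues() {
    local xml=$1
    if grep -q 'queues="' <<<"$xml"; then
        local q
        q=$(sed -n 's/.*queues="\([^"]*\)".*/\1/p' <<<"$xml")
        if ! [[ $q =~ ^[1-9][0-9]*$ ]]; then
            # strip the invalid attribute entirely
            xml=$(sed 's/ queues="[^"]*"//' <<<"$xml")
        fi
    fi
    printf '%s\n' "$xml"
}
```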