[JIRA] (OVIRT-1910) Strange Lago CI test failure
by Yaniv Kaul (oVirt JIRA)
Yaniv Kaul created OVIRT-1910:
---------------------------------
Summary: Strange Lago CI test failure
Key: OVIRT-1910
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1910
Project: oVirt - virtualization made easy
Issue Type: By-EMAIL
Reporter: Yaniv Kaul
Assignee: infra
My patch[1] is failing, but I'm not sure why.
See [2] for the job - the issue seems to be with the pusher - see its
log [3]:
2018-02-23 12:42:16,222:DEBUG:__main__:Executing command: 'git log -1 --pretty=format:%H'
2018-02-23 12:42:16,230:DEBUG:__main__:Git exited with status: 0
2018-02-23 12:42:16,230:DEBUG:__main__: ---- stderr ----
2018-02-23 12:42:16,230:DEBUG:__main__: ---- stdout ----
2018-02-23 12:42:16,230:DEBUG:__main__:
c26adc01449d97899aac18406298374de0e7dd73
2018-02-23 12:42:16,231:DEBUG:__main__:Executing command: 'git rev-parse jenkins-lago_master_github_check-patch-el7-x86_64-711^{commit}'
2018-02-23 12:42:16,237:DEBUG:__main__:Git exited with status: 128
2018-02-23 12:42:16,237:DEBUG:__main__: ---- stderr ----
2018-02-23 12:42:16,237:DEBUG:__main__: fatal: ambiguous argument 'jenkins-lago_master_github_check-patch-el7-x86_64-711^{commit}': unknown revision or path not in the working tree.
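For reference, the same failure mode can be reproduced in any clone that does not have the tag the pusher asks for - a diagnostic sketch, not output from this job:

# rev-parse of a ref that is not present in the local clone exits with
# status 128 and the same "ambiguous argument" error seen above:
git rev-parse jenkins-lago_master_github_check-patch-el7-x86_64-711^{commit}
echo $?    # prints 128 when the ref is unknown

# Listing the jenkins-* tags the clone actually has would show whether the
# expected tag was ever fetched/created there:
git tag -l 'jenkins-lago_master_github_check-patch-el7-x86_64-*'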
Any ideas?
TIA,
Y.
[1] https://github.com/lago-project/lago/pull/695
[2]
http://jenkins.ovirt.org/job/lago_master_github_check-patch-el7-x86_64/711/
[3]
http://jenkins.ovirt.org/job/lago_master_github_check-patch-el7-x86_64/71...
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100080)
The importance of fixing failed build-artifacts jobs
by Dafna Ron
Hi All,
We have recently been seeing a large number of changes that are not deployed
to 'tested' because of failed build-artifacts jobs, so we decided we should
explain why it is important to fix a failed build-artifacts job.
If a change fails a build-artifacts job, no matter which platform/arch it
fails on, the change will not be deployed to 'tested'.
Here is an example of a change that will not be added to tested:
[image: Inline image 1]
As you can see, only one of the build-artifacts jobs failed, but since the
project specifies that it requires all of these arches/platforms, the change
will not be added to 'tested' until all of the jobs are fixed.
So what can we do?
1. Add the code that builds the artifacts to 'check-patch' so you'll get a -1
if the build fails (assuming you will not merge with a -1 from CI) - see the
sketch after this list.
2. Post merge - look for emails about failed artifacts for your change (you
will have to fix the job and then re-trigger the change).
3. You can see all currently broken build-artifacts jobs in Jenkins under the
'unstable critical' view [1], so you will know whether your project is being
deployed.
4. Remove the broken OS from your project (either from Jenkins or from
your automation dir if you're using V2) - ask us for help! This should be
an easy patch.
5. Don't add new OS builds until you're absolutely sure they work (you can
add them to check-patch to keep testing, but don't add build-artifacts until
they are stable).
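A rough sketch of option 1, assuming the standard-CI layout where a project keeps its scripts under automation/ (the script names below are an assumption - adjust them to whatever your project actually uses):

#!/bin/bash -xe
# automation/check-patch.sh (sketch, assumed layout)
# Run the same build that build-artifacts runs, so a broken build surfaces
# as a -1 on the patch instead of only after it is merged.
"$(dirname "$0")/build-artifacts.sh"

# ...then run the project's usual check-patch tests...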
Please contact me or anyone else from the CI team with questions or for
assistance; we would be happy to help.
[1] http://jenkins.ovirt.org/
Thank you,
Dafna
Build failed in Jenkins: system-sync_mirrors-centos-updates-el7-x86_64 #1305
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-updates-el7-x86_6...>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 969a8aab0fd2334b47c2c55e7867b2f4248a8b90 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 969a8aab0fd2334b47c2c55e7867b2f4248a8b90
Commit message: "ovirt-image-uploader: drop master jobs"
> git rev-list --no-walk 969a8aab0fd2334b47c2c55e7867b2f4248a8b90 # timeout=10
[system-sync_mirrors-centos-updates-el7-x86_64] $ /bin/bash -xe /tmp/jenkins606789742611228443.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror centos-updates-el7 x86_64 jenkins/data/mirrors-reposync.conf
Checking if mirror needs a resync
Traceback (most recent call last):
  File "/usr/bin/reposync", line 343, in <module>
    main()
  File "/usr/bin/reposync", line 175, in main
    my.doRepoSetup()
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
    return self._getRepos(thisrepo, True)
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
    self._repos.doSetup(thisrepo)
  File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
    self.retrieveAllMD()
  File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
    dl = repo._async and repo._commonLoadRepoXML(repo)
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1474, in _commonLoadRepoXML
    if self._latestRepoXML(local):
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1452, in _latestRepoXML
    repomd = self.metalink_data.repomd
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 918, in <lambda>
    metalink_data = property(fget=lambda self: self._getMetalink(),
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 914, in _getMetalink
    self._metalink = metalink.MetaLinkRepoMD(self.metalink_filename)
  File "/usr/lib/python2.7/site-packages/yum/metalink.py", line 189, in __init__
    raise MetaLinkRepoErrorParseFail, "File %s is not XML" % filename
yum.metalink.MetaLinkRepoErrorParseFail: File /home/jenkins/mirrors_cache/fedora-base-fcraw/metalink.xml is not XML
Build step 'Execute shell' marked build as failure
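A possible remediation sketch - this is an assumption based on the traceback above, not something stated in the build log: the cached metalink for the fedora-base-fcraw repo is not valid XML, so inspecting and removing the cached copy would force reposync to fetch a fresh one on the next run.

# Hypothetical manual cleanup on the mirrors host; path taken from the traceback.
CACHE_METALINK=/home/jenkins/mirrors_cache/fedora-base-fcraw/metalink.xml
head -n 3 "$CACHE_METALINK"   # a corrupt copy is often an HTML error page or an empty file
rm -f "$CACHE_METALINK"       # the next reposync run will re-download the metalink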
OST Failure - Weekly update [17/02/2018-23/02/2018]
by Dafna Ron
Hello,
I would like to give an update on this week's failures and the current OST
status.
I am happy to say that at the end of this week we do not have any failed
changes.
We had 5 reported failed changes this week:
1. replace vdsm stats with collectd virt plugin -
https://gerrit.ovirt.org/#/c/87310/
- Date: 20-02-2018
- Version: oVirt 4.2
- Test failed: 003_00_metrics_bootstrap.metrics_and_log_collector
- Fix: fixed bug in initial validations -
https://gerrit.ovirt.org/#/c/87919/
2. replace vdsm stats with collectd virt plugin -
https://gerrit.ovirt.org/#/c/87310/
- Date: 21-02-2018
- Version: oVirt Master
- Test failed: 098_ovirt_provider_ovn.use_ovn_provider
- NO FIX WAS REPORTED
3. ansible: End playbook based on initial validations -
https://gerrit.ovirt.org/#/c/88062/
- Date: 22-02-2018
- Version: oVirt Master
- Test failed: 003_00_metrics_bootstrap.metrics_and_log_collector
- Fix: bug reported: https://bugzilla.redhat.com/show_bug.cgi?id=1548087
4. momIF: change the way we connect to MOM -
https://gerrit.ovirt.org/#/c/87944/
- Date: 22-02-2018
- Version: oVirt Master
- Test failed: 002_bootstrap.verify_add_hosts + 002_bootstrap.add_hosts
- Fix: the change was reverted until it is fixed
5. Require collectd-virt plugin - https://gerrit.ovirt.org/#/c/87311/
- Date: 22-02-2018
- Version: oVirt Master
- Test failed: 002_bootstrap.add_hosts
- Fix: Require collectd-virt plugin - https://gerrit.ovirt.org/#/c/87311/
*Below you can see the chart for this week's resolved issues by cause of
failure:*
Code = regression of working components/functionalities
Configurations = package-related issues
Other = failed build-artifacts
Infra = infrastructure/OST/Lago-related issues
[image: Inline image 1]
[image: Inline image 2]
*Below is a chart of resolved failures based on ovirt version*
*[image: Inline image 3]*
*[image: Inline image 4]*
*Below is a chart showing failures by suite type:*
* Suite type "None" means a failure that did not result in a failed test,
such as an artifacts-related or packaging-related failure.
[image: Inline image 5]
We are currently working on creating more statistics and defining what kind
of information would be interesting to present. If there is anything specific
you would like to see, please let us know and we will add it to our plans.
Many thanks,
Dafna Ron