Planned Gerrit downtime
by Evgheni Dereveanchin
Hi everyone,
I will be migrating gerrit.ovirt.org to a different VM tonight, so it will
be unavailable for several hours while the migration copies large amounts
of data.
During that window it will not be possible to pull from or push to any
repos, nor to review changes.
I'll follow up as soon as the migration is complete.
--
Regards,
Evgheni Dereveanchin
5 years, 4 months
Change in stdci-staging[master]: just a change
by review@gerrit-staging.phx.ovirt.org
From Barak Korren <bkorren(a)redhat.com>:
Barak Korren has posted comments on this change. ( https://gerrit-staging.phx.ovirt.org/215 )
Change subject: just a change
......................................................................
Patch Set 1:
ci test please
--
To view, visit https://gerrit-staging.phx.ovirt.org/215
To unsubscribe, visit https://gerrit-staging.phx.ovirt.org/settings
Gerrit-Project: stdci-staging
Gerrit-Branch: master
Gerrit-MessageType: comment
Gerrit-Change-Id: I6ad5ccb6da74775cd59597274439129b0e398609
Gerrit-Change-Number: 215
Gerrit-PatchSet: 1
Gerrit-Owner: Daniel Belenky <dbelenky(a)redhat.com>
Gerrit-Reviewer: Barak Korren <bkorren(a)redhat.com>
Gerrit-Reviewer: Jenkins CI <infra(a)ovirt.org>
Gerrit-Comment-Date: Mon, 10 Jun 2019 07:31:59 +0000
Gerrit-HasComments: No
Change in stdci-staging[master]: just a change
by review@gerrit-staging.phx.ovirt.org
From Barak Korren <bkorren(a)redhat.com>:
Barak Korren has posted comments on this change. ( https://gerrit-staging.phx.ovirt.org/215 )
Change subject: just a change
......................................................................
Patch Set 1:
ci please test
--
To view, visit https://gerrit-staging.phx.ovirt.org/215
To unsubscribe, visit https://gerrit-staging.phx.ovirt.org/settings
Gerrit-Project: stdci-staging
Gerrit-Branch: master
Gerrit-MessageType: comment
Gerrit-Change-Id: I6ad5ccb6da74775cd59597274439129b0e398609
Gerrit-Change-Number: 215
Gerrit-PatchSet: 1
Gerrit-Owner: Daniel Belenky <dbelenky(a)redhat.com>
Gerrit-Reviewer: Barak Korren <bkorren(a)redhat.com>
Gerrit-Reviewer: Jenkins CI <infra(a)ovirt.org>
Gerrit-Comment-Date: Mon, 10 Jun 2019 07:28:44 +0000
Gerrit-HasComments: No
Change in stdci-staging[master]: just a change
by review@gerrit-staging.phx.ovirt.org
From Barak Korren <bkorren(a)redhat.com>:
Barak Korren has posted comments on this change. ( https://gerrit-staging.phx.ovirt.org/215 )
Change subject: just a change
......................................................................
Patch Set 1:
ci test please
--
To view, visit https://gerrit-staging.phx.ovirt.org/215
To unsubscribe, visit https://gerrit-staging.phx.ovirt.org/settings
Gerrit-Project: stdci-staging
Gerrit-Branch: master
Gerrit-MessageType: comment
Gerrit-Change-Id: I6ad5ccb6da74775cd59597274439129b0e398609
Gerrit-Change-Number: 215
Gerrit-PatchSet: 1
Gerrit-Owner: Daniel Belenky <dbelenky(a)redhat.com>
Gerrit-Reviewer: Barak Korren <bkorren(a)redhat.com>
Gerrit-Reviewer: Jenkins CI <infra(a)ovirt.org>
Gerrit-Comment-Date: Mon, 10 Jun 2019 07:23:26 +0000
Gerrit-HasComments: No
FC28 Blacklisted in oVirt STDCI V2
by Barak Korren
Hi all,
At Sandro's request, as of a few minutes ago FC28 has been blacklisted in
oVirt's STDCI V2 system.
This means that if a project has FC28 threads configured in its STDCI V2
YAML configuration file, those threads will be ignored by the STDCI system
and not invoked.
Threads for building and testing on other distributions will keep working
as before.
It is still recommended that projects with FC28 configuration patch their
configuration files to remove it, both to make it easier to understand what
the CI does and to avoid relying on an implicit blacklist.
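For projects doing that cleanup, the change is typically just dropping the distro entry from the thread matrix. A hypothetical sketch follows; the file name and keys here are assumptions based on common STDCI V2 layouts, not copied from any real project:

```yaml
# stdci.yaml -- hypothetical example
stages:
  - check-patch
distros:
  - el7
  - fc29   # an fc28 entry was removed here; its threads were only
           # silently ignored before, now the intent is explicit
archs:
  - x86_64
```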
Please note that this only concerns projects that have made the switch to
STDCI V2. For projects that still use V1 and have FC28 jobs defined, those
jobs will keep working as before. The following patch by Sandro, however,
includes code to remove all of those jobs:
https://gerrit.ovirt.org/c/100556/
Thanks,
Barak.
--
Barak Korren
RHV DevOps team, RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
Build failed in Jenkins: system-sync_mirrors-glusterfs-5-el7-x86_64 #418
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-glusterfs-5-el7-x86_64/4...>
------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-glusterfs-5-el7-x86_64/ws/>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c41bdc92144d7ef87391521fca327d6e790624c1 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f c41bdc92144d7ef87391521fca327d6e790624c1
Commit message: "standard-stage: Remove pusher.py support in V1"
> git rev-list --no-walk c41bdc92144d7ef87391521fca327d6e790624c1 # timeout=10
[system-sync_mirrors-glusterfs-5-el7-x86_64] $ /bin/bash -xe /tmp/jenkins7239729937118097042.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror glusterfs-5-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror glusterfs-5-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror glusterfs-5-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local repo_name=glusterfs-5-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs glusterfs-5-el7 yum
+ local repo_name=glusterfs-5-el7
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum /var/www/html/repos/yum/glusterfs-5-el7 /var/www/html/repos/yum/glusterfs-5-el7/base
+ check_yum_sync_needed glusterfs-5-el7 x86_64 jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=glusterfs-5-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/glusterfs-5-el7
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync glusterfs-5-el7 x86_64 jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=glusterfs-5-el7
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf --repoid=glusterfs-5-el7 --arch=x86_64 --cachedir=/home/jenkins/mirrors_cache --download_path=/var/www/html/repos/yum/glusterfs-5-el7/base --norepopath --newest-only --urls --quiet
Traceback (most recent call last):
  File "/usr/bin/reposync", line 373, in <module>
    main()
  File "/usr/bin/reposync", line 185, in main
    my.doRepoSetup()
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
    return self._getRepos(thisrepo, True)
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
    self._repos.doSetup(thisrepo)
  File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
    self.retrieveAllMD()
  File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
    dl = repo._async and repo._commonLoadRepoXML(repo)
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1468, in _commonLoadRepoXML
    local = self.cachedir + '/repomd.xml'
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 777, in <lambda>
    cachedir = property(lambda self: self._dirGetAttr('cachedir'))
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 760, in _dirGetAttr
    self.dirSetup()
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 738, in dirSetup
    self._dirSetupMkdir_p(dir)
  File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 715, in _dirSetupMkdir_p
    raise Errors.RepoError, msg
yum.Errors.RepoError: Error making cache directory: /home/jenkins/mirrors_cache/glusterfs-5-el7/packages error was: [Errno 17] File exists: '/home/jenkins/mirrors_cache/glusterfs-5-el7/packages'
+ reposync_out=
Build step 'Execute shell' marked build as failure
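The RepoError in the log above wraps an EEXIST (Errno 17) from a mkdir call, which usually indicates a check-then-create race: two processes both see the cache directory as missing and both try to create it. A minimal sketch of the failure mode and the idempotent alternative, in modern Python (the paths here are illustrative, not the job's real ones):

```python
import errno
import os
import tempfile

def make_cache_dir_naive(path):
    """Check-then-create: racy, and also fails if the directory already exists."""
    if not os.path.isdir(path):
        os.makedirs(path)  # raises OSError with errno 17 if someone created it first

def make_cache_dir_safe(path):
    """Idempotent creation: an already-existing directory is not an error."""
    os.makedirs(path, exist_ok=True)

base = tempfile.mkdtemp()
cache = os.path.join(base, "glusterfs-5-el7", "packages")

make_cache_dir_safe(cache)   # creates the whole tree
make_cache_dir_safe(cache)   # second call is a no-op, no Errno 17
try:
    os.makedirs(cache)       # what reposync effectively attempted
except OSError as e:
    print("reproduced Errno", e.errno)  # prints "reproduced Errno 17"
```

The yum code in the traceback runs under Python 2 and has no `exist_ok`; there the equivalent fix is catching the EEXIST error and treating it as success.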