Build failed in Jenkins: system-sync_mirrors-mock-copr-fc29-x86_64 #416
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-mock-copr-fc29-x86_64/41...>
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-mock-copr-fc29-x86_64/ws/>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 73e6c6324b26eacacae20b320580c739a65166bf (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 73e6c6324b26eacacae20b320580c739a65166bf
Commit message: "appliance: require more space for building"
> git rev-list --no-walk 73e6c6324b26eacacae20b320580c739a65166bf # timeout=10
[system-sync_mirrors-mock-copr-fc29-x86_64] $ /bin/bash -xe /tmp/jenkins8774405251977708786.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror mock-copr-fc29 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror mock-copr-fc29 x86_64 jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror mock-copr-fc29 x86_64 jenkins/data/mirrors-reposync.conf
+ local repo_name=mock-copr-fc29
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs mock-copr-fc29 yum
+ local repo_name=mock-copr-fc29
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum /var/www/html/repos/yum/mock-copr-fc29 /var/www/html/repos/yum/mock-copr-fc29/base
+ check_yum_sync_needed mock-copr-fc29 x86_64 jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=mock-copr-fc29
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/mock-copr-fc29
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync mock-copr-fc29 x86_64 jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=mock-copr-fc29
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf --repoid=mock-copr-fc29 --arch=x86_64 --cachedir=/home/jenkins/mirrors_cache --download_path=/var/www/html/repos/yum/mock-copr-fc29/base --norepopath --newest-only --urls --quiet
Traceback (most recent call last):
File "/usr/bin/reposync", line 373, in <module>
main()
File "/usr/bin/reposync", line 185, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1482, in _commonLoadRepoXML
result = self._getFileRepoXML(local, text)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1259, in _getFileRepoXML
size=102400) # setting max size as 100K
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1042, in _getFile
raise e
yum.Errors.NoMoreMirrorsRepoError: failure: repodata/repomd.xml from centos-qemu-ev-testing-el7: [Errno 256] No more mirrors to try.
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
+ reposync_out=
Build step 'Execute shell' marked build as failure
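Every curl attempt above fails against an IPv6 address ("Network is unreachable"), which suggests the mirrors host currently has no usable IPv6 route to buildlogs.centos.org. A minimal sketch for confirming that from the host, assuming curl is available there; the ip_resolve hint refers to yum's documented config option and is a suggested workaround, not something taken from this job:

    # Sketch: compare IPv6 vs IPv4 reachability of the failing metadata URL.
    url='https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/repomd.xml'
    echo "IPv6 attempt (expected to fail like the job did):"
    curl -6 -sS -o /dev/null -w '%{http_code}\n' "$url" || true
    echo "IPv4 attempt:"
    curl -4 -sS -o /dev/null -w '%{http_code}\n' "$url"
    # If only the IPv4 attempt succeeds, adding "ip_resolve=4" to the affected
    # repo section in jenkins/data/mirrors-reposync.conf should make
    # yum/reposync skip the unreachable IPv6 addresses.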
Build failed in Jenkins: system-sync_mirrors-centos-extras-el7-x86_64 #2970
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-extras-el7-x86_64...>
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-extras-el7-x86_64...>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 73e6c6324b26eacacae20b320580c739a65166bf (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 73e6c6324b26eacacae20b320580c739a65166bf
Commit message: "appliance: require more space for building"
> git rev-list --no-walk 73e6c6324b26eacacae20b320580c739a65166bf # timeout=10
[system-sync_mirrors-centos-extras-el7-x86_64] $ /bin/bash -xe /tmp/jenkins5336629620286959032.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror centos-extras-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror centos-extras-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror centos-extras-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local repo_name=centos-extras-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs centos-extras-el7 yum
+ local repo_name=centos-extras-el7
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum /var/www/html/repos/yum/centos-extras-el7 /var/www/html/repos/yum/centos-extras-el7/base
+ check_yum_sync_needed centos-extras-el7 x86_64 jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=centos-extras-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/centos-extras-el7
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync centos-extras-el7 x86_64 jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=centos-extras-el7
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf --repoid=centos-extras-el7 --arch=x86_64 --cachedir=/home/jenkins/mirrors_cache --download_path=/var/www/html/repos/yum/centos-extras-el7/base --norepopath --newest-only --urls --quiet
Traceback (most recent call last):
File "/usr/bin/reposync", line 373, in <module>
main()
File "/usr/bin/reposync", line 185, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1482, in _commonLoadRepoXML
result = self._getFileRepoXML(local, text)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1259, in _getFileRepoXML
size=102400) # setting max size as 100K
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1042, in _getFile
raise e
yum.Errors.NoMoreMirrorsRepoError: failure: repodata/repomd.xml from centos-qemu-ev-testing-el7: [Errno 256] No more mirrors to try.
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
https://buildlogs.centos.org/centos/7/virt/x86_64/kvm-common/repodata/rep...: [Errno 14] curl#7 - "Failed to connect to 2604:4500:0:109::10: Network is unreachable"
+ reposync_out=
Build step 'Execute shell' marked build as failure
Build failed in Jenkins: system-sync_mirrors-mock-copr-fc28-x86_64 #414
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-mock-copr-fc28-x86_64/41...>
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-mock-copr-fc28-x86_64/ws/>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 73e6c6324b26eacacae20b320580c739a65166bf (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 73e6c6324b26eacacae20b320580c739a65166bf
Commit message: "appliance: require more space for building"
> git rev-list --no-walk 73e6c6324b26eacacae20b320580c739a65166bf # timeout=10
[system-sync_mirrors-mock-copr-fc28-x86_64] $ /bin/bash -xe /tmp/jenkins2854932318694301322.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror mock-copr-fc28 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror mock-copr-fc28 x86_64 jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror mock-copr-fc28 x86_64 jenkins/data/mirrors-reposync.conf
+ local repo_name=mock-copr-fc28
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs mock-copr-fc28 yum
+ local repo_name=mock-copr-fc28
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum /var/www/html/repos/yum/mock-copr-fc28 /var/www/html/repos/yum/mock-copr-fc28/base
+ check_yum_sync_needed mock-copr-fc28 x86_64 jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=mock-copr-fc28
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/mock-copr-fc28
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync mock-copr-fc28 x86_64 jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=mock-copr-fc28
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf --repoid=mock-copr-fc28 --arch=x86_64 --cachedir=/home/jenkins/mirrors_cache --download_path=/var/www/html/repos/yum/mock-copr-fc28/base --norepopath --newest-only --urls --quiet
Traceback (most recent call last):
File "/usr/bin/reposync", line 373, in <module>
main()
File "/usr/bin/reposync", line 185, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1482, in _commonLoadRepoXML
result = self._getFileRepoXML(local, text)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1259, in _getFileRepoXML
size=102400) # setting max size as 100K
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1042, in _getFile
raise e
yum.Errors.NoMoreMirrorsRepoError: failure: repodata/repomd.xml from centos-opstools-testing-el7: [Errno 256] No more mirrors to try.
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] HTTPS Error 302 - Found
+ reposync_out=
Build step 'Execute shell' marked build as failure
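Unlike the IPv6 failures above, this run dies on a 302 redirect: yum refuses to follow the redirect when fetching repomd.xml from the plain-http baseurl. A hedged sketch of one possible fix, switching that repo's baseurl to https; the sed pattern assumes the baseurl in jenkins/data/mirrors-reposync.conf matches the URL shown in the error, which is not visible in this log:

    # Sketch: point centos-opstools-testing-el7 at https instead of http,
    # then re-run the same metadata check the job performs.
    conf=jenkins/data/mirrors-reposync.conf
    sed -i 's|baseurl=http://buildlogs.centos.org/centos/7/opstools/|baseurl=https://buildlogs.centos.org/centos/7/opstools/|' "$conf"
    reposync --config="$conf" --repoid=centos-opstools-testing-el7 --arch=x86_64 \
        --cachedir=/home/jenkins/mirrors_cache --urls --quiet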
Build failed in Jenkins: system-sync_mirrors-centos-ovirt-4.2-el7-x86_64 #2481
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-ovirt-4.2-el7-x86...>
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace <http://jenkins.ovirt.org/job/system-sync_mirrors-centos-ovirt-4.2-el7-x86...>
No credentials specified
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
> git rev-parse --verify HEAD # timeout=10
Resetting working tree
> git reset --hard # timeout=10
> git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
> git --version # timeout=10
> git fetch --tags --progress http://gerrit.ovirt.org/jenkins.git +refs/heads/*:refs/remotes/origin/* --prune
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 73e6c6324b26eacacae20b320580c739a65166bf (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 73e6c6324b26eacacae20b320580c739a65166bf
Commit message: "appliance: require more space for building"
> git rev-list --no-walk 73e6c6324b26eacacae20b320580c739a65166bf # timeout=10
[system-sync_mirrors-centos-ovirt-4.2-el7-x86_64] $ /bin/bash -xe /tmp/jenkins3634427512316572798.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror centos-ovirt-4.2-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror centos-ovirt-4.2-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror centos-ovirt-4.2-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ local repo_name=centos-ovirt-4.2-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs centos-ovirt-4.2-el7 yum
+ local repo_name=centos-ovirt-4.2-el7
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum /var/www/html/repos/yum/centos-ovirt-4.2-el7 /var/www/html/repos/yum/centos-ovirt-4.2-el7/base
+ check_yum_sync_needed centos-ovirt-4.2-el7 x86_64 jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=centos-ovirt-4.2-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/centos-ovirt-4.2-el7
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync centos-ovirt-4.2-el7 x86_64 jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=centos-ovirt-4.2-el7
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf --repoid=centos-ovirt-4.2-el7 --arch=x86_64 --cachedir=/home/jenkins/mirrors_cache --download_path=/var/www/html/repos/yum/centos-ovirt-4.2-el7/base --norepopath --newest-only --urls --quiet
Traceback (most recent call last):
File "/usr/bin/reposync", line 373, in <module>
main()
File "/usr/bin/reposync", line 185, in main
my.doRepoSetup()
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 681, in doRepoSetup
return self._getRepos(thisrepo, True)
File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 721, in _getRepos
self._repos.doSetup(thisrepo)
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 157, in doSetup
self.retrieveAllMD()
File "/usr/lib/python2.7/site-packages/yum/repos.py", line 88, in retrieveAllMD
dl = repo._async and repo._commonLoadRepoXML(repo)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1482, in _commonLoadRepoXML
result = self._getFileRepoXML(local, text)
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1259, in _getFileRepoXML
size=102400) # setting max size as 100K
File "/usr/lib/python2.7/site-packages/yum/yumRepo.py", line 1042, in _getFile
raise e
yum.Errors.NoMoreMirrorsRepoError: failure: repodata/repomd.xml from centos-opstools-testing-el7: [Errno 256] No more mirrors to try.
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
http://buildlogs.centos.org/centos/7/opstools/x86_64/repodata/repomd.xml: [Errno 14] curl#7 - "Failed to connect to 2607:1680:0:1::2: Network is unreachable"
+ reposync_out=
Build step 'Execute shell' marked build as failure
[JIRA] (OVIRT-2805) build-artifacts job stuck on s390x - cannot run OST
by Nir Soffer (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-2805?page=com.atlassian.jir... ]
Nir Soffer commented on OVIRT-2805:
-----------------------------------
On Sun, Sep 29, 2019 at 2:35 AM Nir Soffer <nsoffer(a)redhat.com> wrote:
> On Sun, Sep 29, 2019 at 1:13 AM Nir Soffer <nsoffer(a)redhat.com> wrote:
>
>> Since Friday build-artifacts on s390x get stuck again, so we cannot run
>> OST.
>> This is not a new issue, we have issues on s390x every few weeks.
>>
>> I posted this patch to disable this job:
>> https://gerrit.ovirt.org/c/97851
>>
>
> The patch does not work, it still runs the s390x job, and it runs it
> incorrectly.
> Maybe the issue is not in the slave, but in the vdsm automation scripts?
>
> We have this yaml:
>
> 19 stages:
> 20 - build-artifacts:
> 21 substages:
> 22 - build-py27:
> 23 archs:
> 24 - ppc64le
> 25 - x86_64
> 26 - build-py37:
> 27 distributions:
> 28 - fc30
>
> And we get these jobs:
> - build-artifacts.build-py27.el7.ppc64le
> - build-artifacts.build-py27.el7.x86_64
> - build-artifacts.build-py27.fc29.x86_64
> - build-artifacts.build-py37.fc30.x86_64
> - build-artifacts.fc29.s390x
>
> The last job - s390x looks wrong - we should have only
> build-py27 and build-py37 jobs, using:
>
> - automation/build-artifacts.build-py27.sh
> - automation/build-artifacts.build-py37.sh
>
> But both scripts are symlinks:
> lrwxrwxrwx. 1 nsoffer nsoffer 18 Sep 29 00:55 automation/build-artifacts.build-py27.sh -> build-artifacts.sh
> lrwxrwxrwx. 1 nsoffer nsoffer 18 Sep 29 00:55 automation/build-artifacts.build-py37.sh -> build-artifacts.sh
> -rwxrwxr-x. 1 nsoffer nsoffer 346 Sep 17 02:54 automation/build-artifacts.sh
>
> Is it possible that the CI finds build-artifacts.sh and runs it even when no
> sub-stage is specified?
>
> I'll try to rename this script to avoid this.
>
Hopefully fixed by:
https://gerrit.ovirt.org/c/103655/
>
> We can enable the job when we have a reliable build slave again.
>>
>> Here are some failed jobs:
>> - http://jenkins.ovirt.org/job/standard-manual-runner/757/
>> - http://jenkins.ovirt.org/job/standard-manual-runner/758/
>> - http://jenkins.ovirt.org/job/standard-manual-runner/759/
>> - http://jenkins.ovirt.org/job/standard-manual-runner/762/
>> - http://jenkins.ovirt.org/job/standard-manual-runner/763/
>> - http://jenkins.ovirt.org/job/standard-manual-runner/764/
>>
>> Nir
>>
>
> build-artifacts job stuck on s390x - cannot run OST
> ---------------------------------------------------
>
> Key: OVIRT-2805
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2805
> Project: oVirt - virtualization made easy
> Issue Type: By-EMAIL
> Reporter: Nir Soffer
> Assignee: infra
>
> Since Friday build-artifacts on s390x get stuck again, so we cannot run OST.
> This is not a new issue, we have issues on s390x every few weeks.
> I posted this patch to disable this job:
> https://gerrit.ovirt.org/c/97851
> We can enable the job when we have a reliable build slave again.
> Here are some failed jobs:
> - http://jenkins.ovirt.org/job/standard-manual-runner/757/
> - http://jenkins.ovirt.org/job/standard-manual-runner/758/
> - http://jenkins.ovirt.org/job/standard-manual-runner/759/
> - http://jenkins.ovirt.org/job/standard-manual-runner/762/
> - http://jenkins.ovirt.org/job/standard-manual-runner/763/
> - http://jenkins.ovirt.org/job/standard-manual-runner/764/
> Nir
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100111)
[JIRA] (OVIRT-2805) build-artifacts job stuck on s390x - cannot run OST
by Nir Soffer (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-2805?page=com.atlassian.jir... ]
Nir Soffer commented on OVIRT-2805:
-----------------------------------
On Sun, Sep 29, 2019 at 1:13 AM Nir Soffer <nsoffer(a)redhat.com> wrote:
> Since Friday build-artifacts on s390x get stuck again, so we cannot run
> OST.
> This is not a new issue, we have issues on s390x every few weeks.
>
> I posted this patch to disable this job:
> https://gerrit.ovirt.org/c/97851
>
The patch does not work, it still runs the s390x job, and it runs it
incorrectly.
Maybe the issue is not in the slave, but in the vdsm automation scripts?
We have this yaml:
19 stages:
20 - build-artifacts:
21 substages:
22 - build-py27:
23 archs:
24 - ppc64le
25 - x86_64
26 - build-py37:
27 distributions:
28 - fc30
And we get these jobs:
- build-artifacts.build-py27.el7.ppc64le
- build-artifacts.build-py27.el7.x86_64
- build-artifacts.build-py27.fc29.x86_64
- build-artifacts.build-py37.fc30.x86_64
- build-artifacts.fc29.s390x
The last job - s390x looks wrong - we should have only
build-py27 and build-py37 jobs, using:
- automation/build-artifacts.build-py27.sh
- automation/build-artifacts.build-py37.sh
But both scripts are symlinks:
lrwxrwxrwx. 1 nsoffer nsoffer 18 Sep 29 00:55 automation/build-artifacts.build-py27.sh -> build-artifacts.sh
lrwxrwxrwx. 1 nsoffer nsoffer 18 Sep 29 00:55 automation/build-artifacts.build-py37.sh -> build-artifacts.sh
-rwxrwxr-x. 1 nsoffer nsoffer 346 Sep 17 02:54 automation/build-artifacts.sh
Is it possible that the CI finds build-artifacts.sh and runs it even when no
sub-stage is specified?
I'll try to rename this script to avoid this.
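A minimal sketch of the rename described above, assuming the shared logic stays in one file and only the generic entry point disappears; the new file name is illustrative, not the one used in the actual patch:

    # Sketch: remove the generic build-artifacts.sh entry point so the CI
    # cannot pick it up for archs that are not listed under any substage
    # (e.g. s390x).
    cd automation
    git mv build-artifacts.sh build-artifacts.common.sh   # hypothetical new name
    ln -sfn build-artifacts.common.sh build-artifacts.build-py27.sh
    ln -sfn build-artifacts.common.sh build-artifacts.build-py37.sh
    git add build-artifacts.build-py27.sh build-artifacts.build-py37.sh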
We can enable the job when we have a reliable build slave again.
>
> Here are some failed jobs:
> - http://jenkins.ovirt.org/job/standard-manual-runner/757/
> - http://jenkins.ovirt.org/job/standard-manual-runner/758/
> - http://jenkins.ovirt.org/job/standard-manual-runner/759/
> - http://jenkins.ovirt.org/job/standard-manual-runner/762/
> - http://jenkins.ovirt.org/job/standard-manual-runner/763/
> - http://jenkins.ovirt.org/job/standard-manual-runner/764/
>
> Nir
>
> build-artifacts job stuck on s390x - cannot run OST
> ---------------------------------------------------
>
> Key: OVIRT-2805
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2805
> Project: oVirt - virtualization made easy
> Issue Type: By-EMAIL
> Reporter: Nir Soffer
> Assignee: infra
>
> Since Friday build-artifacts on s390x get stuck again, so we cannot run OST.
> This is not a new issue, we have issues on s390x every few weeks.
> I posted this patch to disable this job:
> https://gerrit.ovirt.org/c/97851
> We can enable the job when we have a reliable build slave again.
> Here are some failed jobs:
> - http://jenkins.ovirt.org/job/standard-manual-runner/757/
> - http://jenkins.ovirt.org/job/standard-manual-runner/758/
> - http://jenkins.ovirt.org/job/standard-manual-runner/759/
> - http://jenkins.ovirt.org/job/standard-manual-runner/762/
> - http://jenkins.ovirt.org/job/standard-manual-runner/763/
> - http://jenkins.ovirt.org/job/standard-manual-runner/764/
> Nir
--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100111)
[oVirt] [CQ weekly status] [27-09-2019]
by Dusan Fodor
Hi,
This mail is to provide the current status of CQ and allow people to review
the status before and after the weekend.
Please refer to the colour map below for further information on the meaning of
the colours.
*CQ-4.3*: GREEN (#3)
Last failure was on 27-09 for v2v-conversion-host. The job was marked as failed
because its on-merge job (ghpush) tried to queue a change in the now non-existent
4.2 CQ.
*CQ-Master:* RED (#1)
Last failure was on 27-09 for ovirt-engine on adding a storage domain.
This is a random failure described in the thread "random failures for
add_master_storage_domain" and is awaiting a fix. Another issue was detected in
the initialize_engine test; a mail was sent to investigate it.
Current running jobs for 4.3 [1] and master [2] can be found here:
[1] https://jenkins.ovirt.org/view/Change%20queue%20jobs/job/ovirt-4.3_change-queue-tester/
[2] http://jenkins.ovirt.org/view/Change%20queue%20jobs/job/ovirt-master_change-queue-tester/
Have a nice week!
Dusan
-------------------------------------------------------------------------------------------------------------------
COLOUR MAP
Green = job has been passing successfully
** green for more than 3 days may suggest we need a review of our test
coverage
1. 1-3 days GREEN (#1)
2. 4-7 days GREEN (#2)
3. Over 7 days GREEN (#3)
Yellow = intermittent failures for different projects but no lasting or
current regressions
** intermittent failures indicate a healthy project, as we expect a number of
failures during the week
** I will not report any of the solved failures or regressions.
1. Solved job failures YELLOW (#1)
2. Solved regressions YELLOW (#2)
Red = job has been failing
** Active failures. The colour will change based on the amount of time the
project(s) has been broken. Only active regressions will be reported.
1. 1-3 days RED (#1)
2. 4-7 days RED (#2)
3. Over 7 days RED (#3)
_______________________________________________
Devel mailing list -- devel(a)ovirt.org
To unsubscribe send an email to devel-leave(a)ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: https://www.ovirt.org/community/about/community-guidelines/
List Archives: https://lists.ovirt.org/archives/list/devel@ovirt.org/message/YCNCKRK3G4EJXA3OCYAUS4VMKRDA67F4/