update: replace epel repos in engine setup-upgrade jobs with ovirt proxy
by Eyal Edri
Opening a ticket with infra to replace the repos in the job with the proxy ones;
hopefully this will reduce the chance of broken EPEL repos.
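The intended fix, roughly, is to point the job's EPEL repo at the proxy mirror instead of upstream. A minimal sketch of what such a repo file could look like (the hostname "proxy.phx.ovirt.org" and all paths below are hypothetical placeholders, not the actual values from infra-puppet's repos.yaml):

```shell
#!/bin/bash
# Sketch only: "proxy.phx.ovirt.org" and the paths below are hypothetical
# placeholders, not the real values from infra-puppet's repos.yaml.
REPO_FILE="${REPO_FILE:-/tmp/epel-proxy.repo}"

cat > "$REPO_FILE" <<'EOF'
[epel]
name=Extra Packages for Enterprise Linux 7 (via oVirt proxy)
# Hypothetical proxy baseurl; the real mirrorlists live on the proxy host
baseurl=http://proxy.phx.ovirt.org/repos/epel/7/x86_64/
enabled=1
gpgcheck=1
EOF

echo "wrote $REPO_FILE"
```

Dropping a file like this into /etc/yum.repos.d/ on the slaves (or templating it via puppet) would make yum resolve EPEL through the proxy instead of dl.fedoraproject.org.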
On Mon, Aug 15, 2016 at 4:07 PM, Evgheni Dereveanchin <ederevea(a)redhat.com>
wrote:
> Yes, centos mirrorlists exist on the proxy:
>
> https://gerrit.ovirt.org/gitweb?p=infra-puppet.git;a=
> blob;f=site/ovirt_proxy/files/repos.yaml;h=8ece5fd28864ccea1122343d22b95f
> 151525b952;hb=refs/heads/production
>
> Regards,
> Evgheni Dereveanchin
>
> ----- Original Message -----
> From: "Eyal Edri" <eedri(a)redhat.com>
> To: "Evgheni Dereveanchin" <ederevea(a)redhat.com>
> Cc: "Martin Perina" <mperina(a)redhat.com>, "Gil Shinar" <gshinar(a)redhat.com>,
> "infra" <infra(a)ovirt.org>, "Martin Betak" <mbetak(a)redhat.com>
> Sent: Monday, 15 August, 2016 3:01:07 PM
> Subject: Re: Change in ovirt-engine[master]: backend: Make
> VdsArchitectureHelper a CDI singleton
>
> On Mon, Aug 15, 2016 at 3:59 PM, Evgheni Dereveanchin <ederevea(a)redhat.com
> >
> wrote:
>
> > Hi Eyal,
> >
> > I checked ovirt-engine_master_upgrade-from-4.0_el7_created/914 and it
> > does not use repoproxy repos:
> >
> > Loaded plugins: fastestmirror, versionlock
> > Loading mirror speeds from cached hostfile
> > * base: mirrors.usc.edu
> > * epel: dl.fedoraproject.org
> > * extras: centos.mirror.lstn.net
> > * updates: mirror.supremebytes.com
> > Resolving Dependencies
> > --> Running transaction check
> >
>
> Do we have those in our proxy?
>
>
> > ...
> >
> > I also logged in to el7-vm27 and there's no proxy defined in yum configs.
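A generic way to check that on any slave (a sketch of the check, not a transcript from el7-vm27):

```shell
#!/bin/bash
# Generic check (not output from el7-vm27): list any proxy settings yum would
# honor. No matches in yum.conf or the repo files means yum talks to the
# mirrors directly.
grep -Hn '^proxy' /etc/yum.conf /etc/yum.repos.d/*.repo 2>/dev/null \
    || echo "no yum proxy configured"
```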
> >
> > Regards,
> > Evgheni Dereveanchin
> >
> > ----- Original Message -----
> > From: "Eyal Edri" <eedri(a)redhat.com>
> > To: "Martin Perina" <mperina(a)redhat.com>
> > Cc: "Gil Shinar" <gshinar(a)redhat.com>, "Evgheni Dereveanchin" <
> > ederevea(a)redhat.com>, "infra" <infra(a)ovirt.org>, "Martin Betak" <
> > mbetak(a)redhat.com>
> > Sent: Monday, 15 August, 2016 2:47:06 PM
> > Subject: Re: Change in ovirt-engine[master]: backend: Make
> > VdsArchitectureHelper a CDI singleton
> >
> > Evgheni,
> > Can you check if the proxy is used there?
> >
> > On Mon, Aug 15, 2016 at 3:27 PM, Martin Perina <mperina(a)redhat.com>
> wrote:
> >
> > >
> > >
> > > On Mon, Aug 15, 2016 at 1:29 PM, Eyal Edri <eedri(a)redhat.com> wrote:
> > >
> > >> Looks like epel is down? Did you try running the job again?
> > >>
> > >
> > > Yes, the error has been happening since Friday, I think, but only on some
> > > slaves (my theory), because sometimes you get lucky and the upgrade job
> > > finishes fine.
> > >
> > >
> > >
> > >> We have a ticket to mirror it to resources.ovirt.org, but we need to
> > >> extend disk space there before we do it.
> > >>
> > >> On Mon, Aug 15, 2016 at 1:37 PM, Martin Perina <mperina(a)redhat.com>
> > >> wrote:
> > >>
> > >>> Hi,
> > >>>
> > >>> could you please take a look at the upgrade jobs?
> > >>>
> > >>> [ ERROR ] Yum Cannot queue package iproute: Cannot find a valid
> > baseurl for repo: epel/x86_64
> > >>> [ ERROR ] Failed to execute stage 'Environment packages setup':
> Cannot
> > find a valid baseurl for repo: epel/x86_64
> > >>>
> > >>> I saw this error on several of today's builds ...
> > >>>
> > >>> Thanks
> > >>>
> > >>> Martin
> > >>>
> > >>>
> > >>> On Mon, Aug 15, 2016 at 12:34 PM, Jenkins CI <
> gerrit2(a)gerrit.ovirt.org
> > >
> > >>> wrote:
> > >>>
> > >>>> Jenkins CI has posted comments on this change.
> > >>>>
> > >>>> Change subject: backend: Make VdsArchitectureHelper a CDI singleton
> > >>>> ............................................................
> ........
> > >>>>
> > >>>>
> > >>>> Patch Set 5:
> > >>>>
> > >>>> Build Failed
> > >>>>
> > >>>> http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-fro
> > >>>> m-4.0_el7_created/914/ : FAILURE
> > >>>>
> > >>>> http://jenkins.ovirt.org/job/ovirt-engine_master_check-patch
> > >>>> -el7-x86_64/5255/ : SUCCESS
> > >>>>
> > >>>> http://jenkins.ovirt.org/job/ovirt-engine_master_find-bugs_c
> > >>>> reated/1917/ : SUCCESS
> > >>>>
> > >>>> http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-fro
> > >>>> m-master_el7_created/915/ : SUCCESS
> > >>>>
> > >>>> --
> > >>>> To view, visit https://gerrit.ovirt.org/62261
> > >>>> To unsubscribe, visit https://gerrit.ovirt.org/settings
> > >>>>
> > >>>> Gerrit-MessageType: comment
> > >>>> Gerrit-Change-Id: I4cdca46c04943204d7ebda1069f6a5f7ea095d10
> > >>>> Gerrit-PatchSet: 5
> > >>>> Gerrit-Project: ovirt-engine
> > >>>> Gerrit-Branch: master
> > >>>> Gerrit-Owner: Martin Betak <mbetak(a)redhat.com>
> > >>>> Gerrit-Reviewer: Jenkins CI
> > >>>> Gerrit-Reviewer: Martin Betak <mbetak(a)redhat.com>
> > >>>> Gerrit-Reviewer: Martin Peřina <mperina(a)redhat.com>
> > >>>> Gerrit-Reviewer: gerrit-hooks <automation(a)ovirt.org>
> > >>>> Gerrit-HasComments: No
> > >>>>
> > >>>
> > >>>
> > >>> _______________________________________________
> > >>> Infra mailing list
> > >>> Infra(a)ovirt.org
> > >>> http://lists.ovirt.org/mailman/listinfo/infra
> > >>>
> > >>>
> > >>
> > >>
> > >> --
> > >> Eyal Edri
> > >> Associate Manager
> > >> RHV DevOps
> > >> EMEA ENG Virtualization R&D
> > >> Red Hat Israel
> > >>
> > >> phone: +972-9-7692018
> > >> irc: eedri (on #tlv #rhev-dev #rhev-integ)
> > >>
> > >
> > >
> >
> >
> >
>
>
>
>
[JIRA] (OVIRT-686) Find a better way to show unstable status in CI
by eyal edri [Administrator] (oVirt JIRA)
eyal edri [Administrator] created OVIRT-686:
-----------------------------------------------
Summary: Find a better way to show unstable status in CI
Key: OVIRT-686
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-686
Project: oVirt - virtualization made easy
Issue Type: Improvement
Reporter: eyal edri [Administrator]
Assignee: infra
Today we have the main view on jenkins.ovirt.org showing the list of unstable jobs.
While it's effective at showing which jobs are currently stable, it's not optimal for showing trends, history, or a dashboard view of Jenkins itself.
We need to find a Jenkins plugin or a dashboard that shows the current status of Jenkins in a more efficient and informative way.
There are numerous dashboard and view plugins; we just need to find the right one for oVirt.
--
This message was sent by Atlassian JIRA
(v1000.245.0#100009)
[JIRA] (OVIRT-686) Find a better way to show unstable status in CI
by eyal edri [Administrator] (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-686?page=com.atlassian.jira... ]
eyal edri [Administrator] updated OVIRT-686:
--------------------------------------------
Epic Link: OVIRT-400
> Find a better way to show unstable status in CI
> -----------------------------------------------
>
> Key: OVIRT-686
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-686
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Reporter: eyal edri [Administrator]
> Assignee: infra
>
> Today we have the main view on jenkins.ovirt.org showing the list of unstable jobs.
> While it's effective at showing which jobs are currently stable, it's not optimal for showing trends, history, or a dashboard view of Jenkins itself.
> We need to find a Jenkins plugin or a dashboard that shows the current status of Jenkins in a more efficient and informative way.
> There are numerous dashboard and view plugins; we just need to find the right one for oVirt.
--
This message was sent by Atlassian JIRA
(v1000.245.0#100009)
Build failed in Jenkins: ovirt_4.0_system-tests #287
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/287/changes>
Changes:
[Gal Ben Haim] Removed the hard coded path of the template repo. The template repo file
------------------------------------------
[...truncated 659 lines...]
## took 814 seconds
## rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log entries: logs/mocker-fedora-23-x86_64.fc23.basic_suite_4.0.sh/basic_suite_4.0.sh.log
##!
+ true
+ env_cleanup
+ echo '#########################'
#########################
+ local res=0
+ local uuid
+ echo '======== Cleaning up'
======== Cleaning up
+ [[ -e <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> ]]
+ echo '----------- Cleaning with lago'
----------- Cleaning with lago
+ lago --workdir <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> destroy --yes --all-prefixes
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
Took 631 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=4.0
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_4.0_system-tests] $ /bin/bash -xe /tmp/hudson3554245375878471530.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/>
+ OVIRT_SUITE=4.0
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/287/artifact/exported...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/287/artifact/exported...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/287/artifact/exported...>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
# Archive the logs, we want them anyway
logs=(
./*log
./*/logs
)
if [[ "$logs" ]]; then
tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
rm -rf "${logs[@]}"
fi
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
[[ "$mock_conf_file" ]] || continue
echo "Cleaning up mock $mock_conf"
mock_root="${mock_conf_file##*/}"
mock_root="${mock_root%.*}"
my_mock="/usr/bin/mock"
my_mock+=" --configdir=${mock_conf_file%/*}"
my_mock+=" --root=${mock_root}"
my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs sometimes,
#so we can use it instead of manually doing all this.
echo "Killing all mock orphan processes, if any."
$my_mock \
--orphanskill \
|| {
echo "ERROR: Failed to kill orphans on $chroot."
failed=true
}
mock_root="$(\
grep \
-Po "(?<=config_opts\['root'\] = ')[^']*" \
"$mock_conf_file" \
)" || :
[[ "$mock_root" ]] || continue
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
}
done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
this_chroot_failed=false
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $mock_root." \
"Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
this_chroot_failed=true
}
done
if ! $this_chroot_failed; then
sudo rm -rf "$mock_root"
fi
done
if $failed; then
echo "Aborting."
exit 1
fi
# remove mock system cache, we will setup proxies to do the caching and this
# takes lots of space between runs
shopt -u nullglob
sudo rm -Rf /var/cache/mock/*
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
[ovirt_4.0_system-tests] $ /bin/bash -xe /tmp/hudson7620342481452401506.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n ./ovirt-system-tests/logs ]]
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
./ovirt-system-tests/logs/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_4.0.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_4.0.sh/basic_suite_4.0.sh.log
+ rm -rf ./ovirt-system-tests/logs
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg
+ mock_root=mocker-fedora-23-x86_64.fc23
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests....> Using default...
INFO: mock.py version 1.2.18 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/ovirt-system-tests...>
+ mock_root=fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad
+ [[ -n fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad
+ :
+ [[ -n '' ]]
+ false
+ shopt -u nullglob
+ sudo rm -Rf /var/cache/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_4.0_system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
Archiving artifacts