[JIRA] (OVIRT-824) Jenkins CI not triggering for ovirt-engine-dashboard gerrit patches
by eyal edri [Administrator] (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-824?page=com.atlassian.jira... ]
eyal edri [Administrator] commented on OVIRT-824:
-------------------------------------------------
Hi Scott,
The hooks you see in Gerrit have nothing to do with the Jenkins jobs; they verify the correctness of patches and update bugs.
To run Jenkins jobs, you need to make sure the project has an 'automation' dir containing 'check-patch.sh' and 'check-merged.sh' scripts.
You can look at other projects, like ovirt-engine or vdsm, for examples.
More info can be found here: http://infra-docs.readthedocs.io/en/latest/CI/Build_and_test_standards.html
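For reference, a minimal skeleton (file names per the standards doc above; the script body is only illustrative, and the build command is an assumption, not the actual ovirt-engine-dashboard setup):

automation/
    check-patch.sh     - run by CI on every new patch set
    check-merged.sh    - run by CI after a change is merged

#!/bin/bash -xe
# automation/check-patch.sh (illustrative sketch)
# Run whatever validates a patch for this project; a non-zero exit
# code marks the Jenkins run as failed. 'make check' is only a
# placeholder for the project's real build/test entry point.
make check

The standards doc also describes optional files such as 'check-patch.packages' for listing packages to install in the mock environment before the script runs.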
--
This message was sent by Atlassian JIRA
(v1000.526.2#100018)
[JIRA] (OVIRT-824) Jenkins CI not triggering for ovirt-engine-dashboard gerrit patches
by Scott Dickerson (oVirt JIRA)
Scott Dickerson created OVIRT-824:
-------------------------------------
Summary: Jenkins CI not triggering for ovirt-engine-dashboard gerrit patches
Key: OVIRT-824
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-824
Project: oVirt - virtualization made easy
Issue Type: By-EMAIL
Reporter: Scott Dickerson
Assignee: infra
Hi,
I submitted an ovirt-engine-dashboard patch [1] last night. The
gerrit-hooks ran just fine to attach the patch to its BZ [2]. However, the
normal Jenkins CI hooks/job did not. I tried rerunning the hooks with a
"Rerun-Hooks: all" gerrit comment on the patch, but Jenkins CI did not run
again. How can we tell if the Jenkins CI triggers on the
ovirt-engine-dashboard gerrit are still configured correctly? What can be
done to fix the problem?
To attempt to work around the issue, I tried the gerrit manual trigger page
[3]. Unfortunately, my Jenkins account (sdickers) does not have the
appropriate permission to access the page. My teammate was able to manually
trigger the job and my patch cleared CI (thanks Greg). Who do I need to
ping to get that permission added to my account?
Regards,
Scott
[1] - https://gerrit.ovirt.org/66368
[2] - https://bugzilla.redhat.com/1389382
[3] - http://jenkins.ovirt.org/gerrit_manual_trigger/
--
Scott Dickerson
Senior Software Engineer
RHEV-M Engineering - UX Team
Red Hat, Inc
--
This message was sent by Atlassian JIRA
(v1000.526.2#100018)
Build failed in Jenkins: ovirt_3.6_system-tests #726
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/726/changes>
Changes:
[Yaniv Kaul] Fixes for HE suite
[Ondra Machacek] Remove Fedora 23 build from aaa-misc project
[Gil Shinar] Added builds discarder to test experimental
[Yedidyah Bar David] nsis-simple-service-plugin: Remove 3.6
------------------------------------------
[...truncated 756 lines...]
## took 1661 seconds
## rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log entries: logs/mocker-fedora-23-x86_64.fc23.basic_suite_3.6.sh/basic_suite_3.6.sh.log
##!
+ env_cleanup
+ echo '#########################'
#########################
+ local res=0
+ local uuid
+ echo '======== Cleaning up'
======== Cleaning up
+ [[ -e <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> ]]
+ echo '----------- Cleaning with lago'
----------- Cleaning with lago
+ lago --workdir <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> destroy --yes --all-prefixes
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
+ exit
Took 1485 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
    mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_system-tests] $ /bin/bash -xe /tmp/hudson894335147430562572.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/726/artifact/exported...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/726/artifact/exported...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/726/artifact/exported...>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
# Archive the logs, we want them anyway
logs=(
    ./*log
    ./*/logs
)
if [[ "$logs" ]]; then
    tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
    rm -rf "${logs[@]}"
fi
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
    [[ "$mock_conf_file" ]] || continue
    echo "Cleaning up mock $mock_conf"
    mock_root="${mock_conf_file##*/}"
    mock_root="${mock_root%.*}"
    my_mock="/usr/bin/mock"
    my_mock+=" --configdir=${mock_conf_file%/*}"
    my_mock+=" --root=${mock_root}"
    my_mock+=" --resultdir=$WORKSPACE"
    #TODO: investigate why mock --clean fails to umount certain dirs sometimes,
    #so we can use it instead of manually doing all this.
    echo "Killing all mock orphan processes, if any."
    $my_mock \
        --orphanskill \
    || {
        echo "ERROR: Failed to kill orphans on $chroot."
        failed=true
    }
    mock_root="$(\
        grep \
            -Po "(?<=config_opts\['root'\] = ')[^']*" \
            "$mock_conf_file" \
    )" || :
    [[ "$mock_root" ]] || continue
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        sudo umount --lazy "$mount" \
        || {
            echo "ERROR: Failed to umount $mount."
            failed=true
        }
    done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
    this_chroot_failed=false
    mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
    if [[ "$mounts" ]]; then
        echo "Found mounted dirs inside the chroot $mock_root." \
            "Trying to umount."
    fi
    for mount in "${mounts[@]}"; do
        sudo umount --lazy "$mount" \
        || {
            echo "ERROR: Failed to umount $mount."
            failed=true
            this_chroot_failed=true
        }
    done
    if ! $this_chroot_failed; then
        sudo rm -rf "$mock_root"
    fi
done
if $failed; then
    echo "Aborting."
    exit 1
fi
# remove mock system cache, we will setup proxies to do the caching and this
# takes lots of space between runs
shopt -u nullglob
sudo rm -Rf /var/cache/mock/*
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
[ovirt_3.6_system-tests] $ /bin/bash -xe /tmp/hudson8054350441011707830.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n ./ovirt-system-tests/logs ]]
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
./ovirt-system-tests/logs/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_3.6.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.basic_suite_3.6.sh/basic_suite_3.6.sh.log
+ rm -rf ./ovirt-system-tests/logs
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg
+ mock_root=mocker-fedora-23-x86_64.fc23
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests....> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/ovirt-system-tests...>
+ mock_root=fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c
+ [[ -n fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c
+ :
+ [[ -n '' ]]
+ false
+ shopt -u nullglob
+ sudo rm -Rf /var/cache/mock/fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_3.6_system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
Archiving artifacts
[JIRA] (OVIRT-823) fix nagios alerts for Jenkins disk
by Evgheni Dereveanchin (oVirt JIRA)
Evgheni Dereveanchin created OVIRT-823:
------------------------------------------
Summary: fix nagios alerts for Jenkins disk
Key: OVIRT-823
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-823
Project: oVirt - virtualization made easy
Issue Type: Bug
Reporter: Evgheni Dereveanchin
Assignee: infra
Nagios did not alert about disk space running out on the Jenkins server, which eventually caused an outage this morning.
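For reference, this kind of alert is usually implemented with the check_disk plugin, wired up via NRPE. A minimal sketch (the host name, partition path, and thresholds are assumptions for illustration, not the actual oVirt infra config):

# On the Nagios server -- illustrative service definition:
define service {
    use                   generic-service
    host_name             jenkins.phx.ovirt.org
    service_description   Jenkins disk space
    check_command         check_nrpe!check_jenkins_disk
}

# On the Jenkins host, in nrpe.cfg -- warn at 20% free, critical at 10%
# free on the (assumed) Jenkins data partition:
command[check_jenkins_disk]=/usr/lib64/nagios/plugins/check_disk -w 20% -c 10% -p /var/lib/jenkins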
--
This message was sent by Atlassian JIRA
(v1000.526.2#100018)