Yes.
On Thu, Jul 21, 2016 at 9:51 AM, Sandro Bonazzola <sbonazzo(a)redhat.com>
wrote:
On Tue, Jul 19, 2016 at 10:30 AM, Eyal Edri <eedri(a)redhat.com> wrote:
> we have it for all versions:
> http://resources.ovirt.org/repos/ovirt/experimental/
>
So:
http://resources.ovirt.org/repos/ovirt/experimental/3.6/latest.tested
http://resources.ovirt.org/repos/ovirt/experimental/4.0/latest.tested
http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested
should be equivalent to the snapshot + snapshot-static repos, right?
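For anyone who wants to consume one of those latest.tested trees directly, it is just a plain yum repo. A minimal sketch of a .repo file follows; the repo id and file path here are illustrative choices, not an official convention:

```shell
# Hypothetical sketch: point yum at the latest.tested tree for master.
# "ovirt-latest-tested" and the /tmp path are illustrative, not official.
cat > /tmp/ovirt-latest-tested.repo <<'EOF'
[ovirt-latest-tested]
name=oVirt master latest.tested (experimental)
baseurl=http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
enabled=1
gpgcheck=0
EOF
```

On a real host the file would go under /etc/yum.repos.d/ instead of /tmp.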
>
>
> On Tue, Jul 19, 2016 at 11:27 AM, Tolik Litovsky <tlitovsk(a)redhat.com>
> wrote:
>
>> Is it only for the master job?
>> Or do we have such repos for all branches?
>>
>> On Tue, Jul 19, 2016 at 10:13 AM, Eyal Edri <eedri(a)redhat.com> wrote:
>>
>>> Ryan/Tolik,
>>> Can you build the appliance only from the tested engine repo [1]? Let's
>>> see how it affects stability; the next step will be to publish a tested
>>> appliance after it passes Lago verification.
>>>
>>> [1]
>>>
http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
>>> (published only after ovirt-system-tests basic suite finish successfully)
>>>
>>>
>>> On Tue, Jul 19, 2016 at 10:10 AM, Lev Veyde <lveyde(a)redhat.com> wrote:
>>>
>>>> Hi Eyal,
>>>>
>>>> The last failed run failed on:
>>>> *15:50:02* [ INFO ] Extracting disk image from OVF archive (could
>>>> take a few minutes depending on archive size)
>>>> *21:35:04* Build timed out (after 360 minutes). Marking the build as
>>>> failed.
>>>>
>>>> So it basically got stuck while extracting the OVF image.
>>>>
>>>> Some previous runs failed mostly on either:
>>>> a) broken ovirt-engine-appliance build
>>>> b) ovirt-engine-appliance missing from the yum repo
>>>>
>>>> We need to make sure that the process of building and publishing
>>>> ovirt-engine-appliance works flawlessly, i.e. build ovirt-engine,
>>>> publish it into the repo so that the appliance build can consume it,
>>>> then publish the appliance to the repo as well.
>>>> This is extra important because the hosted-engine installation flow will
>>>> probably become the default one, and without a synced oVirt appliance we
>>>> can't really test changes in the engine.
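The ordering Lev describes can be sketched as a strictly gated sequence. The function names below are hypothetical placeholders standing in for the real Jenkins jobs, not actual job names:

```shell
# Hedged sketch of the required gating. build_engine, publish_repo and
# build_appliance are hypothetical placeholders, not real Jenkins jobs.
build_engine()    { echo "engine built"; }
publish_repo()    { echo "published $1"; }
build_appliance() { echo "appliance built against $1"; }

build_engine                 # 1. build ovirt-engine
publish_repo engine          # 2. its RPMs must land in the repo first...
build_appliance engine-repo  # 3. ...so the appliance build can consume them
publish_repo appliance       # 4. only then publish the appliance itself
```

The point of the sketch is that each step is a hard precondition of the next; skipping or reordering any of them reproduces the "appliance missing from the yum repo" failures described above.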
>>>>
>>>> Thanks in advance,
>>>> Lev Veyde.
>>>>
>>>> ------------------------------
>>>> *From: *"Eyal Edri" <eedri(a)redhat.com>
>>>> *To: *jenkins(a)jenkins.phx.ovirt.org
>>>> *Cc: *"infra" <infra(a)ovirt.org>, "Lev Veyde" <lveyde(a)redhat.com>,
>>>> sbonazzo(a)redhat.com
>>>> *Sent: *Tuesday, July 19, 2016 8:26:22 AM
>>>> *Subject: *Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32
>>>>
>>>>
>>>> Lev, this test is a bit flaky, going from stable to failing quite
>>>> often. Can you check what is causing it?
>>>> On Jul 19, 2016 12:35 AM, <jenkins(a)jenkins.phx.ovirt.org> wrote:
>>>>
>>>>> See <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/changes>
>>>>>
>>>>> Changes:
>>>>>
>>>>> [Lev Veyde] ovirt-system-tests: Add automation for
>>>>> he_iscsi_basic_suite_4.0
>>>>>
>>>>> [Sandro Bonazzola] vdsm: avoid fc24 out of master
>>>>>
>>>>> [Sandro Bonazzola] ovirt-engine: add 3.6.8 branch testing
>>>>>
>>>>> ------------------------------------------
>>>>> [...truncated 620 lines...]
>>>>>
>>>>> WORKSPACE="$PWD"
>>>>> OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
>>>>> TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
>>>>>
>>>>> rm -rf "$WORKSPACE/exported-artifacts"
>>>>> mkdir -p "$WORKSPACE/exported-artifacts"
>>>>>
>>>>> if [[ -d "$TESTS_LOGS" ]]; then
>>>>> mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
>>>>> fi
>>>>>
>>>>> [ovirt_4.0_he-system-tests] $ /bin/bash -xe
>>>>> /tmp/hudson1764906258788527221.sh
>>>>> + echo shell_scripts/system_tests.collect_logs.sh
>>>>> shell_scripts/system_tests.collect_logs.sh
>>>>> + VERSION=4.0
>>>>> + SUITE_TYPE=
>>>>> + WORKSPACE=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
>>>>> + OVIRT_SUITE=4.0
>>>>> + TESTS_LOGS=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> >
>>>>> + rm -rf <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/export...
>>>>> >
>>>>> + mkdir -p <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/export...
>>>>> >
>>>>> + [[ -d <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> ]]
>>>>> POST BUILD TASK : SUCCESS
>>>>> END OF POST BUILD TASK : 0
>>>>> Match found for :.* : True
>>>>> Logical operation result is TRUE
>>>>> Running script : #!/bin/bash -xe
>>>>> echo "shell-scripts/mock_cleanup.sh"
>>>>>
>>>>> shopt -s nullglob
>>>>>
>>>>>
>>>>> WORKSPACE="$PWD"
>>>>>
>>>>> # Make clear this is the cleanup, helps reading the jenkins logs
>>>>> cat <<EOC
>>>>>
>>>>>
_______________________________________________________________________
>>>>>
>>>>>
#######################################################################
>>>>> #
>>>>> #
>>>>> # CLEANUP
>>>>> #
>>>>> #
>>>>> #
>>>>>
>>>>>
#######################################################################
>>>>> EOC
>>>>>
>>>>>
>>>>> # Archive the logs, we want them anyway
>>>>> logs=(
>>>>> ./*log
>>>>> ./*/logs
>>>>> )
>>>>> if [[ "$logs" ]]; then
>>>>> tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
>>>>> rm -rf "${logs[@]}"
>>>>> fi
>>>>>
>>>>> # stop any processes running inside the chroot
>>>>> failed=false
>>>>> mock_confs=("$WORKSPACE"/*/mocker*)
>>>>> # Clean current jobs mockroot if any
>>>>> for mock_conf_file in "${mock_confs[@]}"; do
>>>>> [[ "$mock_conf_file" ]] || continue
>>>>> echo "Cleaning up mock $mock_conf"
>>>>> mock_root="${mock_conf_file##*/}"
>>>>> mock_root="${mock_root%.*}"
>>>>> my_mock="/usr/bin/mock"
>>>>> my_mock+=" --configdir=${mock_conf_file%/*}"
>>>>> my_mock+=" --root=${mock_root}"
>>>>> my_mock+=" --resultdir=$WORKSPACE"
>>>>>
>>>>> #TODO: investigate why mock --clean fails to umount certain dirs
>>>>> #sometimes, so we can use it instead of manually doing all this.
>>>>> echo "Killing all mock orphan processes, if any."
>>>>> $my_mock \
>>>>> --orphanskill \
>>>>> || {
>>>>> echo "ERROR: Failed to kill orphans on $chroot."
>>>>> failed=true
>>>>> }
>>>>>
>>>>> mock_root="$(\
>>>>> grep \
>>>>> -Po "(?<=config_opts\['root'\] = ')[^']*" \
>>>>> "$mock_conf_file" \
>>>>> )" || :
>>>>> [[ "$mock_root" ]] || continue
>>>>> mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
>>>>> if [[ "$mounts" ]]; then
>>>>> echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
>>>>> fi
>>>>> for mount in "${mounts[@]}"; do
>>>>> sudo umount --lazy "$mount" \
>>>>> || {
>>>>> echo "ERROR: Failed to umount $mount."
>>>>> failed=true
>>>>> }
>>>>> done
>>>>> done
>>>>>
>>>>> # Clean any leftover chroot from other jobs
>>>>> for mock_root in /var/lib/mock/*; do
>>>>> this_chroot_failed=false
>>>>> mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
>>>>> if [[ "$mounts" ]]; then
>>>>> echo "Found mounted dirs inside the chroot $mock_root." \
>>>>> "Trying to umount."
>>>>> fi
>>>>> for mount in "${mounts[@]}"; do
>>>>> sudo umount --lazy "$mount" \
>>>>> || {
>>>>> echo "ERROR: Failed to umount $mount."
>>>>> failed=true
>>>>> this_chroot_failed=true
>>>>> }
>>>>> done
>>>>> if ! $this_chroot_failed; then
>>>>> sudo rm -rf "$mock_root"
>>>>> fi
>>>>> done
>>>>>
>>>>> if $failed; then
>>>>> echo "Aborting."
>>>>> exit 1
>>>>> fi
>>>>>
>>>>> # remove mock system cache, we will setup proxies to do the caching
>>>>> # and this takes lots of space between runs
>>>>> shopt -u nullglob
>>>>> sudo rm -Rf /var/cache/mock/*
>>>>>
>>>>> # restore the permissions in the working dir, as sometimes it leaves
>>>>> # files owned by root and then the 'cleanup workspace' from jenkins
>>>>> # job fails to clean and breaks the jobs
>>>>> sudo chown -R "$USER" "$WORKSPACE"
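As the console output further down shows, a stale mount-table entry (the lago dir reported "not mounted") is enough to set failed=true and abort the whole job. One possible hardening of the umount loop above — a sketch only, assuming the same mock layout — is to unmount deepest paths first and skip entries that are no longer mounted:

```shell
# Sketch: unmount nested mounts deepest-first, and skip paths that are no
# longer mounted so a stale mount-table line does not fail the cleanup.
ordered_mounts() {
    # reverse-lexical sort puts e.g. /root/proc/filesystems before /root/proc
    printf '%s\n' "$@" | sort -r
}

cleanup_mounts() {
    # assumes mount points contain no whitespace, as in the mock layout
    local m
    for m in $(ordered_mounts "$@"); do
        mountpoint -q "$m" || continue      # already gone: nothing to do
        sudo umount --lazy "$m" || echo "ERROR: Failed to umount $m."
    done
}
```

This is not a drop-in replacement for the script above, just an illustration of the ordering and the mountpoint pre-check.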
>>>>>
>>>>> [ovirt_4.0_he-system-tests] $ /bin/bash -xe
>>>>> /tmp/hudson5198775129414653216.sh
>>>>> + echo shell-scripts/mock_cleanup.sh
>>>>> shell-scripts/mock_cleanup.sh
>>>>> + shopt -s nullglob
>>>>> + WORKSPACE=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
>>>>> + cat
>>>>>
>>>>>
_______________________________________________________________________
>>>>>
>>>>>
#######################################################################
>>>>> #
>>>>> #
>>>>> # CLEANUP
>>>>> #
>>>>> #
>>>>> #
>>>>>
>>>>>
#######################################################################
>>>>> + logs=(./*log ./*/logs)
>>>>> + [[ -n ./ovirt-system-tests/logs ]]
>>>>> + tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
>>>>> ./ovirt-system-tests/logs/
>>>>> ./ovirt-system-tests/logs/
>>>>> mocker-fedora-23-x86_64.fc23.he_basic_suite_4.0.sh/
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log
>>>>> ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
>>>>> ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
>>>>>
<
http://mocker-fedora-23-x86_64.fc23.he_basic_suite_4.0.sh/./ovirt-system-...
>>>>> ./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
>>>>>
>>>>>
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log
>>>>> + rm -rf ./ovirt-system-tests/logs
>>>>> + failed=false
>>>>> + mock_confs=("$WORKSPACE"/*/mocker*)
>>>>> + for mock_conf_file in '"${mock_confs[@]}"'
>>>>> + [[ -n <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> ]]
>>>>> + echo 'Cleaning up mock '
>>>>> Cleaning up mock
>>>>> + mock_root=mocker-fedora-23-x86_64.fc23.cfg
>>>>> + mock_root=mocker-fedora-23-x86_64.fc23
>>>>> + my_mock=/usr/bin/mock
>>>>> + my_mock+=' --configdir=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests
>>>>> '>
>>>>> + my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
>>>>> + my_mock+=' --resultdir=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/'>
>>>>> + echo 'Killing all mock orphan processes, if any.'
>>>>> Killing all mock orphan processes, if any.
>>>>> + /usr/bin/mock --configdir=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> --root=mocker-fedora-23-x86_64.fc23 --resultdir=<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/>
>>>>> --orphanskill
>>>>> WARNING: Could not find required logging config file: <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> Using default...
>>>>> INFO: mock.py version 1.2.17 starting (python version = 3.4.3)...
>>>>> Start: init plugins
>>>>> INFO: selinux enabled
>>>>> Finish: init plugins
>>>>> Start: run
>>>>> WARNING: Process ID 115551 still running in chroot. Killing...
>>>>> WARNING: Process ID 115576 still running in chroot. Killing...
>>>>> WARNING: Process ID 115577 still running in chroot. Killing...
>>>>> WARNING: Process ID 115578 still running in chroot. Killing...
>>>>> WARNING: Process ID 116634 still running in chroot. Killing...
>>>>> Finish: run
>>>>> ++ grep -Po
'(?<=config_opts\['\''root'\''\] =
'\'')[^'\'']*' <
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-te...
>>>>> >
>>>>> + mock_root=fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad
>>>>> + [[ -n fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad ]]
>>>>> + mounts=($(mount | awk '{print $3}' | grep
"$mock_root"))
>>>>> ++ mount
>>>>> ++ awk '{print $3}'
>>>>> ++ grep fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad
>>>>> + [[ -n
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc
>>>>> ]]
>>>>> + echo 'Found mounted dirs inside the chroot . Trying to
umount.'
>>>>> Found mounted dirs inside the chroot . Trying to umount.
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/sys
>>>>> sh: [115551: 1 (255)] tcsetattr: Inappropriate ioctl for device
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/dev/shm
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/dev/pts
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/var/cache/yum
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root<
>>>>>
http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests
>>>>> >
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/run/libvirt
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/var/lib/lago
>>>>> umount:
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/var/lib/lago:
>>>>> not mounted
>>>>> + echo 'ERROR: Failed to umount
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/var/lib/lago.'
>>>>> ERROR: Failed to umount
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/var/lib/lago.
>>>>> + failed=true
>>>>> + for mount in '"${mounts[@]}"'
>>>>> + sudo umount --lazy
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc/filesystems
>>>>> umount:
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc/filesystems:
>>>>> mountpoint not found
>>>>> + echo 'ERROR: Failed to umount
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc/filesystems.'
>>>>> ERROR: Failed to umount
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644/root/proc/filesystems.
>>>>> + failed=true
>>>>> + for mock_root in '/var/lib/mock/*'
>>>>> + this_chroot_failed=false
>>>>> + mounts=($(mount | awk '{print $3}' | grep
"$mock_root"))
>>>>> ++ mount
>>>>> ++ awk '{print $3}'
>>>>> ++ grep
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644
>>>>> + :
>>>>> + [[ -n '' ]]
>>>>> + false
>>>>> + sudo rm -rf
>>>>>
/var/lib/mock/fedora-23-x86_64-0c362156a2fa4a935ea8b988eb73b2ad-91644
>>>>> + true
>>>>> + echo Aborting.
>>>>> Aborting.
>>>>> + exit 1
>>>>> POST BUILD TASK : FAILURE
>>>>> END OF POST BUILD TASK : 1
>>>>> Recording test results
>>>>> ERROR: Step ‘Publish JUnit test result report’ failed: No test
report
>>>>> files were found. Configuration error?
>>>>> Archiving artifacts
>>>>> _______________________________________________
>>>>> Infra mailing list
>>>>> Infra(a)ovirt.org
>>>>>
http://lists.ovirt.org/mailman/listinfo/infra
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Eyal Edri
>>> Associate Manager
>>> RHEV DevOps
>>> EMEA ENG Virtualization R&D
>>> Red Hat Israel
>>>
>>> phone: +972-9-7692018
>>> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>>>
>>
>>
>>
>> --
>> Best regards
>> Tolik Litovsky
>> RHEV-H Team
>> Red Hat
>>
>> Red Hat: trustworthy, transformative technology. Powered by the
>> community. Connect at redhat.com
>>
>
>
>
> --
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
--
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel
phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)