[JIRA] (OVIRT-914) Better arch support for mock_runner.sh
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-914?page=com.atlassian.jira... ]
Barak Korren updated OVIRT-914:
-------------------------------
Labels: mock_runner.sh standard-ci (was: )
> Better arch support for mock_runner.sh
> --------------------------------------
>
> Key: OVIRT-914
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-914
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
> Labels: mock_runner.sh, standard-ci
>
> We managed to use {{mock_runner.sh}} in multi-arch environments so far because it was flexible enough to let us select the chroot file.
> The issue is that mock_runner does not actually *know* the arch we are running on, so we can't (a sketch follows the list):
> * do different mounts per-arch
> * install different packages per-arch
> * have different {{check_*}} scripts per-arch
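> A minimal, purely illustrative sketch (not an existing mock_runner.sh feature): assuming mock_runner.sh exported the chroot arch in a hypothetical {{STD_CI_ARCH}} variable, an automation script could branch per-arch roughly like this:
> {code}
> #!/bin/bash -e
> # Hypothetical: STD_CI_ARCH is assumed to be set by mock_runner.sh;
> # fall back to uname -m when it is not.
> arch="${STD_CI_ARCH:-$(uname -m)}"
> case "$arch" in
>     x86_64)
>         extra_packages=(qemu-kvm)    # x86_64-only packages/mounts/tests
>         ;;
>     ppc64le|s390x)
>         extra_packages=(qemu-kvm-ev) # alternate-arch variations
>         ;;
>     *)
>         extra_packages=()
>         ;;
> esac
> echo "Running checks for $arch with extra packages: ${extra_packages[*]}"
> {code}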
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-1013) Automate standard-CI job creation
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1013?page=com.atlassian.jir... ]
Barak Korren updated OVIRT-1013:
--------------------------------
Labels: standard-ci (was: )
> Automate standard-CI job creation
> ---------------------------------
>
> Key: OVIRT-1013
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1013
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
> Labels: standard-ci
>
> The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
> But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'Jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
> The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
> 1. The platforms a certain project needs to be built and tested on.
> 2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
> We need to define a way to specify the details above so that the CI system can automatically detect them (one possible shape is sketched below).
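> One possible shape, purely a sketch and not a decided design: let each project declare these details in its own tree (e.g. distro-suffixed automation scripts for the target platforms, plus a hypothetical {{automation/release_branches}} mapping file) and have the job generator read them:
> {code}
> #!/bin/bash -e
> # Hypothetical sketch: derive the two missing details from the project tree.
> # File names and suffix conventions here are illustrative, not final.
> project_dir="${1:?usage: $0 <project-dir>}"
>
> # 1. Target platforms: taken from distro-suffixed script names,
> #    e.g. automation/check-patch.sh.el7 -> el7
> platforms=$(
>     find "$project_dir/automation" -maxdepth 1 -name 'check-patch.sh.*' \
>         -printf '%f\n' 2>/dev/null | sed 's/^check-patch\.sh\.//' | sort -u | tr '\n' ' '
> )
>
> # 2. Branch -> oVirt release mapping: lines of "<branch> <ovirt-release>"
> #    in a hypothetical automation/release_branches file.
> while read -r branch release; do
>     [[ -z "$branch" || "$branch" == \#* ]] && continue
>     echo "branch '$branch' targets oVirt $release on: $platforms"
> done < "$project_dir/automation/release_branches"
> {code}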
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-1013) Automate standard-CI job creation
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1013?page=com.atlassian.jir... ]
Barak Korren commented on OVIRT-1013:
-------------------------------------
Since this may be a developer-visible change, a message about this was sent to devel:
http://lists.ovirt.org/pipermail/devel/2017-January/029161.html
> Automate standard-CI job creation
> ---------------------------------
>
> Key: OVIRT-1013
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1013
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
>
> The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
> But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'Jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
> The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
> 1. The platforms a certain project needs to be built and tested on.
> 2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
> We need to define a way to specify the details above so that the CI system can automatically detect them.
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-1013) Automate standard-CI job creation
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1013?page=com.atlassian.jir... ]
Barak Korren updated OVIRT-1013:
--------------------------------
Description:
The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'Jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
1. The platforms a certain project needs to be built and tested on.
2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
We need to define a way to specify the details above so that the CI system can automatically detect them.
was:
The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
1. The platforms a certain project needs to be built and tested on.
2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
We need to define a way to specify the details above so that the CI system can automatically detect them.
> Automate standard-CI job creation
> ---------------------------------
>
> Key: OVIRT-1013
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1013
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
>
> The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
> But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'Jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
> The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
> 1. The platforms a certain project needs to be built and tested on.
> 2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
> We need to define a way to specify the details above so that the CI system can automatically detect them.
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-1013) Automate standard-CI job creation
by Barak Korren (oVirt JIRA)
Barak Korren created OVIRT-1013:
-----------------------------------
Summary: Automate standard-CI job creation
Key: OVIRT-1013
URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1013
Project: oVirt - virtualization made easy
Issue Type: Improvement
Components: Jenkins
Reporter: Barak Korren
Assignee: infra
The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
1. The platforms a certain project needs to be built and tested on.
2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
We need to define a way to specify the details above so that the CI system can automatically detect them.
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-1013) Automate standard-CI job creation
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-1013?page=com.atlassian.jir... ]
Barak Korren updated OVIRT-1013:
--------------------------------
Epic Link: OVIRT-400
> Automate standard-CI job creation
> ---------------------------------
>
> Key: OVIRT-1013
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-1013
> Project: oVirt - virtualization made easy
> Issue Type: Improvement
> Components: Jenkins
> Reporter: Barak Korren
> Assignee: infra
>
> The premise of the CI standard is simple: the developers place a simple script in the 'automation' directory, and the infra team takes care of making Jenkins run it when it should.
> But we haven't been able to fully deliver on this premise yet. Getting a project to work with the CI standard also requires writing some YAML in the 'jenkins' repo. Even worse, this YAML needs to be maintained over time as new project branches get created, new platforms get targeted, etc.
> The core reason for having to write YAML is that there are two technical details we need to know in order to run the CI jobs, but they are not specified in a way that allows detecting them automatically. Those details are:
> 1. The platforms a certain project needs to be built and tested on.
> 2. The branches of the project that CI needs to look at, and how they map to oVirt releases.
> We need to define a way to specify the details above so that the CI system can automatically detect them.
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
[JIRA] (OVIRT-894) Add support of secrets and credentials in Standard-CI
by Barak Korren (oVirt JIRA)
[ https://ovirt-jira.atlassian.net/browse/OVIRT-894?page=com.atlassian.jira... ]
Barak Korren updated OVIRT-894:
-------------------------------
Summary: Add support of secrets and credentials in Standard-CI (was: Add support of sercers and credentials in Standard-CI)
> Add support of secrets and credentials in Standard-CI
> -----------------------------------------------------
>
> Key: OVIRT-894
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-894
> Project: oVirt - virtualization made easy
> Issue Type: New Feature
> Components: General
> Reporter: Barak Korren
> Assignee: infra
> Labels: standard-ci
>
> Some tasks we'd like to carry out in Standard-CI require credentials. Examples of such tasks include:
> # Publishing artifacts to repositories (e.g. containers to Dockerhub, puppet code to a Puppetmaster)
> # Launching remote processes (checking code with remote services)
> While Jenkins supports storing and managing secret data, we have not yet enabled a way to use this capability from Standard-CI (an illustrative sketch follows).
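> For illustration only, since no such mechanism exists in Standard-CI yet: if Jenkins were to inject the stored credentials into the job environment, an automation script could consume them roughly like this. The variable names ({{DOCKERHUB_USER}}, {{DOCKERHUB_PASS}}) are made up for the example:
> {code}
> #!/bin/bash -e
> # Hypothetical: credentials injected by the CI system as environment
> # variables; fail early and loudly (without echoing them) if missing.
> : "${DOCKERHUB_USER:?credential not provided by the CI system}"
> : "${DOCKERHUB_PASS:?credential not provided by the CI system}"
>
> # Note: passing the password on the command line exposes it to local
> # process listings; a real implementation would want a safer channel.
> docker login -u "$DOCKERHUB_USER" -p "$DOCKERHUB_PASS"
> docker push "ovirt/some-image:latest"
> {code}
> Whatever mechanism is chosen would also need to mask the secret values in the job console output.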
--
This message was sent by Atlassian JIRA
(v1000.670.2#100024)
Re: ** PROBLEM Service Alert: ovirt-mirrorchecker/www.gtlib.gatech.edu/pub/oVirt/pub mirror site last sync is WARNING **
by Nadav Goldin
Looks good now, thanks!
Nadav.
On Tue, Jan 10, 2017 at 1:38 AM, Neil Bright <ncbright(a)gatech.edu> wrote:
> Hi Nadav,
>
> I was fixing some things earlier this afternoon (around noon US/Eastern). Are you still having reachability issues?
>
>> On Jan 9, 2017, at 5:06 PM, Nadav Goldin <ngoldin(a)redhat.com> wrote:
>>
>> Hi,
>> It seems like the oVirt mirror went out of sync again; can you have a look?
>>
>> Thanks,
>>
>> Nadav.
>>
>>
>>
>> ---------- Forwarded message ----------
>> From: icinga <icinga(a)monitoring.ovirt.org>
>> Date: Mon, Jan 9, 2017 at 11:51 PM
>> Subject: ** PROBLEM Service Alert:
>> ovirt-mirrorchecker/www.gtlib.gatech.edu/pub/oVirt/pub mirror site
>> last sync is WARNING **
>> To: ngoldin(a)redhat.com
>>
>>
>> ***** Icinga *****
>>
>> Notification Type: PROBLEM
>>
>> Service: www.gtlib.gatech.edu/pub/oVirt/pub mirror site last sync
>> Host: ovirt-mirrorchecker
>> Address: 66.187.230.105
>> State: WARNING
>>
>> Date/Time: Mon Jan 9 21:51:19 UTC 2017
>>
>> Additional Info:
>>
>> WARNING - 237938 seconds since last sync, which are 66.0939 hours.
>
> +======================================================================+
> Neil Bright (ncbright(a)gatech.edu) (404) 385-6954
> http://www.pace.gatech.edu
> 258 Fourth Street, Rich Bldg, Rm 321 / Atlanta, GA 30332-0700
>
>
Build failed in Jenkins: ovirt_3.6_he-system-tests #803
by jenkins@jenkins.phx.ovirt.org
See <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/803/changes>
Changes:
[Yaniv Kaul] Allow discard for storage
------------------------------------------
[...truncated 653 lines...]
Finish: shell
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@ Fri Jan 6 03:00:01 UTC 2017 automation/he_basic_suite_3.6.sh chroot finished
@@ took 410 seconds
@@ rc = 1
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
========== Scrubbing chroot
mock \
--configdir="<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-tests"> \
--root="mocker-epel-7-x86_64.el7" \
--resultdir="./mock_logs.FJcynAI8/mocker-epel-7-x86_64.el7.scrub" \
--scrub=chroot
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te....> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Start: scrub ['chroot']
INFO: scrubbing chroot for mocker-epel-7-x86_64.el7
Finish: scrub ['chroot']
Finish: run
Scrub chroot took 6 seconds
============================
##########################################################
## Fri Jan 6 03:00:07 UTC 2017 Finished env: el7:epel-7-x86_64
## took 416 seconds
## rc = 1
##########################################################
find: 'logs': No such file or directory
No log files found, check command output
##!########################################################
Collecting mock logs
'./mock_logs.FJcynAI8/mocker-epel-7-x86_64.el7.clean_rpmdb' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.clean_rpmdb'
'./mock_logs.FJcynAI8/mocker-epel-7-x86_64.el7.he_basic_suite_3.6.sh' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.he_basic_suite_3.6.sh'
'./mock_logs.FJcynAI8/mocker-epel-7-x86_64.el7.init' -> 'exported-artifacts/mock_logs/mocker-epel-7-x86_64.el7.init'
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_he-system-tests] $ /bin/bash -xe /tmp/hudson5037284079566254380.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/803/artifact/expor...>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/803/artifact/expor...>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/803/artifact/expor...>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -x
echo "shell-scripts/mock_cleanup.sh"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
shopt -s nullglob
WORKSPACE="${WORKSPACE:-$PWD}"
UMOUNT_RETRIES="${UMOUNT_RETRIES:-3}"
UMOUNT_RETRY_DELAY="${UMOUNT_RETRY_DELAY:-1s}"
safe_umount() {
local mount="${1:?}"
local attempt
for ((attempt=0 ; attempt < $UMOUNT_RETRIES ; attempt++)); do
# If this is not the 1st time through the loop, sleep a while to let
# the problem "solve itself"
(( attempt > 0 )) && sleep "$UMOUNT_RETRY_DELAY"
# Try to umount
sudo umount --lazy "$mount" && return 0
# See if the mount is already gone despite the umount failing
findmnt --kernel --first "$mount" > /dev/null || return 0
done
echo "ERROR: Failed to umount $mount."
return 1
}
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
[[ "$mock_conf_file" ]] || continue
echo "Cleaning up mock $mock_conf"
mock_root="${mock_conf_file##*/}"
mock_root="${mock_root%.*}"
my_mock="/usr/bin/mock"
my_mock+=" --configdir=${mock_conf_file%/*}"
my_mock+=" --root=${mock_root}"
my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs sometimes,
#so we can use it instead of manually doing all this.
echo "Killing all mock orphan processes, if any."
$my_mock \
--orphanskill \
|| {
echo "ERROR: Failed to kill orphans on $chroot."
failed=true
}
mock_root="$(\
grep \
-Po "(?<=config_opts\['root'\] = ')[^']*" \
"$mock_conf_file" \
)" || :
[[ "$mock_root" ]] || continue
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $chroot. Trying to umount."
fi
for mount in "${mounts[@]}"; do
safe_umount "$mount" || failed=true
done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
this_chroot_failed=false
mounts=($(cut -d\ -f2 /proc/mounts | grep "$mock_root" | sort -r)) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $mock_root." \
"Trying to umount."
fi
for mount in "${mounts[@]}"; do
safe_umount "$mount" && continue
# If we got here, we failed $UMOUNT_RETRIES attempts so we should make
# noise
failed=true
this_chroot_failed=true
done
if ! $this_chroot_failed; then
sudo rm -rf "$mock_root"
fi
done
# remove mock caches that are older than 2 days:
find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | \
xargs -0 -tr sudo rm -rf
# We make no effort to leave around caches that may still be in use because
# packages installed in them may go out of date, so may as well recreate them
# Drop all left over libvirt domains
for UUID in $(virsh list --all --uuid); do
virsh destroy $UUID || :
sleep 2
virsh undefine --remove-all-storage --storage vda --snapshots-metadata $UUID || :
done
if $failed; then
echo "Cleanup script failed, propegating failure to job"
exit 1
fi
[ovirt_3.6_he-system-tests] $ /bin/bash -x /tmp/hudson5918502768537555975.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ UMOUNT_RETRIES=3
+ UMOUNT_RETRY_DELAY=1s
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-epel-7-x86_64.el7.cfg
+ mock_root=mocker-epel-7-x86_64.el7
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-epel-7-x86_64.el7'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-tests> --root=mocker-epel-7-x86_64.el7 --resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/> --orphanskill
WARNING: Could not find required logging config file: <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te....> Using default...
INFO: mock.py version 1.2.21 starting (python version = 3.5.1)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/ovirt-system-te...>
+ mock_root=epel-7-x86_64-e30b297ddc7e818ebfbed823dcce8eb8
+ [[ -n epel-7-x86_64-e30b297ddc7e818ebfbed823dcce8eb8 ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep epel-7-x86_64-e30b297ddc7e818ebfbed823dcce8eb8
+ :
+ [[ -n '' ]]
+ find /var/cache/mock/ -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0
+ xargs -0 -tr sudo rm -rf
++ virsh list --all --uuid
+ false
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Archiving artifacts