Python code style: introducing isort
by Sandro Bonazzola
Hi,
in the projects maintained by the integration team we have tried to adhere to
some styling rules while writing Python code; one of these concerns the
import lines. After a long time of doing this manually, we discovered isort
( https://pypi.python.org/pypi/isort/4.2.5 ).
We run it before pushing our code to Gerrit, and a few days ago I sent
a patch for the Python code within ovirt-engine:
https://gerrit.ovirt.org/61964
The configuration we're currently using is:
$ cat ~/.isort.cfg
[settings]
line_length=79
known_standard_library=configparser,Cheetah.Template
force_single_line=True
default_section=FIRSTPARTY
known_otopi=otopi
known_host_deploy=ovirt_host_deploy
known_ovirt_engine=ovirt_engine
known_ovirt_engine_setup=ovirt_engine_setup
known_ovirt_setup_lib=ovirt_setup_lib
known_vdsm=vdsm
known_ovirt_hosted_engine_setup=ovirt_hosted_engine_setup
sections=FUTURE,STDLIB,FIRSTPARTY,OTOPI,VDSM,HOST_DEPLOY,OVIRT_ENGINE,OVIRT_ENGINE_SETUP,OVIRT_SETUP_LIB,OVIRT_HOSTED_ENGINE_SETUP,THIRDPARTY,LOCALFOLDER
lines_between_types=2
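To illustrate what the custom sections in this configuration do, here is a minimal sketch (my own toy code, not isort itself) of the bucketing logic the known_* options ask for: each import is assigned to a named section by its top-level package, sections are emitted in the configured order, one import per line as force_single_line=True requires. The SECTIONS table below is a trimmed-down stand-in for the full config.

```python
# Toy sketch of isort's section bucketing, NOT the real implementation.
# The section table is a reduced example of the config above.
SECTIONS = [
    ("FUTURE", {"__future__"}),
    ("STDLIB", {"os", "sys", "configparser"}),
    ("OTOPI", {"otopi"}),
    ("VDSM", {"vdsm"}),
    ("OVIRT_ENGINE", {"ovirt_engine"}),
]
FALLBACK = "THIRDPARTY"  # anything not matched by a known_* option


def section_for(module):
    """Return the section a module belongs to, based on its top-level name."""
    top = module.split(".")[0]
    for name, members in SECTIONS:
        if top in members:
            return name
    return FALLBACK


def group_imports(modules):
    """Emit 'import x' lines grouped by section, blank line between groups."""
    order = [name for name, _ in SECTIONS] + [FALLBACK]
    buckets = {}
    for m in modules:
        buckets.setdefault(section_for(m), []).append(m)
    blocks = []
    for name in order:
        if name in buckets:
            blocks.append("\n".join("import %s" % m for m in sorted(buckets[name])))
    return "\n\n".join(blocks)


print(group_imports(["vdsm.network", "os", "otopi.plugin", "requests"]))
```

Running this prints the os import first (STDLIB), then otopi.plugin, then vdsm.network, then requests as THIRDPARTY, each in its own group, which is the ordering the real tool enforces for us.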
It has been proposed to add 'isort --check-only' to check-patch for Python
projects, in addition to the pep8 style checking.
It has also been proposed to reach consensus on the isort configuration, so
that all the Python code within the oVirt project has the same styling.
I'd like to get some feedback about these proposals in order to decide how
to proceed.
Thanks,
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
otopi based program testing: coverage reports
by Sandro Bonazzola
The otopi 1.6.0 package currently in ovirt-master-snapshot includes
a new feature that allows you to collect coverage reports from your
executions. In order to enable the feature you'll need to provide a
configuration file like:

8<--------------------------------------
[run]
branch = True

[report]
exclude_lines =
    # Have to re-enable the standard pragma
    pragma: no cover

    # Don't complain about missing debug-only code:
    def __repr__
    if self\.debug

    # Don't complain if tests don't hit defensive assertion code:
    raise AssertionError
    raise NotImplementedError

    # Don't complain if non-runnable code isn't run:
    if 0:
    if __name__ == .__main__.:

ignore_errors = True
8<------------------------------------

Be sure to have python-coverage and libselinux-python installed on your
system and then execute:

OTOPI_COVERAGE=1 COVERAGE_PROCESS_START=<your config> <your command>
where <your config> is the path to your config file and <your command> may
be engine-setup, engine-cleanup, ovirt-host-deploy, hosted-engine --deploy,
or any other command which uses the otopi framework.
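For readers wondering how COVERAGE_PROCESS_START is consumed: this mirrors coverage.py's standard process-startup mechanism, and the otopi hook is assumed (not shown in this mail) to call something equivalent early in each process. A minimal sketch:

```python
# Sketch of the COVERAGE_PROCESS_START mechanism, mirroring what
# coverage.py's process_startup() does. The otopi integration is
# assumed to invoke an equivalent hook when OTOPI_COVERAGE=1.
import os


def maybe_start_coverage():
    """Start coverage collection if COVERAGE_PROCESS_START names a config file."""
    config = os.environ.get("COVERAGE_PROCESS_START")
    if not config:
        return None  # feature not enabled for this process
    import coverage  # provided by the python-coverage package
    cov = coverage.Coverage(config_file=config)
    cov.start()
    return cov
```

The point of the environment-variable indirection is that subprocesses spawned by the tool inherit it, so every Python process in the run can contribute to the combined report.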
In order to generate the report:

coverage html -d coverage_html_report
This feature is already leveraged in otopi check-patches and check-merged
jobs: http://jenkins.ovirt.org/search/?q=otopi_master_check
More test jobs will possibly follow.
New tests to improve coverage, improvements to the coverage configuration,
suggestions, and ports to other projects are welcome.
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
ovirt-image-upload fails with NFS error when using a local disk
by Lynn Dixon
Hey all,
I am using oVirt 4.0 and trying to upload an OVA into my export domain
using the ovirt-image-uploader tool.
My Export domain is a local disk on the host and is NOT using NFS.
However, it seems the ovirt-image-uploader tool assumes my domain is NFS,
and it gives the following error:

ERROR: mount.nfs: Failed to resolve server None: Name or service not known

Is there a way to import OVAs into the Export domain when it's using a
local disk?
*Lynn Dixon* | Red Hat Certified Architect #100-006-188
*Sr. Cloud Consultant* | Cloud Management Practice
Google Voice: 423-618-1414
Cell/Text: 423-774-3188
Click here to view my Certification Portfolio <http://red.ht/1XMX2Mi>
Fwd: Build failed in Jenkins: ovirt_3.6_he-system-tests #339
by Pavel Zhukov
Hi,
Hosted engine failed to install, and for some reason OST did not fail
because of that.
http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/338/consoleFull
08:15:52 |- /var/lib/cloud/instance/scripts/runcmd: line 2: /usr/bin/engine-setup: No such file or directory
08:15:52 |- HE_APPLIANCE_ENGINE_SETUP_FAIL
08:15:52 [ ERROR ] Engine setup failed on the appliance
08:15:52 [ ERROR ] Failed to execute stage 'Closing up': Engine setup failed on the appliance Please check its log on the appliance.
08:15:52 [ INFO ] Stage: Clean up
08:15:53 [ INFO ] Generating answer file '/var/lib/ovirt-hosted-engine-setup/answers/answers-20160823041551.conf'
08:15:53 [ INFO ] Stage: Pre-termination
08:15:53 [ INFO ] Stage: Termination
08:15:53 [ ERROR ] Hosted Engine deployment failed: this system is not reliable, please check the issue, fix and redeploy
---------- Forwarded message ----------
From: <jenkins(a)jenkins.phx.ovirt.org>
Date: Tue, Aug 23, 2016 at 2:30 PM
Subject: Build failed in Jenkins: ovirt_3.6_he-system-tests #339
To: sbonazzo(a)redhat.com, infra(a)ovirt.org, lveyde(a)redhat.com
See <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/339/changes>
Changes:
[Yaniv Kaul] Combine Engine and Storage VMs.
------------------------------------------
[...truncated 1228 lines...]
## rc = 1
##########################################################
##! ERROR vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
##! Last 20 log enties: logs/mocker-fedora-23-x86_64.
fc23.he_basic_suite_3.6.sh/he_basic_suite_3.6.sh.log
##!
+ true
+ env_cleanup
+ echo '#########################'
#########################
+ local res=0
+ local uuid
+ echo '======== Cleaning up'
======== Cleaning up
+ [[ -e <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/deployment-he_basic_suite_3.6> ]]
+ echo '----------- Cleaning with lago'
----------- Cleaning with lago
+ lago --workdir <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/deployment-he_basic_suite_3.6> destroy --yes
--all-prefixes
+ echo '----------- Cleaning with lago done'
----------- Cleaning with lago done
+ [[ 0 != \0 ]]
+ echo '======== Cleanup done'
======== Cleanup done
+ exit 0
Took 2133 seconds
===================================
##!
##! ERROR ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
##!########################################################
##########################################################
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'
#
# Required jjb vars:
# version
#
VERSION=3.6
SUITE_TYPE=
WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"
if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi
[ovirt_3.6_he-system-tests] $ /bin/bash -xe /tmp/
hudson8442885515320324019.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=3.6
+ SUITE_TYPE=
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ OVIRT_SUITE=3.6
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_3.6_he-
system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/339/
artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/339/
artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/exported-artifacts> ]]
+ mv <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/exported-artifacts/lago_logs> <
http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/exported-artifacts/test_logs> <
http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/339/
artifact/exported-artifacts/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Match found for :.* : True
Logical operation result is TRUE
Running script : #!/bin/bash -xe
echo "shell-scripts/mock_cleanup.sh"
shopt -s nullglob
WORKSPACE="$PWD"
# Make clear this is the cleanup, helps reading the jenkins logs
cat <<EOC
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
EOC
# Archive the logs, we want them anyway
logs=(
./*log
./*/logs
)
if [[ "$logs" ]]; then
tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
rm -rf "${logs[@]}"
fi
# stop any processes running inside the chroot
failed=false
mock_confs=("$WORKSPACE"/*/mocker*)
# Clean current jobs mockroot if any
for mock_conf_file in "${mock_confs[@]}"; do
[[ "$mock_conf_file" ]] || continue
echo "Cleaning up mock $mock_conf"
mock_root="${mock_conf_file##*/}"
mock_root="${mock_root%.*}"
my_mock="/usr/bin/mock"
my_mock+=" --configdir=${mock_conf_file%/*}"
my_mock+=" --root=${mock_root}"
my_mock+=" --resultdir=$WORKSPACE"
#TODO: investigate why mock --clean fails to umount certain dirs
sometimes,
#so we can use it instead of manually doing all this.
echo "Killing all mock orphan processes, if any."
$my_mock \
--orphanskill \
|| {
echo "ERROR: Failed to kill orphans on $chroot."
failed=true
}
mock_root="$(\
grep \
-Po "(?<=config_opts\['root'\] = ')[^']*" \
"$mock_conf_file" \
)" || :
[[ "$mock_root" ]] || continue
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $chroot. Trying to
umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
}
done
done
# Clean any leftover chroot from other jobs
for mock_root in /var/lib/mock/*; do
this_chroot_failed=false
mounts=($(mount | awk '{print $3}' | grep "$mock_root")) || :
if [[ "$mounts" ]]; then
echo "Found mounted dirs inside the chroot $mock_root." \
"Trying to umount."
fi
for mount in "${mounts[@]}"; do
sudo umount --lazy "$mount" \
|| {
echo "ERROR: Failed to umount $mount."
failed=true
this_chroot_failed=true
}
done
if ! $this_chroot_failed; then
sudo rm -rf "$mock_root"
fi
done
if $failed; then
echo "Aborting."
exit 1
fi
# remove mock system cache, we will setup proxies to do the caching and this
# takes lots of space between runs
shopt -u nullglob
sudo rm -Rf /var/cache/mock/*
# restore the permissions in the working dir, as sometimes it leaves files
# owned by root and then the 'cleanup workspace' from jenkins job fails to
# clean and breaks the jobs
sudo chown -R "$USER" "$WORKSPACE"
[ovirt_3.6_he-system-tests] $ /bin/bash -xe /tmp/
hudson2384388653446686967.sh
+ echo shell-scripts/mock_cleanup.sh
shell-scripts/mock_cleanup.sh
+ shopt -s nullglob
+ WORKSPACE=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
+ cat
_______________________________________________________________________
#######################################################################
# #
# CLEANUP #
# #
#######################################################################
+ logs=(./*log ./*/logs)
+ [[ -n ./ovirt-system-tests/logs ]]
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs
./ovirt-system-tests/logs/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.he_
basic_suite_3.6.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
he_basic_suite_3.6.sh/he_basic_suite_3.6.sh.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
init/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.
clean_rpmdb/stdout_stderr.log
+ rm -rf ./ovirt-system-tests/logs
+ failed=false
+ mock_confs=("$WORKSPACE"/*/mocker*)
+ for mock_conf_file in '"${mock_confs[@]}"'
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]]
+ echo 'Cleaning up mock '
Cleaning up mock
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg
+ mock_root=mocker-fedora-23-x86_64.fc23
+ my_mock=/usr/bin/mock
+ my_mock+=' --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-
system-tests/ws/ovirt-system-tests'>
+ my_mock+=' --root=mocker-fedora-23-x86_64.fc23'
+ my_mock+=' --resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-
system-tests/ws/'>
+ echo 'Killing all mock orphan processes, if any.'
Killing all mock orphan processes, if any.
+ /usr/bin/mock --configdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-
system-tests/ws/ovirt-system-tests> --root=mocker-fedora-23-x86_64.fc23
--resultdir=<http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/>
--orphanskill
WARNING: Could not find required logging config file: <
http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/logging.ini.> Using default...
INFO: mock.py version 1.2.17 starting (python version = 3.4.3)...
Start: init plugins
INFO: selinux enabled
Finish: init plugins
Start: run
Finish: run
++ grep -Po '(?<=config_opts\['\''root'\''\] = '\'')[^'\'']*' <
http://jenkins.ovirt.org/job/ovirt_3.6_he-system-tests/ws/
ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg>
+ mock_root=fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c
+ [[ -n fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c ]]
+ mounts=($(mount | awk '{print $3}' | grep "$mock_root"))
++ mount
++ awk '{print $3}'
++ grep fedora-23-x86_64-235bec7d0621e95d1cae73d7cf9dc27c
+ :
+ [[ -n '' ]]
+ false
+ shopt -u nullglob
+ sudo rm -Rf /var/cache/mock/fedora-23-x86_64-
235bec7d0621e95d1cae73d7cf9dc27c
+ sudo chown -R jenkins <http://jenkins.ovirt.org/job/
ovirt_3.6_he-system-tests/ws/>
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 1
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files
were found. Configuration error?
Archiving artifacts
_______________________________________________
Infra mailing list
Infra(a)ovirt.org
http://lists.ovirt.org/mailman/listinfo/infra
--
Pavel Zhukov
Software Engineer
RHEV Devops
IRC: landgraf
Problems in engine pom files
by Sandro Bonazzola
It's a rainy Saturday and I'm getting annoyed, so I'm looking at the future;
it looks like, at least in terms of Maven, we have bad weather there as well:
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must
be unique: ${engine.groupId}:common:test-jar -> duplicate declaration of
version ${engine.version} @ org.ovirt.engine.core:utils:[unknown-version],
/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/utils/pom.xml,
line 155, column 17
[ERROR] 'build.plugins.plugin.(groupId:artifactId)' must be unique but
found duplicate declaration of plugin org.codehaus.mojo:exec-maven-plugin @
org.ovirt.engine.api:restapi-definition:[unknown-version],
/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/restapi/interface/definition/pom.xml,
line 213, column 15
[WARNING] 'build.plugins.plugin.version' for
org.apache.maven.plugins:maven-dependency-plugin is missing. @
org.ovirt.engine.api:restapi-definition:[unknown-version],
/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/restapi/interface/definition/pom.xml,
line 72, column 15
@
[ERROR] The build could not read 2 projects -> [Help 1]
[ERROR]
[ERROR] The project org.ovirt.engine.core:utils:4.1.0-SNAPSHOT
(/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/utils/pom.xml)
has 1 error
[ERROR] 'dependencies.dependency.(groupId:artifactId:type:classifier)'
must be unique: ${engine.groupId}:common:test-jar -> duplicate declaration
of version ${engine.version} @
org.ovirt.engine.core:utils:[unknown-version],
/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/utils/pom.xml,
line 155, column 17
[ERROR]
[ERROR] The project
org.ovirt.engine.api:restapi-definition:4.1.0-SNAPSHOT
(/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/restapi/interface/definition/pom.xml)
has 1 error
[ERROR] 'build.plugins.plugin.(groupId:artifactId)' must be unique but
found duplicate declaration of plugin org.codehaus.mojo:exec-maven-plugin @
org.ovirt.engine.api:restapi-definition:[unknown-version],
/builddir/build/BUILD/ovirt-engine-4.1.0/backend/manager/modules/restapi/interface/definition/pom.xml,
line 213, column 15
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
(got while building engine on Fedora rawhide, aka fc26)
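For reference, the first error complains about a duplicated <dependency> declaration, so presumably the fix is to keep exactly one entry. A hypothetical sketch of the single declaration that would remain in utils/pom.xml, with coordinates taken from the error message (the <scope> element is my assumption, not taken from the actual pom):

```xml
<!-- Sketch only: one declaration of the test-jar dependency should
     survive; the duplicate Maven reports at line 155 would be deleted.
     <scope>test</scope> is an assumption for illustration. -->
<dependency>
  <groupId>${engine.groupId}</groupId>
  <artifactId>common</artifactId>
  <version>${engine.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
```

The same reasoning applies to the second error: only one <plugin> entry for org.codehaus.mojo:exec-maven-plugin may appear in a build section, with multiple <execution> elements under it if several runs are needed.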
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
qemu-kvm-ev-2.3.0-31.el7_2.21.1 available for testing on x86_64, ppc64le and aarch64
by Sandro Bonazzola
qemu-kvm-ev-2.3.0-31.el7_2.21.1 has been tagged for testing for CentOS Virt
SIG and is already in testing repositories.
It's now available for x86_64, ppc64le and aarch64.
Please help testing and providing feedback, thanks.
We plan to move to stable repo around Wednesday next week.
ChangeLog since previous release:
* Fri Aug 19 2016 Sandro Bonazzola <sbonazzo(a)redhat.com> - ev-2.3.0-31.el7_2.21
- Removing RH branding from package name

* Tue Aug 02 2016 Miroslav Rezanina <mrezanin(a)redhat.com> - rhev-2.3.0-31.el7_2.21
- kvm-block-iscsi-avoid-potential-overflow-of-acb-task-cdb.patch [bz#1358997]
- Resolves: bz#1358997
  (CVE-2016-5126 qemu-kvm-rhev: Qemu: block: iscsi: buffer overflow in iscsi_aio_ioctl [rhel-7.2.z])

* Wed Jul 27 2016 Miroslav Rezanina <mrezanin(a)redhat.com> - rhev-2.3.0-31.el7_2.20
- kvm-virtio-error-out-if-guest-exceeds-virtqueue-size.patch [bz#1359731]
- Resolves: bz#1359731
  (EMBARGOED CVE-2016-5403 qemu-kvm-rhev: Qemu: virtio: unbounded memory allocation on host via guest leading to DoS [rhel-7.2.z])

* Wed Jul 20 2016 Miroslav Rezanina <mrezanin(a)redhat.com> - rhev-2.3.0-31.el7_2.19
- kvm-qemu-sockets-use-qapi_free_SocketAddress-in-cleanup.patch [bz#1354090]
- kvm-tap-use-an-exit-notifier-to-call-down_script.patch [bz#1354090]
- kvm-slirp-use-exit-notifier-for-slirp_smb_cleanup.patch [bz#1354090]
- kvm-net-do-not-use-atexit-for-cleanup.patch [bz#1354090]
- Resolves: bz#1354090
  (Boot guest with vhostuser server mode, QEMU prompt 'Segmentation fault' after executing '(qemu)system_powerdown')

* Fri Jul 08 2016 Miroslav Rezanina <mrezanin(a)redhat.com> - rhev-2.3.0-31.el7_2.18
- kvm-vhost-user-disable-chardev-handlers-on-close.patch [bz#1351892]
- kvm-char-clean-up-remaining-chardevs-when-leaving.patch [bz#1351892]
- kvm-sockets-add-helpers-for-creating-SocketAddress-from-.patch [bz#1351892]
- kvm-socket-unlink-unix-socket-on-remove.patch [bz#1351892]
- kvm-char-do-not-use-atexit-cleanup-handler.patch [bz#1351892]
- Resolves: bz#1351892
  (vhost-user: A socket file is not deleted after VM's port is detached.)

* Tue Jun 28 2016 Miroslav Rezanina <mrezanin(a)redhat.com> - rhev-2.3.0-31.el7_2.17
- kvm-vhost-user-set-link-down-when-the-char-device-is-clo.patch [bz#1348593]
- kvm-vhost-user-fix-use-after-free.patch [bz#1348593]
- kvm-vhost-user-test-fix-up-rhel6-build.patch [bz#1348593]
- kvm-vhost-user-test-fix-migration-overlap-test.patch [bz#1348593]
- kvm-vhost-user-test-fix-chardriver-race.patch [bz#1348593]
- kvm-vhost-user-test-use-unix-port-for-migration.patch [bz#1348593]
- kvm-vhost-user-test-fix-crash-with-glib-2.36.patch [bz#1348593]
- kvm-vhost-user-test-use-correct-ROM-to-speed-up-and-avoi.patch [bz#1348593]
- kvm-tests-append-i386-tests.patch [bz#1348593]
- kvm-vhost-user-add-ability-to-know-vhost-user-backend-di.patch [bz#1348593]
- kvm-qemu-char-add-qemu_chr_disconnect-to-close-a-fd-acce.patch [bz#1348593]
- kvm-vhost-user-disconnect-on-start-failure.patch [bz#1348593]
- kvm-vhost-net-do-not-crash-if-backend-is-not-present.patch [bz#1348593]
- kvm-vhost-net-save-restore-vhost-user-acked-features.patch [bz#1348593]
- kvm-vhost-net-save-restore-vring-enable-state.patch [bz#1348593]
- kvm-test-start-vhost-user-reconnect-test.patch [bz#1348593]
- Resolves: bz#1348593
  (No recovery after vhost-user process restart)
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
Bundled jar files in backend subpackage
by Sandro Bonazzola
Hi,
looking at

$ LC_ALL=C rpm -qlvp \
    http://resources.ovirt.org/pub/ovirt-master-snapshot/rpm/fc24/noarch/ovir... \
    | grep jar | grep -v ^l | grep common

I see:
-rw-r--r-- 1 root root 77761 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/com/netflix/config/main/archaius-core.jar
-rw-r--r-- 1 root root 16442 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/com/netflix/hystrix/contrib/main/hystrix-metrics-event-stream.jar
-rw-r--r-- 1 root root 290223 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/com/netflix/hystrix/main/hystrix-core.jar
-rw-r--r-- 1 root root 738300 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/io/reactivex/rxjava/main/rxjava.jar
-rw-r--r-- 1 root root 33218 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/avalon/framework/main/avalon-framework-api.jar
-rw-r--r-- 1 root root 61021 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/avalon/framework/main/avalon-framework-impl.jar
-rw-r--r-- 1 root root 401858 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-awt-util.jar
-rw-r--r-- 1 root root 558892 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-bridge.jar
-rw-r--r-- 1 root root 310919 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-css.jar
-rw-r--r-- 1 root root 10257 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-ext.jar
-rw-r--r-- 1 root root 67900 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-extension.jar
-rw-r--r-- 1 root root 242866 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-gvt.jar
-rw-r--r-- 1 root root 601098 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-svg-dom.jar
-rw-r--r-- 1 root root 121997 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-transcoder.jar
-rw-r--r-- 1 root root 128286 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/batik/main/batik-util.jar
-rw-r--r-- 1 root root 569113 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/commons/main/xmlgraphics-commons.jar
-rw-r--r-- 1 root root 3079811 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/apache/xmlgraphics/fop/main/fop.jar
-rw-r--r-- 1 root root 6071 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/ovirt/engine/api/metamodel-server/main/metamodel-server.jar
-rw-r--r-- 1 root root 8225 Aug 18 00:20
/usr/share/ovirt-engine/modules/common/org/ovirt/engine/core/auth-plugin/main/auth-plugin.jar
-rw-r--r-- 1 root root 4012 Aug 18 00:22
/usr/share/ovirt-engine/modules/common/org/ovirt/engine/core/logger/main/logger.jar
-rw-r--r-- 1 root root 370051 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-aop.jar
-rw-r--r-- 1 root root 731512 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-beans.jar
-rw-r--r-- 1 root root 1097552 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-context.jar
-rw-r--r-- 1 root root 1078737 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-core.jar
-rw-r--r-- 1 root root 262990 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-expression.jar
-rw-r--r-- 1 root root 7243 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-instrument.jar
-rw-r--r-- 1 root root 423369 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-jdbc.jar
-rw-r--r-- 1 root root 265523 Aug 18 00:19
/usr/share/ovirt-engine/modules/common/org/springframework/main/spring-tx.jar
Is there a chance we can replace (some of) the above bundled jar files with
symlinks to the system-provided jars, as we did for the other ones (at least
on fc24)?
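The symlink approach mentioned above boils down to replacing the bundled copy inside the JBoss module tree with a link to the distro jar. A sketch, demonstrated in a scratch directory because the real paths live under /usr/share as listed above (all paths here are illustrative, not the actual packaging change):

```shell
# Demo of swapping a bundled jar for a symlink, in a scratch directory.
# Real packaging would target /usr/share/ovirt-engine/modules/... and
# /usr/share/java/...; these paths are stand-ins.
workdir="${TMPDIR:-/tmp}/jar-symlink-demo"
rm -rf "$workdir"
mkdir -p "$workdir/system" "$workdir/modules/org/springframework/main"

touch "$workdir/system/spring-core.jar"          # stand-in for the Fedora jar
module_jar="$workdir/modules/org/springframework/main/spring-core.jar"
cp "$workdir/system/spring-core.jar" "$module_jar"   # the bundled copy

# Replace the bundled copy with a symlink to the system-provided jar:
ln -sf "$workdir/system/spring-core.jar" "$module_jar"
ls -l "$module_jar"
```

The module.xml resource-root keeps pointing at the same filename, so the JBoss module loads unchanged; the prerequisite is that the Fedora jar is ABI-compatible with the bundled version.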
--
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
Engine 4.0.3 stable branch created
by Tal Nisan
Hi everyone.
A branch for 4.0.3 was created today (ovirt-engine-4.0.3), please push your
4.0.3 patches to that branch.
The branch was created from the 4.0.2 branch so if you had 4.0.3 patches
merged today to the 4.0.2 branch they made it in, namely:
4035211563b526d49f2388e81cc8933df50a7ab5 packaging: setup: Enroll missing
pki on upgrade
270b87be8897ea948adde632416c122e19bb009c packaging: spec: Require
ovirt-imageio-proxy-setup
f2a688a05bb64a77bfcdbfeae14849e02faa552f engine: Set default switchType if
none is received from vdsm