Re: [ovirt-users] oVirt HA behavior
by Alex K
Enabling VM leases could be an answer to this. Will test tomorrow.
Thanx,
Alex
On Sep 18, 2017 7:50 PM, "Alex K" <rightkicktech(a)gmail.com> wrote:
Hi All,
I have the following issue with the HA behavior of oVirt 4.1 and need to
check with you whether there is any workaround from your experience.
I have 3 servers (A, B, C) with the hosted engine in a self-hosted setup on
top of Gluster with replica 3 + 1 arbiter. All good except one point:
The hosts have been configured with power management using IPMI (server
iLO).
If I disconnect power from one host (say C), or disconnect all of its
network cables, the two other hosts go into a loop where they try to
verify the status of host C by issuing power management commands to it.
Since the host's power is off, the server iLO does not respond on the
network, so power management of host C fails, leaving the VMs that were
running on host C in an unknown state; they are never restarted on the
other hosts.
Is there any fencing option to change this behavior, so that if both
available hosts fail to perform power management of the unresponsive host,
they decide that the host is down and restart its VMs on the other
available hosts?
I could also add additional power management through a UPS to avoid this
issue, but that is not currently an option, and I am interested in seeing
whether this behavior can be tweaked.
Thanx,
Alex
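A minimal sketch of enabling a VM lease through the Python SDK (ovirtsdk4), the feature mentioned in the reply above. With a lease on shared storage, the engine can prove via sanlock that the VM is no longer running and restart it even when fencing the dead host fails. Engine URL, credentials, VM name, and storage domain UUID below are placeholders:

import ovirtsdk4 as sdk
import ovirtsdk4.types as types

# Placeholder engine connection details.
connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',
    username='admin@internal',
    password='password',
    ca_file='ca.pem',
)
vms_service = connection.system_service().vms_service()
vm = vms_service.list(search='name=myvm')[0]

# Attach a lease on a storage domain; expiry of this sanlock lease is what
# lets the engine restart the VM without confirming host death via fencing.
vms_service.vm_service(vm.id).update(
    types.Vm(
        lease=types.StorageDomainLease(
            storage_domain=types.StorageDomain(id='STORAGE-DOMAIN-UUID'),
        ),
    ),
)
connection.close()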
7 years, 3 months
Upgrade 4 --> 4.1 OVirt PKI problem
by Lionel Caignec
Hi,
I've just upgraded to oVirt 4.1, but with some problems. In short, oVirt
regenerated the whole PKI infrastructure, and now I have two problems:
1) If I start oVirt just after the install, I can log in to the GUI, but
all my hosts are unavailable with an SSL communication problem.
2) If I restore my old "/etc/pki/ovirt-engine", my hosts seem to
communicate (engine.log), but I can't connect to the GUI.
I get this warning: "sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target".
Moreover, I have production VMs running on my hosts that I cannot power
off. Does anyone have an idea?
Thank you.
Lionel
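One way to see which chain the engine is actually serving, and whether it matches the restored /etc/pki/ovirt-engine, is to fetch the CA from the engine's pki-resource URL and then verify the GUI endpoint against it. A minimal sketch, assuming Python 2.7.9+ (for the ssl context arguments) and a placeholder engine hostname:

import ssl
import urllib2

ENGINE = 'engine.example.com'  # placeholder
CA_URL = ('https://%s/ovirt-engine/services/pki-resource'
          '?resource=ca-certificate&format=X509-PEM-CA' % ENGINE)

# Fetch the CA itself without verification (we are bootstrapping trust).
ca_pem = urllib2.urlopen(CA_URL,
                         context=ssl._create_unverified_context()).read()
with open('/tmp/engine-ca.pem', 'w') as f:
    f.write(ca_pem)

# Verify the GUI endpoint against that CA; a failure here is the same
# "unable to find valid certification path" problem the browser reports.
ctx = ssl.create_default_context(cafile='/tmp/engine-ca.pem')
try:
    urllib2.urlopen('https://%s/' % ENGINE, context=ctx)
    print('certificate chain OK')
except Exception as e:
    print('verification failed: %s' % e)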
7 years, 3 months
how to remove or stop a stuck "transferring via API" in disk tab
by Nathanaël Blanchet
After trying the upload_disk.py API script, some aborted tests still appear
in the webadmin in a transferring state. I get many of these event messages:
VDSM gaua1.v100.abes.fr command ExtendImageTicketVDS failed: Image
daemon request failed: u'status=404, code=404, title=Not Found,
explanation=The resource could not be found., detail=No such ticket:
147ddf0d-7d25-4bd2-bb4a-91d945872bca, reason=Not Found'
How to solve this?
--
Nathanaël Blanchet
Supervision réseau
Pôle Infrastructures Informatiques
227 avenue Professeur-Jean-Louis-Viala
34193 MONTPELLIER CEDEX 5
Tél. 33 (0)4 67 54 84 55
Fax 33 (0)4 67 54 84 14
blanchet(a)abes.fr
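A minimal sketch of listing the dangling transfers through the v4 API (ovirtsdk4, placeholder credentials). Pausing a stuck transfer is one way to nudge the engine's cleanup; entries that still will not go away may additionally need the engine-side unlock_entity.sh dbutils script:

import ovirtsdk4 as sdk

connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',  # placeholder
    username='admin@internal',
    password='password',
    ca_file='ca.pem',
)
transfers_service = connection.system_service().image_transfers_service()
for transfer in transfers_service.list():
    print(transfer.id, transfer.phase)
    # Uncomment to pause a transfer stuck in a transferring phase; the
    # engine eventually tears down idle/paused transfers.
    # transfers_service.image_transfer_service(transfer.id).pause()
connection.close()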
7 years, 3 months
AcquireHostIdFailure and code 661
by Neil
Hi guys,
Could someone please shed some light on this issue I'm facing?
I'm trying to add a new NFS storage domain, but when I try to add it I get
a message saying "Acquire hostID failed" and it fails to add.
I can mount the NFS share manually, and I can see that once the attach has
failed the NFS share is still mounted on the hosts, as per the
following...
172.16.0.11:/raid1/data/_NAS_NFS_Exports_/STOR2 on
/rhev/data-center/mnt/172.16.0.11:_raid1_data___NAS__NFS__Exports___STOR2
type nfs
(rw,soft,nosharecache,timeo=600,retrans=6,nfsvers=3,addr=172.16.0.11)
Also, looking at the folders on the NFS share, I can see that some data has
been written, so it's not a permissions issue...
drwx---r-x+ 4 vdsm kvm 4096 Sep 11 16:08
16ab135b-0362-4d7e-bb11-edf5b93535d5
-rwx---rwx. 1 vdsm kvm 0 Sep 11 16:08 __DIRECT_IO_TEST__
I have just upgraded from 3.3 to 3.5, and upgraded my 3 hosts as well, in
the hope that this is a known bug, but I'm still encountering the same
problem.
It's not a hosted engine, and you might see in the logs that I have a
storage domain that is out of space, which I'm aware of; I'm hoping the
system using this space will be decommissioned in 2 days....
Filesystem Size Used Avail Use% Mounted on
/dev/sda2 420G 2.2G 413G 1% /
tmpfs 48G 0 48G 0% /dev/shm
172.16.0.10:/raid0/data/_NAS_NFS_Exports_/RAID1_1TB
915G 915G 424M 100%
/rhev/data-center/mnt/172.16.0.10:
_raid0_data___NAS__NFS__Exports___RAID1__1TB
172.16.0.10:/raid0/data/_NAS_NFS_Exports_/STORAGE1
5.5T 3.7T 1.8T 67%
/rhev/data-center/mnt/172.16.0.10:_raid0_data___NAS__NFS__Exports___STORAGE1
172.16.0.20:/data/ov-export
3.6T 2.3T 1.3T 65%
/rhev/data-center/mnt/172.16.0.20:_data_ov-export
172.16.0.11:/raid1/data/_NAS_NFS_Exports_/4TB
3.6T 2.0T 1.6T 56%
/rhev/data-center/mnt/172.16.0.11:_raid1_data___NAS__NFS__Exports___4TB
172.16.0.253:/var/lib/exports/iso
193G 42G 141G 23%
/rhev/data-center/mnt/172.16.0.253:_var_lib_exports_iso
172.16.0.11:/raid1/data/_NAS_NFS_Exports_/STOR2
5.5T 3.7G 5.5T 1%
/rhev/data-center/mnt/172.16.0.11:_raid1_data___NAS__NFS__Exports___STOR2
The "STOR2" above is left mounted after attempting to add the new NFS
storage domain.
Engine details:
Fedora release 19 (Schrödinger’s Cat)
ovirt-engine-dbscripts-3.5.0.1-1.fc19.noarch
ovirt-release34-1.0.3-1.noarch
ovirt-image-uploader-3.5.0-1.fc19.noarch
ovirt-engine-websocket-proxy-3.5.0.1-1.fc19.noarch
ovirt-log-collector-3.5.0-1.fc19.noarch
ovirt-release35-006-1.noarch
ovirt-engine-setup-3.5.0.1-1.fc19.noarch
ovirt-release33-1.0.0-0.1.master.noarch
ovirt-engine-tools-3.5.0.1-1.fc19.noarch
ovirt-engine-lib-3.5.0.1-1.fc19.noarch
ovirt-engine-sdk-python-3.5.0.8-1.fc19.noarch
ovirt-host-deploy-java-1.3.0-1.fc19.noarch
ovirt-engine-backend-3.5.0.1-1.fc19.noarch
sos-3.1-1.1.fc19.ovirt.noarch
ovirt-engine-setup-base-3.5.0.1-1.fc19.noarch
ovirt-engine-extensions-api-impl-3.5.0.1-1.fc19.noarch
ovirt-engine-webadmin-portal-3.5.0.1-1.fc19.noarch
ovirt-engine-setup-plugin-ovirt-engine-3.5.0.1-1.fc19.noarch
ovirt-iso-uploader-3.5.0-1.fc19.noarch
ovirt-host-deploy-1.3.0-1.fc19.noarch
ovirt-engine-setup-plugin-ovirt-engine-common-3.5.0.1-1.fc19.noarch
ovirt-engine-3.5.0.1-1.fc19.noarch
ovirt-engine-setup-plugin-websocket-proxy-3.5.0.1-1.fc19.noarch
ovirt-engine-userportal-3.5.0.1-1.fc19.noarch
ovirt-engine-cli-3.5.0.5-1.fc19.noarch
ovirt-engine-restapi-3.5.0.1-1.fc19.noarch
libvirt-daemon-driver-nwfilter-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-qemu-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-libxl-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-secret-1.1.3.2-1.fc19.x86_64
libvirt-daemon-config-network-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-storage-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-network-1.1.3.2-1.fc19.x86_64
libvirt-1.1.3.2-1.fc19.x86_64
libvirt-daemon-kvm-1.1.3.2-1.fc19.x86_64
libvirt-client-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-nodedev-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-uml-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-xen-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-interface-1.1.3.2-1.fc19.x86_64
libvirt-daemon-config-nwfilter-1.1.3.2-1.fc19.x86_64
libvirt-daemon-1.1.3.2-1.fc19.x86_64
libvirt-daemon-qemu-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-vbox-1.1.3.2-1.fc19.x86_64
libvirt-daemon-driver-lxc-1.1.3.2-1.fc19.x86_64
qemu-system-lm32-1.4.2-15.fc19.x86_64
qemu-system-s390x-1.4.2-15.fc19.x86_64
libvirt-daemon-driver-qemu-1.1.3.2-1.fc19.x86_64
qemu-system-ppc-1.4.2-15.fc19.x86_64
qemu-user-1.4.2-15.fc19.x86_64
qemu-system-x86-1.4.2-15.fc19.x86_64
qemu-system-unicore32-1.4.2-15.fc19.x86_64
qemu-system-mips-1.4.2-15.fc19.x86_64
qemu-system-or32-1.4.2-15.fc19.x86_64
qemu-system-m68k-1.4.2-15.fc19.x86_64
qemu-img-1.4.2-15.fc19.x86_64
qemu-kvm-1.4.2-15.fc19.x86_64
qemu-system-xtensa-1.4.2-15.fc19.x86_64
qemu-1.4.2-15.fc19.x86_64
qemu-system-microblaze-1.4.2-15.fc19.x86_64
qemu-system-alpha-1.4.2-15.fc19.x86_64
libvirt-daemon-qemu-1.1.3.2-1.fc19.x86_64
qemu-system-arm-1.4.2-15.fc19.x86_64
qemu-common-1.4.2-15.fc19.x86_64
ipxe-roms-qemu-20130517-2.gitc4bce43.fc19.noarch
qemu-system-sh4-1.4.2-15.fc19.x86_64
qemu-system-cris-1.4.2-15.fc19.x86_64
qemu-system-sparc-1.4.2-15.fc19.x86_64
libvirt-daemon-kvm-1.1.3.2-1.fc19.x86_64
qemu-kvm-1.4.2-15.fc19.x86_64
Host Details:
CentOS release 6.9 (Final)
vdsm-yajsonrpc-4.16.30-0.el6.noarch
vdsm-python-4.16.30-0.el6.noarch
vdsm-4.16.30-0.el6.x86_64
vdsm-cli-4.16.30-0.el6.noarch
vdsm-jsonrpc-4.16.30-0.el6.noarch
vdsm-python-zombiereaper-4.16.30-0.el6.noarch
vdsm-xmlrpc-4.16.30-0.el6.noarch
srvadmin-itunnelprovider-7.4.0-4.14.1.el6.x86_64
ovirt-release34-1.0.3-1.noarch
ovirt-release33-1.0.0-0.1.master.noarch
ovirt-release35-006-1.noarch
qemu-kvm-rhev-tools-0.12.1.2-2.479.el6_7.2.x86_64
qemu-kvm-rhev-0.12.1.2-2.479.el6_7.2.x86_64
[root@ovhost3 ~]# rpm -qa | grep -i qemu
qemu-kvm-rhev-tools-0.12.1.2-2.479.el6_7.2.x86_64
qemu-img-rhev-0.12.1.2-2.479.el6_7.2.x86_64
gpxe-roms-qemu-0.9.7-6.16.el6.noarch
qemu-kvm-rhev-0.12.1.2-2.479.el6_7.2.x86_64
libvirt-lock-sanlock-0.10.2-62.el6.x86_64
libvirt-client-0.10.2-62.el6.x86_64
libvirt-0.10.2-62.el6.x86_64
libvirt-python-0.10.2-62.el6.x86_64
I have tried renaming the NFS share, as well as unmounting it manually with
the -l option (because it says it's busy when unmounting it from the hosts
after deleting it from my DC), and I've restarted all hosts after upgrading
too.
Google reveals lots of similar problems, but none of the options tried seem
to work. I have recently tried enabling SELinux as well, because I had it
disabled on the hosts and engine.
Any assistance is appreciated.
Thank you.
Regards.
Neil Wilson.
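Since "Acquire hostID failed" comes from sanlock taking the host id on the domain's dom_md/ids file, one cheap check is whether that file was created on the new mount and is owned by vdsm:kvm (uid/gid 36 on oVirt hosts). A minimal stdlib sketch against the STOR2 mount path shown above:

import os
import stat

# Mount path as shown in the df output above.
MOUNT = ('/rhev/data-center/mnt/'
         '172.16.0.11:_raid1_data___NAS__NFS__Exports___STOR2')

for root, dirs, files in os.walk(MOUNT):
    for name in files:
        if name == 'ids':
            path = os.path.join(root, name)
            st = os.stat(path)
            print(path)
            print('  owner uid:gid = %d:%d (expect 36:36)'
                  % (st.st_uid, st.st_gid))
            print('  mode = %s' % oct(stat.S_IMODE(st.st_mode)))
            print('  size = %d bytes' % st.st_size)

If ownership and mode look right, /var/log/sanlock.log on the host usually names the exact error behind the failed lockspace add.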
7 years, 3 months
Excluding Hosts from 'use any host' assignment
by Mark Steele
Hello,
We have recently added two new hosts to our oVirt cluster, and I would like
to prevent any virtual machine from starting on them unless the VM is
configured to start specifically on one of those two hosts.
I am not clear whether this can be accomplished from the web interface and
am looking for guidance.
Best regards,
***
*Mark Steele*
CIO / VP Technical Operations | TelVue Corporation
TelVue - We Share Your Vision
16000 Horizon Way, Suite 100 | Mt. Laurel, NJ 08054
800.885.8886 x128 | msteele(a)telvue.com | http://www.telvue.com
twitter: http://twitter.com/telvue | facebook:
https://www.facebook.com/telvue
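In the webadmin this is a per-VM setting: Edit VM -> Host -> "Start Running On: Specific Host(s)", optionally with the migration mode pinned; depending on the version, VM-to-host affinity groups are an alternative. The same pinning scripted, as a minimal sketch with the v4 Python SDK (ovirtsdk4); names and credentials are placeholders:

import ovirtsdk4 as sdk
import ovirtsdk4.types as types

connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',
    username='admin@internal',
    password='password',
    ca_file='ca.pem',
)
vms_service = connection.system_service().vms_service()
vm = vms_service.list(search='name=myvm')[0]

# Restrict scheduling of this VM to the two new hosts; PINNED also stops
# the scheduler from live-migrating it elsewhere.
vms_service.vm_service(vm.id).update(
    types.Vm(
        placement_policy=types.VmPlacementPolicy(
            affinity=types.VmAffinity.PINNED,
            hosts=[types.Host(name='newhost1'), types.Host(name='newhost2')],
        ),
    ),
)
connection.close()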
7 years, 3 months
ovirt upgrade problem
by gabriel_skupien@o2.pl
Hi,
While trying to upgrade my oVirt node from 4.1.5 to 4.1.6 I get the errors
below. Can you help?
. . .
2017-09-18 09:25:31 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Download/Verify: ovirt-node-ng-image-update-4.1...
2017-09-18 09:25:35 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Status: Check Package Signatures
2017-09-18 09:25:35 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Status: Running Test Transaction
2017-09-18 09:25:35 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Status: Running Transaction
2017-09-18 09:25:35 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum obsoleting: 1/3: ovirt-node-ng-image-update-4.1...
2017-09-18 09:32:11 DEBUG otopi.plugins.otopi.packagers.yumpackager.verbose:76 Yum Script sink: warning: %post(ovirt-node-ng-image-upda... scriptlet failed, exit status 1
2017-09-18 09:32:11 ERROR otopi.plugins.otopi.packagers.yumpackager.error:85 Yum Non-fatal POSTIN scriptlet failure in rpm package ovirt-node-ng-image-update-4.1...
2017-09-18 09:32:11 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum erase: 2/3: ovirt-node-ng-image-update-pla...
2017-09-18 09:32:11 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum updated: 3/3: ovirt-node-ng-image-update
2017-09-18 09:32:12 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Verify: 1/3: ovirt-node-ng-image-update.noa... 0:4.1.6-0.2.rc2.20170830082757... - u
2017-09-18 09:32:12 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Verify: 2/3: ovirt-node-ng-image-update-pla... 0:4.1.5-1.el7.centos - od
2017-09-18 09:32:12 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Verify: 3/3: ovirt-node-ng-image-update.noa... 0:4.1.5-1.el7.centos - ud
2017-09-18 09:32:12 DEBUG otopi.context context._executeMethod:142 method exception
Traceback (most recent call last):
  File "/tmp/ovirt-kRPMlHbiO5/pythonl...", line 132, in _executeMethod
    method['method']()
  File "/tmp/ovirt-kRPMlHbiO5/otopi-p...", line 261, in _packages
    self._miniyum.processTransacti...
  File "/tmp/ovirt-kRPMlHbiO5/pythonl...", line 1049, in processTransaction
    _('One or more elements within Yum transaction failed')
RuntimeError: One or more elements within Yum transaction failed
2017-09-18 09:32:12 ERROR otopi.context context._executeMethod:151 Failed to execute stage 'Package installation': One or more elements within Yum transaction failed
2017-09-18 09:32:12 DEBUG otopi.transaction transaction.abort:119 aborting 'Yum Transaction'
2017-09-18 09:32:12 INFO otopi.plugins.otopi.packagers.yumpackager.info:80 Yum Performing yum transaction rollback
. . .
2017-09-18 09:32:29 ERROR otopi.plugins.otopi.packagers.yumpackager.error:85 Yum Transaction close failed:
Traceback (most recent call last):
  File "/tmp/ovirt-kRPMlHbiO5/pythonl...", line 761, in endTransaction
    if self._yb.history_undo(transact...
  File "/usr/lib/python2.7/site-packa...", line 6086, in history_undo
    if self.install(pkgtup=pkg.pkgtup...
  File "/usr/lib/python2.7/site-packa...", line 4910, in install
    raise Errors.InstallError, _('No package(s) available to install')
InstallError: No package(s) available to install
Loaded plugins: fastestmirror
2017-09-18 09:32:29 DEBUG otopi.context context.dumpEnvironment:760 ENVIRONMENT DUMP - BEGIN
2017-09-18 09:32:29 DEBUG otopi.context context.dumpEnvironment:770 ENV BASE/error=bool:'True'
2017-09-18 09:32:29 DEBUG otopi.context context.dumpEnvironment:770 ENV BASE/exceptionInfo=list:'[(<ty... 'exceptions.RuntimeError'>, RuntimeError('One or more elements within Yum transaction failed',), <traceback object at 0x174fd40>)]'
2017-09-18 09:32:29 DEBUG otopi.context context.dumpEnvironment:770 ENV CORE/logFileHandle=file:'<open file '/tmp/ovirt-host-mgmt-20170918... mode 'a' at 0x7f454b9a9d20>'
. . .
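The failure above is the %post scriptlet of the new ovirt-node-ng-image-update package; the yum transcript only reports "exit status 1", and the real reason lands in the host-deploy log under /tmp (see the CORE/logFileHandle entry in the environment dump). A minimal sketch that pulls the ERROR and scriptlet lines out of those logs:

import glob

# The name pattern is assumed from the truncated handle shown above
# ('/tmp/ovirt-host-mgmt-20170918...').
for path in sorted(glob.glob('/tmp/ovirt-host-mgmt-*')):
    print('== %s ==' % path)
    with open(path) as f:
        for line in f:
            if ' ERROR ' in line or 'scriptlet failed' in line:
                print(line.rstrip())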
7 years, 3 months
can't upload a disk with web UI or API
by Nathanaël Blanchet
When trying to upload a raw/qcow2 image, I get this issue:
Unable to upload image to disk b9099a35-247d-40e5-9f24-b4c39db2dfff due
to a network error. Make sure ovirt-imageio-proxy service is installed
and configured, and ovirt-engine's certificate is registered as a valid
CA in the browser. The certificate can be fetched from
https://<engine_url>/ovirt-engine/services/pki-resource?resource=ca-certificate&format=X509-PEM-CA
The certificate is already imported into the browser, ovirt-imageio-proxy
is started on the engine, and ovirt-imageio-daemon is started on the nodes.
Ports are correctly opened on the proxy and the nodes, and the hosts are
reachable by their FQDNs.
Here are the logs on the proxy:
ovirt-imageio-proxy web ERROR 10.34.10.191 - PUT 500 209 (0.00s)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_common/web.py", line 48, in __call__
    resp = self.dispatch(request)
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_common/web.py", line 73, in dispatch
    return method(*match.groups())
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_proxy/http_helper.py", line 88, in wrapper
    ret = func(self, *args)
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_proxy/http_helper.py", line 59, in wrapper
    ret = func(self, *args)
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_proxy/image_handler.py", line 75, in put
    return self.send_data(self.request)
  File "/usr/lib/python2.7/site-packages/ovirt_imageio_proxy/image_handler.py", line 107, in send_data
    body = web.CappedStream(request.body_file, max_transfer_bytes)
AttributeError: 'module' object has no attribute 'CappedStream'
Thank you for your help.
--
Nathanaël Blanchet
Supervision réseau
Pôle Infrastructures Informatiques
227 avenue Professeur-Jean-Louis-Viala
34193 MONTPELLIER CEDEX 5
Tél. 33 (0)4 67 54 84 55
Fax 33 (0)4 67 54 84 14
blanchet(a)abes.fr
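That AttributeError usually means ovirt-imageio-proxy is newer than the ovirt-imageio-common package it imports: the proxy's image_handler calls web.CappedStream, but the installed common module predates that class. A minimal sketch to confirm the mismatch on the proxy host before aligning the two package versions:

# Run with the same Python the proxy uses (python2 on these hosts).
from ovirt_imageio_common import web

if hasattr(web, 'CappedStream'):
    print('ovirt_imageio_common provides CappedStream; versions look aligned')
else:
    print('CappedStream missing: ovirt-imageio-common is older than '
          'what ovirt-imageio-proxy expects')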
7 years, 3 months
Current state of infiniband support in ovirt?
by Jeff Wiegley
I'm looking at creating a scalable HA cluster and have been looking at
oVirt for the VM management side. (Proxmox/VMware are essentially licensed
products and I'm at a university with no money; OpenStack seemed overkill,
and I don't need random users managing VM provisioning à la AWS.)
I need central HA backend storage, and I'm interested in using InfiniBand
because it's very fast (40Gb) and switches and adapters are cheap to
obtain.
However, I was wondering whether oVirt is capable of using InfiniBand in a
no-IP SAN configuration. (I've seen that InfiniBand / IP over InfiniBand /
NFS is possible, but I would rather use SAN instead of NAS and also avoid
the IP overhead in the long run.)
What is the current state of using raw InfiniBand to provide SAN storage
for oVirt-based installations?
Thank you for your expertise,
Jeff W.
7 years, 3 months
Server Not Responding
by Bryan Sockel
Hi
I'm having an issue where a server is frequently set to non-responsive.
Its VMs are set to unknown status but continue to run. The issue is
isolated to a single host. My setup is currently a two-data-center
configuration with 2 servers in each data center; the issue is occurring
at my remote site.
The primary storage volumes are set up on dedicated hardware, with the
arbiter running on the server that is having issues. There is also another
Gluster replica volume hosted on this box; the replica is the other
dedicated server.
The logs are showing:
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetCapabilitiesVDSCommand]
(DefaultQuartzScheduler8) [] Command 'GetCapabilitiesVDSCommand(HostName =
vm-host-colo-1, VdsIdAndVdsVDSCommandParametersBase:{runAsync='true',
hostId='e75d4446-9bfc-47cb-8bf8-a2e681720b66',
vds='Host[vm-host-colo-1,e75d4446-9bfc-47cb-8bf8-a2e681720b66]'})' execution
failed: java.rmi.ConnectException: Connection timeout
[org.ovirt.engine.core.vdsbroker.monitoring.HostMonitoring]
(DefaultQuartzScheduler8) [] Failure to refresh host 'vm-host-colo-1'
runtime info: java.rmi.ConnectException: Connection timeout.
I have attached the vdsm.log from the server with issues and the engine.log.
Thanks
Bryan Sockel
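Since GetCapabilitiesVDSCommand rides on the engine's TLS connection to VDSM (port 54321 by default), a handshake test from the engine machine helps split "Connection timeout" into a network problem versus a wedged vdsmd. A minimal Python 3 sketch using the hostname from the log above:

import socket
import ssl

HOST = 'vm-host-colo-1'  # host name from the engine.log above
PORT = 54321             # default VDSM port

sock = socket.create_connection((HOST, PORT), timeout=10)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # connectivity test only, no verification
try:
    tls = ctx.wrap_socket(sock)
    print('TLS handshake with %s:%d OK (%s)' % (HOST, PORT, tls.version()))
finally:
    sock.close()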
7 years, 3 months
Fwd: [CentOS-devel] LinchPin v1.0.3 (bugfix) has been released
by Sandro Bonazzola
This new version supports oVirt topologies and VMs:
http://linchpin.readthedocs.io/en/develop/topologies_ovirt.html?highlight...
---------- Forwarded message ----------
From: Clint Savage <herlo(a)redhat.com>
Date: 2017-09-12 22:12 GMT+02:00
Subject: [CentOS-devel] LinchPin v1.0.3 (bugfix) has been released
To: Linchpin Mailing List <linchpin(a)redhat.com>, continous-infra <
continuous-infra(a)redhat.com>, "Development discussion for the PnT DevOps
Factory 2.0 initiative" <pnt-factory2-devel(a)redhat.com>, ci-ops-central <
ci-ops-central(a)redhat.com>, ci-infra-list(a)redhat.com, qe-dept-list <
qe-dept-list(a)redhat.com>, CentOS Devel <centos-devel(a)centos.org>
Hi all,
We are happy to announce that LinchPin v1.0.3 has been released. This is a
bugfix release, which has the following updates:
* remove gce2 module and use the standard module from ansible
* requirements.txt to use Ansible >= 2.3.1 (this means we can package it
for Fedora)
* remove other requirements from requirements.txt that are not necessary
* improved documentation around clouds.yaml and --creds-path
* CLI exits with proper exit codes (with tests to verify)
* linchpin fetch (download local/remote workspaces)
* flake8 testing
* ovirt support with docs!
* linchpin.conf supports overriding specific sections from previous config
* os_server module updated to handle multiple provisioning properly
* remove cruft from api framework
* upgrade setuptools and pip in tests
* move lp_init into api and cli to work together
* workspace set and get methods
* other miscellany
The official release notes are available at https://github.com/CentOS-
PaaS-SIG/linchpin/releases/tag/v1.0.3
This update is available via PyPi - https://pypi.python.org/pypi/linchpin
If you discover any errors or regressions, please open a Github issue (
https://github.com/CentOS-PaaS-SIG/linchpin/issues).
Cheers and enjoy!
Clint Savage
LinchPin Maintainer
Senior Software Engineer, Red Hat
twitter: @herlo, github: herlo, IRC: herlo, #linchpin
_______________________________________________
CentOS-devel mailing list
CentOS-devel(a)centos.org
https://lists.centos.org/mailman/listinfo/centos-devel
--
SANDRO BONAZZOLA
ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D
Red Hat EMEA <https://www.redhat.com/>
<https://red.ht/sig>
TRIED. TESTED. TRUSTED. <https://redhat.com/trusted>
7 years, 3 months