Hi Adam,

Thanks for looking! The storage is fibre-attached, and I've verified with the SAN folks that nothing went wonky on their side during this window.

Here is what I've got from vdsm.log during the window, plus a bit surrounding it for context.
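(For reference, roughly this filter pulls the same set of lines, assuming the default log path and that the day's log hasn't rotated yet; I trimmed a few by hand beyond that:)

  grep -E '::(WARNING|ERROR)::2017-02-16' /var/log/vdsm/vdsm.log
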
libvirtEventLoop::WARNING::2017-02-16 08:35:17,435::utils::140::root::(rmFile) File: /var/lib/libvirt/qemu/channels/ba806b93-b6fe-4873-99ec-55bb34c12e5f.com.redhat.rhevm.vdsm already removed
libvirtEventLoop::WARNING::2017-02-16 08:35:17,435::utils::140::root::(rmFile) File: /var/lib/libvirt/qemu/channels/ba806b93-b6fe-4873-99ec-55bb34c12e5f.org.qemu.guest_agent.0 already removed
periodic/2::WARNING::2017-02-16 08:35:18,144::periodic::295::virt.vm::(__call__) vmId=`ba806b93-b6fe-4873-99ec-55bb34c12e5f`::could not run on ba806b93-b6fe-4873-99ec-55bb34c12e5f: domain not connected
periodic/3::WARNING::2017-02-16 08:35:18,305::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
Thread-23021::ERROR::2017-02-16 09:28:33,096::task::866::Storage.TaskManager.Task::(_setError) Task=`ecab8086-261f-44b9-8123-eefb9bbf5b05`::Unexpected error
Thread-23021::ERROR::2017-02-16 09:28:33,097::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Storage domain is member of pool: 'domain=81f19871-4d91-4698-a97d-36452bfae281'", 'code': 900}}
Thread-23783::ERROR::2017-02-16 10:13:32,876::task::866::Storage.TaskManager.Task::(_setError) Task=`ff628204-6e41-4e5e-b83a-dad6ec94d0d3`::Unexpected error
Thread-23783::ERROR::2017-02-16 10:13:32,877::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Storage domain is member of pool: 'domain=81f19871-4d91-4698-a97d-36452bfae281'", 'code': 900}}
Thread-24542::ERROR::2017-02-16 10:58:32,578::task::866::Storage.TaskManager.Task::(_setError) Task=`f5111200-e980-46bb-bbc3-898ae312d556`::Unexpected error
Thread-24542::ERROR::2017-02-16 10:58:32,579::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Storage domain is member of pool: 'domain=81f19871-4d91-4698-a97d-36452bfae281'", 'code': 900}}
jsonrpc.Executor/4::ERROR::2017-02-16 11:28:24,049::sdc::139::Storage.StorageDomainCache::(_findDomain) looking for unfetched domain 13127103-3f59-418a-90f1-5b1ade8526b1
jsonrpc.Executor/4::ERROR::2017-02-16 11:28:24,049::sdc::156::Storage.StorageDomainCache::(_findUnfetchedDomain) looking for domain 13127103-3f59-418a-90f1-5b1ade8526b1
jsonrpc.Executor/4::ERROR::2017-02-16 11:28:24,305::sdc::145::Storage.StorageDomainCache::(_findDomain) domain 13127103-3f59-418a-90f1-5b1ade8526b1 not found
6e31bf97-458c-4a30-9df5-14f475db3339::ERROR::2017-02-16 11:29:19,402::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
6e31bf97-458c-4a30-9df5-14f475db3339::ERROR::2017-02-16 11:29:19,403::task::866::Storage.TaskManager.Task::(_setError) Task=`6e31bf97-458c-4a30-9df5-14f475db3339`::Unexpected error
79ed31a2-5ac7-4304-ab4d-d05f72694860::ERROR::2017-02-16 11:29:20,649::image::205::Storage.Image::(getChain) There is no leaf in the image b4c4b53e-3813-4959-a145-16f1dfcf1838
79ed31a2-5ac7-4304-ab4d-d05f72694860::ERROR::2017-02-16 11:29:20,650::task::866::Storage.TaskManager.Task::(_setError) Task=`79ed31a2-5ac7-4304-ab4d-d05f72694860`::Unexpected error
jsonrpc.Executor/5::ERROR::2017-02-16 11:30:17,063::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
jsonrpc.Executor/5::ERROR::2017-02-16 11:30:17,064::task::866::Storage.TaskManager.Task::(_setError) Task=`62f20e22-e850-44c8-8943-faa4ce71e973`::Unexpected error
jsonrpc.Executor/5::ERROR::2017-02-16 11:30:17,065::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Image is not a legal chain: ('e17ebd7c-0763-42b2-b344-5ad7f9cf448e',)", 'code': 262}}
jsonrpc.Executor/4::ERROR::2017-02-16 11:33:18,487::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
jsonrpc.Executor/4::ERROR::2017-02-16 11:33:18,488::task::866::Storage.TaskManager.Task::(_setError) Task=`e4d893f2-7be6-4f84-9ac6-58b5a5d1364e`::Unexpected error
jsonrpc.Executor/4::ERROR::2017-02-16 11:33:18,489::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Image is not a legal chain: ('e17ebd7c-0763-42b2-b344-5ad7f9cf448e',)", 'code': 262}}
3132106a-ce35-4b12-9a72-812e415eff7f::ERROR::2017-02-16 11:34:47,595::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
3132106a-ce35-4b12-9a72-812e415eff7f::ERROR::2017-02-16 11:34:47,596::task::866::Storage.TaskManager.Task::(_setError) Task=`3132106a-ce35-4b12-9a72-812e415eff7f`::Unexpected error
112fb772-a497-4788-829f-190d6d008d95::ERROR::2017-02-16 11:34:48,517::image::205::Storage.Image::(getChain) There is no leaf in the image b4c4b53e-3813-4959-a145-16f1dfcf1838
112fb772-a497-4788-829f-190d6d008d95::ERROR::2017-02-16 11:34:48,517::task::866::Storage.TaskManager.Task::(_setError) Task=`112fb772-a497-4788-829f-190d6d008d95`::Unexpected error
Thread-25336::ERROR::2017-02-16 11:43:32,726::task::866::Storage.TaskManager.Task::(_setError) Task=`fafb120e-e7c6-4d3e-b87a-8116484f1c1a`::Unexpected error
Thread-25336::ERROR::2017-02-16 11:43:32,727::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Storage domain is member of pool: 'domain=81f19871-4d91-4698-a97d-36452bfae281'", 'code': 900}}
jsonrpc.Executor/0::WARNING::2017-02-16 11:54:05,875::momIF::113::MOM::(getStatus) MOM not available.
jsonrpc.Executor/0::WARNING::2017-02-16 11:54:05,877::momIF::76::MOM::(getKsmStats) MOM not available, KSM stats will be missing.
ioprocess communication (10025)::ERROR::2017-02-16 11:54:05,890::__init__::176::IOProcessClient::(_communicate) IOProcess failure
ioprocess communication (10364)::ERROR::2017-02-16 11:54:05,892::__init__::176::IOProcessClient::(_communicate) IOProcess failure
ioprocess communication (23403)::ERROR::2017-02-16 11:54:05,892::__init__::176::IOProcessClient::(_communicate) IOProcess failure
ioprocess communication (31710)::ERROR::2017-02-16 11:54:05,999::__init__::176::IOProcessClient::(_communicate) IOProcess failure
ioprocess communication (31717)::ERROR::2017-02-16 11:54:05,999::__init__::176::IOProcessClient::(_communicate) IOProcess failure
ioprocess communication (31724)::ERROR::2017-02-16 11:54:06,000::__init__::176::IOProcessClient::(_communicate) IOProcess failure
Thread-16::ERROR::2017-02-16 11:54:21,657::monitor::387::Storage.Monitor::(_acquireHostId) Error acquiring host id 2 for domain 81f19871-4d91-4698-a97d-36452bfae281
jsonrpc.Executor/7::ERROR::2017-02-16 11:54:21,885::API::1871::vds::(_getHaInfo) failed to retrieve Hosted Engine HA info
jsonrpc.Executor/0::ERROR::2017-02-16 11:54:21,890::task::866::Storage.TaskManager.Task::(_setError) Task=`73ca0c58-3e86-47e8-80f2-31d97346f0a3`::Unexpected error
jsonrpc.Executor/0::ERROR::2017-02-16 11:54:21,892::dispatcher::79::Storage.Dispatcher::(wrapper) Secured object is not in safe state
Thread-16::ERROR::2017-02-16 11:54:31,673::monitor::387::Storage.Monitor::(_acquireHostId) Error acquiring host id 2 for domain 81f19871-4d91-4698-a97d-36452bfae281
jsonrpc.Executor/4::ERROR::2017-02-16 11:54:34,309::API::1871::vds::(_getHaInfo) failed to retrieve Hosted Engine HA info
jsonrpc.Executor/2::ERROR::2017-02-16 11:57:30,796::API::1871::vds::(_getHaInfo) failed to retrieve Hosted Engine HA info
jsonrpc.Executor/7::ERROR::2017-02-16 11:57:39,847::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
jsonrpc.Executor/7::ERROR::2017-02-16 11:57:39,848::task::866::Storage.TaskManager.Task::(_setError) Task=`e4ae2972-77d4-406a-ac71-b285953b76ae`::Unexpected error
jsonrpc.Executor/7::ERROR::2017-02-16 11:57:39,849::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Image is not a legal chain: ('e17ebd7c-0763-42b2-b344-5ad7f9cf448e',)", 'code': 262}}
jsonrpc.Executor/0::ERROR::2017-02-16 11:57:45,965::API::1871::vds::(_getHaInfo) failed to retrieve Hosted Engine HA info
jsonrpc.Executor/5::ERROR::2017-02-16 13:01:26,274::image::205::Storage.Image::(getChain) There is no leaf in the image e17ebd7c-0763-42b2-b344-5ad7f9cf448e
jsonrpc.Executor/5::ERROR::2017-02-16 13:01:26,275::task::866::Storage.TaskManager.Task::(_setError) Task=`2a214b3a-a50b-425a-ad99-bf5cc6be13ef`::Unexpected error
jsonrpc.Executor/5::ERROR::2017-02-16 13:01:26,276::dispatcher::76::Storage.Dispatcher::(wrapper) {'status': {'message': "Image is not a legal chain: ('e17ebd7c-0763-42b2-b344-5ad7f9cf448e',)", 'code': 262}}
periodic/3::WARNING::2017-02-16 13:13:52,268::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/2::WARNING::2017-02-16 13:50:15,062::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/1::WARNING::2017-02-16 13:51:15,085::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/3::WARNING::2017-02-16 13:51:45,081::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/0::WARNING::2017-02-16 15:21:45,347::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/0::WARNING::2017-02-16 16:21:00,522::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/3::WARNING::2017-02-16 17:49:00,858::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/3::WARNING::2017-02-16 17:50:00,868::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/0::WARNING::2017-02-16 17:51:30,899::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
periodic/0::WARNING::2017-02-16 17:52:30,907::periodic::261::virt.periodic.VmDispatcher::(__call__) could not run <class 'virt.periodic.DriveWatermarkMonitor'> on ['ba806b93-b6fe-4873-99ec-55bb34c12e5f']
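
In case it helps with the 'no leaf' errors: this is a block (FC) domain, and as far as I understand vdsm's layout, each volume LV carries IU_ (image UUID) and PU_ (parent UUID) tags, with the leaf being the one volume that no other volume names as its parent. So something like this, with the VG name being the source domain UUID from the engine log, should show what the chain looks like on disk:

  lvs -o lv_name,lv_tags --noheadings a89a626f-3f6b-452b-840e-ce0fee6f6461 | grep IU_e17ebd7c-0763-42b2-b344-5ad7f9cf448e

Happy to run that (or whatever you'd rather see here) and send the output.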

On 02/20/2017 08:45 AM, Adam Litke wrote:

Hi Pat. I'd like to help you investigate this issue further. Could you send a snippet of the vdsm.log on slam-vmnode-03 that covers the time period during this failure? Engine is reporting that vdsm has likely thrown an exception while acquiring locks associated with the VM disk you are exporting.
<div class="gmail_extra"><br>
<div class="gmail_quote">On Thu, Feb 16, 2017 at 12:40 PM, Pat
Riehecky <span dir="ltr"><<a moz-do-not-send="true"
href="mailto:riehecky@fnal.gov" target="_blank">riehecky@fnal.gov</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">Any
attempts to export my VM error out. Last night the disk
images got 'unregistered' from oVirt and I had to rescan the
storage domain to find them again. Now I'm just trying to
get a backup of the VM.<br>
<br>
The snapshots off of the old disks are still listed, but I
don't know if the lvm slices are still real or if that is
even what is wrong.<br>
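
If it's useful I can also dump what vdsm itself thinks the chain is. If I have the vdsClient verbs right, something like the following should list the image's volumes and their metadata (the pool/domain/image UUIDs are the ones from the log below; <volUUID> is each volume returned by the first call):

  vdsClient -s 0 getVolumesList a89a626f-3f6b-452b-840e-ce0fee6f6461 00000001-0001-0001-0001-0000000001a5 e17ebd7c-0763-42b2-b344-5ad7f9cf448e
  vdsClient -s 0 getVolumeInfo a89a626f-3f6b-452b-840e-ce0fee6f6461 00000001-0001-0001-0001-0000000001a5 e17ebd7c-0763-42b2-b344-5ad7f9cf448e <volUUID>

If I read that output right, a healthy chain should show exactly one volume with voltype LEAF.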

Steps I followed:
  Halt VM
  Click Export
  Leave things unchecked and click OK
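
For completeness, the scripted equivalent of those clicks, in case it behaves any differently from the UI. This is only a minimal sketch against the ovirt-engine-sdk-python 3.6 listed below; the engine URL and credentials are placeholders:

  from ovirtsdk.api import API
  from ovirtsdk.xml import params

  # engine URL and credentials are placeholders
  api = API(url='https://ENGINE_FQDN/ovirt-engine/api',
            username='admin@internal', password='PASSWORD', insecure=True)
  vm = api.vms.get(name='ecf-sat6.fnal.gov')        # the VM being exported
  sd = api.storagedomains.get(name='RITM0524722')   # the export domain
  vm.export(params.Action(storage_domain=sd))       # same as Export + OK
  api.disconnect()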

oVirt version:
ovirt-engine-4.0.3-1.el7.centos.noarch
ovirt-engine-backend-4.0.3-1.el7.centos.noarch
ovirt-engine-cli-3.6.9.2-1.el7.noarch
ovirt-engine-dashboard-1.0.3-1.el7.centos.noarch
ovirt-engine-dbscripts-4.0.3-1.el7.centos.noarch
ovirt-engine-dwh-4.0.2-1.el7.centos.noarch
ovirt-engine-dwh-setup-4.0.2-1.el7.centos.noarch
ovirt-engine-extension-aaa-jdbc-1.1.0-1.el7.noarch
ovirt-engine-extension-aaa-ldap-1.2.1-1.el7.noarch
ovirt-engine-extension-aaa-ldap-setup-1.2.1-1.el7.noarch
ovirt-engine-extensions-api-impl-4.0.3-1.el7.centos.noarch
ovirt-engine-lib-4.0.3-1.el7.centos.noarch
ovirt-engine-restapi-4.0.3-1.el7.centos.noarch
ovirt-engine-sdk-python-3.6.9.1-1.el7.noarch
ovirt-engine-setup-4.0.3-1.el7.centos.noarch
ovirt-engine-setup-base-4.0.3-1.el7.centos.noarch
ovirt-engine-setup-plugin-ovirt-engine-4.0.3-1.el7.centos.noarch
ovirt-engine-setup-plugin-ovirt-engine-common-4.0.3-1.el7.centos.noarch
ovirt-engine-setup-plugin-vmconsole-proxy-helper-4.0.3-1.el7.centos.noarch
ovirt-engine-setup-plugin-websocket-proxy-4.0.3-1.el7.centos.noarch
ovirt-engine-tools-4.0.3-1.el7.centos.noarch
ovirt-engine-tools-backup-4.0.3-1.el7.centos.noarch
ovirt-engine-userportal-4.0.3-1.el7.centos.noarch
ovirt-engine-vmconsole-proxy-helper-4.0.3-1.el7.centos.noarch
ovirt-engine-webadmin-portal-4.0.3-1.el7.centos.noarch
ovirt-engine-websocket-proxy-4.0.3-1.el7.centos.noarch
ovirt-engine-wildfly-10.0.0-1.el7.x86_64
ovirt-engine-wildfly-overlay-10.0.0-1.el7.noarch
ovirt-guest-agent-common-1.0.12-4.el7.noarch
ovirt-host-deploy-1.5.1-1.el7.centos.noarch
ovirt-host-deploy-java-1.5.1-1.el7.centos.noarch
ovirt-imageio-common-0.3.0-1.el7.noarch
ovirt-imageio-proxy-0.3.0-0.201606191345.git9f3d6d4.el7.centos.noarch
ovirt-imageio-proxy-setup-0.3.0-0.201606191345.git9f3d6d4.el7.centos.noarch
ovirt-image-uploader-4.0.0-1.el7.centos.noarch
ovirt-iso-uploader-4.0.0-1.el7.centos.noarch
ovirt-setup-lib-1.0.2-1.el7.centos.noarch
ovirt-vmconsole-1.0.4-1.el7.centos.noarch
ovirt-vmconsole-proxy-1.0.4-1.el7.centos.noarch

engine.log snippet:
2017-02-16 11:34:44,959 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-28) [] START, GetVmsInfoVDSCommand( GetVmsInfoVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', storageDomainId='13127103-3f59-418a-90f1-5b1ade8526b1', vmIdList='null'}), log id: 3c406c84
2017-02-16 11:34:45,967 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-28) [] FINISH, GetVmsInfoVDSCommand, log id: 3c406c84
2017-02-16 11:34:46,178 INFO [org.ovirt.engine.core.bll.exportimport.ExportVmCommand] (default task-24) [50b27eef] Lock Acquired to object 'EngineLock:{exclusiveLocks='[ba806b93-b6fe-4873-99ec-55bb34c12e5f=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]', sharedLocks='null'}'
2017-02-16 11:34:46,221 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-24) [50b27eef] START, GetVmsInfoVDSCommand( GetVmsInfoVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', storageDomainId='13127103-3f59-418a-90f1-5b1ade8526b1', vmIdList='null'}), log id: 61bfd908
2017-02-16 11:34:47,227 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-24) [50b27eef] FINISH, GetVmsInfoVDSCommand, log id: 61bfd908
2017-02-16 11:34:47,242 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-24) [50b27eef] START, GetVmsInfoVDSCommand( GetVmsInfoVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', storageDomainId='13127103-3f59-418a-90f1-5b1ade8526b1', vmIdList='null'}), log id: 7cd19381
2017-02-16 11:34:47,276 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.GetVmsInfoVDSCommand] (default task-24) [50b27eef] FINISH, GetVmsInfoVDSCommand, log id: 7cd19381
2017-02-16 11:34:47,294 INFO [org.ovirt.engine.core.bll.exportimport.ExportVmCommand] (org.ovirt.thread.pool-8-thread-39) [50b27eef] Running command: ExportVmCommand internal: false. Entities affected : ID: 13127103-3f59-418a-90f1-5b1ade8526b1 Type: StorageAction group IMPORT_EXPORT_VM with role type ADMIN
2017-02-16 11:34:47,296 INFO [org.ovirt.engine.core.vdsbroker.SetVmStatusVDSCommand] (org.ovirt.thread.pool-8-thread-39) [50b27eef] START, SetVmStatusVDSCommand( SetVmStatusVDSCommandParameters:{runAsync='true', vmId='ba806b93-b6fe-4873-99ec-55bb34c12e5f', status='ImageLocked', exitStatus='Normal'}), log id: 61f2f832
2017-02-16 11:34:47,299 INFO [org.ovirt.engine.core.vdsbroker.SetVmStatusVDSCommand] (org.ovirt.thread.pool-8-thread-39) [50b27eef] FINISH, SetVmStatusVDSCommand, log id: 61f2f832
2017-02-16 11:34:47,301 INFO [org.ovirt.engine.core.bll.exportimport.ExportVmCommand] (org.ovirt.thread.pool-8-thread-39) [50b27eef] Lock freed to object 'EngineLock:{exclusiveLocks='[ba806b93-b6fe-4873-99ec-55bb34c12e5f=<VM, ACTION_TYPE_FAILED_OBJECT_LOCKED>]', sharedLocks='null'}'
2017-02-16 11:34:47,339 INFO [org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand] (org.ovirt.thread.pool-8-thread-39) [a12f549] Running command: CopyImageGroupCommand internal: true. Entities affected : ID: 13127103-3f59-418a-90f1-5b1ade8526b1 Type: Storage
2017-02-16 11:34:47,356 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.MoveImageGroupVDSCommand] (org.ovirt.thread.pool-8-thread-39) [a12f549] START, MoveImageGroupVDSCommand( MoveImageGroupVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', storageDomainId='a89a626f-3f6b-452b-840e-ce0fee6f6461', imageGroupId='e17ebd7c-0763-42b2-b344-5ad7f9cf448e', dstDomainId='13127103-3f59-418a-90f1-5b1ade8526b1', vmId='ba806b93-b6fe-4873-99ec-55bb34c12e5f', op='Copy', postZero='false', force='false'}), log id: 1ee1f0ae
2017-02-16 11:34:48,211 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.MoveImageGroupVDSCommand] (org.ovirt.thread.pool-8-thread-39) [a12f549] FINISH, MoveImageGroupVDSCommand, log id: 1ee1f0ae
2017-02-16 11:34:48,216 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (org.ovirt.thread.pool-8-thread-39) [a12f549] CommandAsyncTask::Adding CommandMultiAsyncTasks object for command '0b807437-17fe-4773-a539-09ddee3df215'
2017-02-16 11:34:48,216 INFO [org.ovirt.engine.core.bll.CommandMultiAsyncTasks] (org.ovirt.thread.pool-8-thread-39) [a12f549] CommandMultiAsyncTasks::attachTask: Attaching task '3132106a-ce35-4b12-9a72-812e415eff7f' to command '0b807437-17fe-4773-a539-09ddee3df215'.
2017-02-16 11:34:48,225 INFO [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (org.ovirt.thread.pool-8-thread-39) [a12f549] Adding task '3132106a-ce35-4b12-9a72-812e415eff7f' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters'), polling hasn't started yet..
2017-02-16 11:34:48,256 INFO [org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand] (org.ovirt.thread.pool-8-thread-39) [8887fd8] Running command: CopyImageGroupCommand internal: true. Entities affected : ID: 13127103-3f59-418a-90f1-5b1ade8526b1 Type: Storage
2017-02-16 11:34:48,271 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.MoveImageGroupVDSCommand] (org.ovirt.thread.pool-8-thread-39) [8887fd8] START, MoveImageGroupVDSCommand( MoveImageGroupVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', storageDomainId='a89a626f-3f6b-452b-840e-ce0fee6f6461', imageGroupId='b4c4b53e-3813-4959-a145-16f1dfcf1838', dstDomainId='13127103-3f59-418a-90f1-5b1ade8526b1', vmId='ba806b93-b6fe-4873-99ec-55bb34c12e5f', op='Copy', postZero='false', force='false'}), log id: 74c9d14e
2017-02-16 11:34:48,354 INFO [org.ovirt.engine.core.vdsbroker.monitoring.VmsStatisticsFetcher] (DefaultQuartzScheduler4) [2004a741] Fetched 1 VMs from VDS '627314c4-7861-4ded-8257-22023a6a748d'
2017-02-16 11:34:49,369 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.MoveImageGroupVDSCommand] (org.ovirt.thread.pool-8-thread-39) [8887fd8] FINISH, MoveImageGroupVDSCommand, log id: 74c9d14e
2017-02-16 11:34:49,373 INFO [org.ovirt.engine.core.bll.CommandMultiAsyncTasks] (org.ovirt.thread.pool-8-thread-39) [8887fd8] CommandMultiAsyncTasks::attachTask: Attaching task '112fb772-a497-4788-829f-190d6d008d95' to command '0b807437-17fe-4773-a539-09ddee3df215'.
2017-02-16 11:34:49,390 INFO [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (org.ovirt.thread.pool-8-thread-39) [8887fd8] Adding task '112fb772-a497-4788-829f-190d6d008d95' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters'), polling hasn't started yet..
2017-02-16 11:34:49,410 INFO [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (org.ovirt.thread.pool-8-thread-39) [8887fd8] Correlation ID: 50b27eef, Job ID: 276e8e51-cbed-43bb-bcfa-9842467e978b, Call Stack: null, Custom Event ID: -1, Message: Starting export Vm ecf-sat6.fnal.gov to RITM0524722
2017-02-16 11:34:49,411 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-39) [8887fd8] BaseAsyncTask::startPollingTask: Starting to poll task '3132106a-ce35-4b12-9a72-812e415eff7f'.
2017-02-16 11:34:49,411 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-39) [8887fd8] BaseAsyncTask::startPollingTask: Starting to poll task '112fb772-a497-4788-829f-190d6d008d95'.
2017-02-16 11:34:50,302 INFO [org.ovirt.engine.core.bll.tasks.AsyncTaskManager] (DefaultQuartzScheduler4) [2004a741] Polling and updating Async Tasks: 2 tasks, 2 tasks to poll now
2017-02-16 11:34:50,361 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] (DefaultQuartzScheduler4) [2004a741] Failed in 'HSMGetAllTasksStatusesVDS' method
2017-02-16 11:34:50,365 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (DefaultQuartzScheduler4) [2004a741] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: VDSM slam-vmnode-03.fnal.gov command failed: Could not acquire resource. Probably resource factory threw an exception.: ()
2017-02-16 11:34:50,365 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] (DefaultQuartzScheduler4) [2004a741] Failed in 'HSMGetAllTasksStatusesVDS' method
2017-02-16 11:34:50,368 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (DefaultQuartzScheduler4) [2004a741] Correlation ID: null, Call Stack: null, Custom Event ID: -1, Message: VDSM slam-vmnode-03.fnal.gov command failed: Could not acquire resource. Probably resource factory threw an exception.: ()
2017-02-16 11:34:50,368 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (DefaultQuartzScheduler4) [2004a741] SPMAsyncTask::PollTask: Polling task '3132106a-ce35-4b12-9a72-812e415eff7f' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters') returned status 'finished', result 'cleanSuccess'.
2017-02-16 11:34:50,371 ERROR [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (DefaultQuartzScheduler4) [2004a741] BaseAsyncTask::logEndTaskFailure: Task '3132106a-ce35-4b12-9a72-812e415eff7f' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters') ended with failure:
-- Result: 'cleanSuccess'
-- Message: 'VDSGenericException: VDSErrorException: Failed to HSMGetAllTasksStatusesVDS, error = Could not acquire resource. Probably resource factory threw an exception.: (), code = 100',
-- Exception: 'VDSGenericException: VDSErrorException: Failed to HSMGetAllTasksStatusesVDS, error = Could not acquire resource. Probably resource factory threw an exception.: (), code = 100'
2017-02-16 11:34:50,374 INFO [org.ovirt.engine.core.bll.CommandMultiAsyncTasks] (DefaultQuartzScheduler4) [2004a741] Task with DB Task ID 'ae5bd098-51e8-4415-b0f7-0f3ef010ec7b' and VDSM Task ID '112fb772-a497-4788-829f-190d6d008d95' is in state Polling. End action for command 0b807437-17fe-4773-a539-09ddee3df215 will proceed when all the entity's tasks are completed.
2017-02-16 11:34:50,374 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (DefaultQuartzScheduler4) [2004a741] SPMAsyncTask::PollTask: Polling task '112fb772-a497-4788-829f-190d6d008d95' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters') returned status 'finished', result 'cleanSuccess'.
2017-02-16 11:34:50,377 ERROR [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (DefaultQuartzScheduler4) [2004a741] BaseAsyncTask::logEndTaskFailure: Task '112fb772-a497-4788-829f-190d6d008d95' (Parent Command 'ExportVm', Parameters Type 'org.ovirt.engine.core.common.asynctasks.AsyncTaskParameters') ended with failure:
-- Result: 'cleanSuccess'
-- Message: 'VDSGenericException: VDSErrorException: Failed to HSMGetAllTasksStatusesVDS, error = Could not acquire resource. Probably resource factory threw an exception.: (), code = 100',
-- Exception: 'VDSGenericException: VDSErrorException: Failed to HSMGetAllTasksStatusesVDS, error = Could not acquire resource. Probably resource factory threw an exception.: (), code = 100'
2017-02-16 11:34:50,379 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (DefaultQuartzScheduler4) [2004a741] CommandAsyncTask::endActionIfNecessary: All tasks of command '0b807437-17fe-4773-a539-09ddee3df215' has ended -> executing 'endAction'
2017-02-16 11:34:50,379 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (DefaultQuartzScheduler4) [2004a741] CommandAsyncTask::endAction: Ending action for '2' tasks (command ID: '0b807437-17fe-4773-a539-09ddee3df215'): calling endAction '.
2017-02-16 11:34:50,380 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (org.ovirt.thread.pool-8-thread-41) [2004a741] CommandAsyncTask::endCommandAction [within thread] context: Attempting to endAction 'ExportVm', executionIndex: '0'
2017-02-16 11:34:50,495 ERROR [org.ovirt.engine.core.bll.exportimport.ExportVmCommand] (org.ovirt.thread.pool-8-thread-41) [2004a741] Ending command 'org.ovirt.engine.core.bll.exportimport.ExportVmCommand' with failure.
2017-02-16 11:34:50,507 ERROR [org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand] (org.ovirt.thread.pool-8-thread-41) [a12f549] Ending command 'org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand' with failure.
2017-02-16 11:34:50,529 ERROR [org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand] (org.ovirt.thread.pool-8-thread-41) [8887fd8] Ending command 'org.ovirt.engine.core.bll.storage.disk.image.CopyImageGroupCommand' with failure.
2017-02-16 11:34:50,534 INFO [org.ovirt.engine.core.vdsbroker.SetVmStatusVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] START, SetVmStatusVDSCommand( SetVmStatusVDSCommandParameters:{runAsync='true', vmId='ba806b93-b6fe-4873-99ec-55bb34c12e5f', status='Down', exitStatus='Normal'}), log id: 28c5e88c
2017-02-16 11:34:50,536 INFO [org.ovirt.engine.core.vdsbroker.SetVmStatusVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] FINISH, SetVmStatusVDSCommand, log id: 28c5e88c
2017-02-16 11:34:50,549 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (org.ovirt.thread.pool-8-thread-41) [] Correlation ID: 50b27eef, Job ID: 276e8e51-cbed-43bb-bcfa-9842467e978b, Call Stack: null, Custom Event ID: -1, Message: Failed to export Vm ecf-sat6.fnal.gov to RITM0524722
2017-02-16 11:34:50,549 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] CommandAsyncTask::HandleEndActionResult [within thread]: endAction for action type 'ExportVm' completed, handling the result.
2017-02-16 11:34:50,549 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] CommandAsyncTask::HandleEndActionResult [within thread]: endAction for action type 'ExportVm' succeeded, clearing tasks.
2017-02-16 11:34:50,549 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] SPMAsyncTask::ClearAsyncTask: Attempting to clear task '3132106a-ce35-4b12-9a72-812e415eff7f'
2017-02-16 11:34:50,551 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] START, SPMClearTaskVDSCommand( SPMTaskGuidBaseVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', taskId='3132106a-ce35-4b12-9a72-812e415eff7f'}), log id: 675799a2
2017-02-16 11:34:50,552 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] START, HSMClearTaskVDSCommand(HostName = slam-vmnode-03.fnal.gov, HSMTaskGuidBaseVDSCommandParameters:{runAsync='true', hostId='eacb0ca6-794a-4c94-8dc8-00a8a6d88042', taskId='3132106a-ce35-4b12-9a72-812e415eff7f'}), log id: 2d8fe4d0
2017-02-16 11:34:50,554 INFO [org.ovirt.engine.core.vdsbroker.monitoring.VmsStatisticsFetcher] (DefaultQuartzScheduler6) [77b1baeb] Fetched 0 VMs from VDS 'eacb0ca6-794a-4c94-8dc8-00a8a6d88042'
2017-02-16 11:34:51,560 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] FINISH, HSMClearTaskVDSCommand, log id: 2d8fe4d0
2017-02-16 11:34:51,560 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] FINISH, SPMClearTaskVDSCommand, log id: 675799a2
2017-02-16 11:34:51,564 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] BaseAsyncTask::removeTaskFromDB: Removed task '3132106a-ce35-4b12-9a72-812e415eff7f' from DataBase
2017-02-16 11:34:51,564 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] SPMAsyncTask::ClearAsyncTask: Attempting to clear task '112fb772-a497-4788-829f-190d6d008d95'
2017-02-16 11:34:51,566 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] START, SPMClearTaskVDSCommand( SPMTaskGuidBaseVDSCommandParameters:{runAsync='true', storagePoolId='00000001-0001-0001-0001-0000000001a5', ignoreFailoverLimit='false', taskId='112fb772-a497-4788-829f-190d6d008d95'}), log id: 6b4cf8ff
2017-02-16 11:34:51,567 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] START, HSMClearTaskVDSCommand(HostName = slam-vmnode-03.fnal.gov, HSMTaskGuidBaseVDSCommandParameters:{runAsync='true', hostId='eacb0ca6-794a-4c94-8dc8-00a8a6d88042', taskId='112fb772-a497-4788-829f-190d6d008d95'}), log id: 6f2df357
2017-02-16 11:34:51,608 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] FINISH, HSMClearTaskVDSCommand, log id: 6f2df357
2017-02-16 11:34:51,608 INFO [org.ovirt.engine.core.vdsbroker.irsbroker.SPMClearTaskVDSCommand] (org.ovirt.thread.pool-8-thread-41) [] FINISH, SPMClearTaskVDSCommand, log id: 6b4cf8ff
2017-02-16 11:34:51,611 INFO [org.ovirt.engine.core.bll.tasks.SPMAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] BaseAsyncTask::removeTaskFromDB: Removed task '112fb772-a497-4788-829f-190d6d008d95' from DataBase
2017-02-16 11:34:51,611 INFO [org.ovirt.engine.core.bll.tasks.CommandAsyncTask] (org.ovirt.thread.pool-8-thread-41) [] CommandAsyncTask::HandleEndActionResult [within thread]: Removing CommandMultiAsyncTasks object for entity '0b807437-17fe-4773-a539-09ddee3df215'

--
Pat Riehecky

Fermi National Accelerator Laboratory
www.fnal.gov
www.scientificlinux.org

--
Adam Litke