KSM and cross-VM attack
by Jorick Astrego
Hi,
Maybe I should be posting to the kvm mailing list, but I think people
here should know a thing or two about it.
I just read the following research paper, and although the attack was
demonstrated on VMware, from what I read it could be possible with KSM
on KVM as well. If you really need tight security, it looks like it
would be better to disable KSM.
But don't take my word for it as IANAC (I Am Not A Cryptographer).
http://soylentnews.org/article.pl?sid=14/06/12/1349234&from=rss
Practical Cross-VM AES Full Key Recovery Attack
<http://soylentnews.org/article.pl?sid=14/06/12/1349234>
posted by janrinok <http://soylentnews.org/%7Ejanrinok/> on Thursday
June 12, @02:53PM
dbot <http://soylentnews.org/%7Edbot/> writes:
Researchers from Worcester Polytechnic Institute (Worcester, MA),
have published a paper illustrating a practical full Advanced
Encryption Standard key recovery from AES operations performed in
one virtual machine, by another VM
<http://eprint.iacr.org/2014/435.pdf> [*PDF*] running on the same
hardware at the same time.
The attack specifically requires memory de-duplication to be
enabled, and they target VMware's VM software. Combining various
attacks on memory de-duplication and existing side-channel attacks:
In summary, this work:
* shows for the first time that de-duplication enables fine
grain cross-VM attacks;
* introduces a new Flush+Reload based attack that does not
require interrupting the victim after each encryption round;
* presents the first practical cross-VM attack on AES; the
attack is generic and can be adapted to any table-based
block ciphers.
They target OpenSSL 1.0.1.
It will be interesting to see if the suggested countermeasure,
flushing the T table cache after each operation (effective against
other Flush+Reload attacks), is added to LibreSSL. Will it be left
out, in the name of performance - or will they move to a different
implementation of AES (not T table-based)?
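For those who do decide to disable it: a minimal sketch of turning KSM
off on a KVM host. The sysfs path is the standard kernel interface; the
service names are RHEL/CentOS-era assumptions and vary by distribution.

```shell
# 0 = merging disabled, 1 = enabled, 2 = disable and un-merge all
# pages that are currently shared:
cat /sys/kernel/mm/ksm/run

# Disable KSM and break existing shared pages (needs root):
echo 2 > /sys/kernel/mm/ksm/run

# On RHEL/CentOS hosts, also stop the services that would re-enable it:
service ksm stop && service ksmtuned stop
chkconfig ksm off && chkconfig ksmtuned off
```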
Kind regards,
Jorick Astrego
Netbulae
10 years, 5 months
Re: [ovirt-users] SLA : RAM scheduling
by Karli Sjöberg
On 23 May 2014 at 17:13, Nathanaël Blanchet <blanchet(a)abes.fr> wrote:
>
>
> On 23/05/2014 17:11, Nathanaël Blanchet wrote:
> > Hello,
> > On oVirt 3.4, is it possible to schedule VM distribution depending on
> > host RAM availability?
> > Concretely, I had to manually move all the VMs to the second host of
> > the cluster, which brought memory occupation on the destination host
> > up to 90%. When my first host rebooted, none of the VMs on the second
> > host automatically migrated to the first one, which had all its RAM
> > free. How can I make this happen?
> >
> ... so that RAM ends up evenly distributed across both hosts... I hope
> that is clear enough...

Sounds like you just want to apply the cluster policy for even
distribution. Have you assigned any policy for that cluster?

/K

>
> --
> Nathanaël Blanchet
>
> Supervision réseau
> Pôle exploitation et maintenance
> Département des systèmes d'information
> 227 avenue Professeur-Jean-Louis-Viala
> 34193 MONTPELLIER CEDEX 5
> Tél. 33 (0)4 67 54 84 55
> Fax  33 (0)4 67 54 84 14
> blanchet(a)abes.fr
>
> _______________________________________________
> Users mailing list
> Users(a)ovirt.org
> http://lists.ovirt.org/mailman/listinfo/users
Re: [ovirt-users] fail to shutdown ubuntu guest
by Dan Kenigsberg
On Fri, Jun 13, 2014 at 09:22:37AM +0800, John Xue wrote:
> I went through the xfce GUI options "Settings Manager" -> "Power
> Manager" to the field "When power button is pressed" and set it to
> power off, but it only worked once; after a reboot it was already set
> back to "Ask". If no one logs in to the guest (just power on from the
> console, then power off), it always fails.
>
> I try to modify acpi configuration:
> #cat /etc/acpi/events/powerbtn
> event=button[ /]power
> #action=/etc/acpi/powerbtn.sh
> action=/sbin/poweroff
>
> It works, but I don't think this is a good solution. Any ideas? Thanks!
>
> On Thu, Jun 12, 2014 at 11:50 PM, Dan Kenigsberg <danken(a)redhat.com> wrote:
> > On Thu, Jun 12, 2014 at 01:37:19PM +0000, Sven Kieske wrote:
> >> are you sure acpid is running inside the guest?
> >
> > ... or a guest agent?
>
> yes, both of them are running in guest.
>
> >
> > Can you find the shutdown request on /var/log/vdsm/vdsm.log on the host
> > that runs your guest?
>
> yes, this is the log:
>
> Thread-158109::DEBUG::2014-06-12
> 16:08:26,589::BindingXMLRPC::965::vds::(wrapper) client
> [10.10.10.75]::call vmShutdown with
> ('b552d1aa-bc35-4788-a448-1726d4b984d5', '30', 'System Administrator
> has initiated shutdown of this Virtual Machine. Virtual Machine is
> shutting down.') {} flowID [5939b847]
> Thread-158109::DEBUG::2014-06-12
> 16:08:26,590::vm::2532::vm.Vm::(shutdown)
> vmId=`b552d1aa-bc35-4788-a448-1726d4b984d5`::guestAgent shutdown
> called
> Thread-158109::DEBUG::2014-06-12
> 16:08:26,590::guestIF::304::vm.Vm::(desktopShutdown)
> vmId=`b552d1aa-bc35-4788-a448-1726d4b984d5`::desktopShutdown called
> Thread-158109::DEBUG::2014-06-12
> 16:08:26,591::BindingXMLRPC::972::vds::(wrapper) return vmShutdown
> with {'status': {'message': 'Machine shut down', 'code': 0}}
Is there no attempt to use ACPI lower down in the logs?
Anyway, it seems that an ACPI event is received by the guest. Could you
now share the log of your guest agent, in order to see if it received
the shutdown request and handled it somehow?
VDSM update warning
by Chris
Hi,
After updating VDSM to 4.14.9-0.el6 on a running vdsm node, I saw this
warning/error:
Updating : vdsm-python-zombiereaper-4.14.9-0.el6.noarch 1/10
Updating : vdsm-python-4.14.9-0.el6.x86_64 2/10
Updating : vdsm-xmlrpc-4.14.9-0.el6.noarch 3/10
Updating : vdsm-cli-4.14.9-0.el6.noarch 4/10
Updating : vdsm-4.14.9-0.el6.x86_64 5/10
Checking configuration status...
Traceback (most recent call last):
File "/usr/bin/vdsm-tool", line 145, in <module>
sys.exit(main())
File "/usr/bin/vdsm-tool", line 142, in main
return tool_command[cmd]["command"](*args[1:])
File "/usr/lib64/python2.6/site-packages/vdsm/tool/configurator.py",
line 230, in configure
service.service_stop(s)
File "/usr/lib64/python2.6/site-packages/vdsm/tool/service.py", line
370, in service_stop
return _runAlts(_srvStopAlts, srvName)
File "/usr/lib64/python2.6/site-packages/vdsm/tool/service.py", line
351, in _runAlts
"%s failed" % alt.func_name, out, err)
vdsm.tool.service.ServiceOperationError: ServiceOperationError:
_serviceStop failed
Sending stop signal sanlock (1809): [ OK ]
Waiting for sanlock (1809) to stop:[FAILED]
Do I need to be worried about it?
--
Chris
Installation of ovirt-node-iso rpm on oVirt Engine
by Faltermeier, Florian
Hi all,
I'm planning to update/reinstall my oVirt 3.4 hypervisors via the update
mechanism provided in ovirt-engine.
I read the documentation about oVirt Node: http://www.ovirt.org/Category:Node
The first topic, Upgrading -> Through oVirt Engine, tells me that I have
to install the "ovirt-node-iso rpm".
So where can I find a current RPM package? I've googled around already
but I didn't find any useful hints.
Thank you!
Regards,
Florian
Qemu guest agent to install RPMs in guest VM from host machine
by Puneet Bakshi
Hi,
I want to be able to install RPM packages (available on the host system
at some path) into the guest VM, and I want this facility to be
available as a tool.
I am thinking of having a qemu guest agent (qemu-ga) running inside the
guest VM. I did not find any existing command ("virsh qemu-agent-command
<guest_vm> ...") that can do this.
I am planning to implement a command in qemu guest agent, which I can
invoke from virsh like below.
"virsh qemu-agent-command vm_01 \
'{"execute":"guest-rpm-install", \
"arguments":{"path":"/usr/local/bin/ABC.rpm"}}
I am able to pass arguments from host to guest VM but how am I supposed to
pass the whole RPM image from host to guest (which the guest agent can
receive and install)?
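One hedged way to avoid passing the image through a custom command at
all: qemu-ga already has a generic file interface (guest-file-open /
guest-file-write / guest-file-close), and guest-file-write takes its
payload base64-encoded in "buf-b64", so the RPM can be streamed from the
host in chunks and then installed by a much smaller agent command (or by
guest-exec, in later qemu-ga versions). A sketch, with illustrative
domain name, paths and chunk size:

```shell
#!/bin/sh
# Emit a file as base64 chunks, one chunk per line, small enough to
# stay under qemu-ga's message size limit.
to_b64_chunks() {
    # $1 = file to send, $2 = chunk size in bytes
    tmpdir=$(mktemp -d) || return 1
    split -b "$2" "$1" "$tmpdir/chunk." || return 1
    for c in "$tmpdir"/chunk.*; do
        # one base64 string per line, one line per chunk
        base64 "$c" | tr -d '\n'
        printf '\n'
    done
    rm -rf "$tmpdir"
}

RPM=/usr/local/bin/ABC.rpm     # illustrative path on the host
DOM=vm_01                      # illustrative domain name

# 1. Open the target file inside the guest; the reply carries a handle:
#    virsh qemu-agent-command "$DOM" \
#      '{"execute":"guest-file-open","arguments":{"path":"/tmp/ABC.rpm","mode":"w"}}'
HANDLE=1000                    # substitute the handle returned above

# 2. Stream the RPM into the guest chunk by chunk:
if [ -f "$RPM" ]; then
    to_b64_chunks "$RPM" 3000 | while read -r b64; do
        virsh qemu-agent-command "$DOM" \
            "{\"execute\":\"guest-file-write\",\"arguments\":{\"handle\":$HANDLE,\"buf-b64\":\"$b64\"}}"
    done
fi

# 3. Close the handle; the guest-side install can then be a one-liner
#    agent command that just runs "rpm -i /tmp/ABC.rpm":
#    virsh qemu-agent-command "$DOM" \
#      '{"execute":"guest-file-close","arguments":{"handle":1000}}'
```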
Regards,
~Puneet
Live migration - guest VM stall
by Markus Stockhausen
Hello,
at the moment we are investigating stalls of Windows XP VMs during
live migration. Our environment consists of:
- FC20 hypervisor nodes
- qemu 1.6.2
- OVirt 3.4.1
- Guest: Windows XP SP2
- VM Disks: Virtio & IDE tested
- SPICE / VNC: both tested
- Balloon: With & without tested
- Cluster compatibility: 3.4 - CPU Nehalem
After 2-10 live migrations the Windows XP guest is no longer responsive.
At first we thought it might be related to SPICE, because we were no
longer able to log on to the console. So we installed the XP telnet
server in the VM, but that showed similar behaviour:
- The telnet welcome dialogue is always available (the network seems OK)
- Sometime after a live migration, if you enter the password, telnet
gives no response.
In parallel, the SPICE console allows moving open windows, but as soon
as one clicks on the start menu the system gives no response.
Even after updating to qemu 2.0 from the virt-preview repositories the
behaviour stays the same. Looks like the system cannot access
Any ideas?
Markus
Unable to shutdown VM's
by Jorick Astrego
Hi,
After upgrading to the 3.4.2 RC I want to reinstall the nodes, but there
are some benchmark VMs that I'm unable to shut down in order to put the
nodes into maintenance. How can I kill these VMs without having to cut
the power to the nodes manually?
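(A hedged sketch of forcing the issue from a host, assuming vdsClient
from vdsm-cli is available there; the VM id below is a placeholder:)

```shell
# List the VMs this host is running, with their ids and states:
vdsClient -s 0 list table

# Force-destroy a VM that refuses to shut down. This is the equivalent
# of pulling the virtual power cord; the guest gets no chance to flush
# anything to disk:
vdsClient -s 0 destroy <vmId>
```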
Kind regards,
Jorick Astrego
Netbulae B.V.
2014-Jun-05, 12:24
Failed to switch Host node1.test.now to Maintenance mode.
2014-Jun-05, 12:24
Host node1.test.now cannot change into maintenance mode - not all Vms
have been migrated successfully. Consider manual intervention:
stopping/migrating Vms: Bench1 (User: admin).
2014-Jun-05, 12:05
Shutdown of VM Bench2 failed.
2014-Jun-05, 12:05
Shutdown of VM Bench4 failed.
2014-Jun-05, 12:05
Shutdown of VM Bench3 failed.
2014-Jun-05, 12:05
Shutdown of VM Bench1 failed.
2014-Jun-05, 12:00
VM shutdown initiated by admin on VM Bench2 (Host: node2.test.now).
2014-Jun-05, 12:00
VM shutdown initiated by admin on VM Bench4 (Host: node4.test.now).
2014-Jun-05, 12:00
VM shutdown initiated by admin on VM Bench3 (Host: node3.test.now).
2014-Jun-05, 12:00
VM shutdown initiated by admin on VM Bench1 (Host: node1.test.now).
2014-06-05 12:00:33,127 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-8) [7c4b08f7] Start running CanDoAction
for command number 1/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,130 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-14) [2239cfe1] Start running CanDoAction
for command number 2/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,134 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-50) [17ece55] Start running CanDoAction
for command number 3/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,225 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-50) [17ece55] Finish handling
CanDoAction for command number 3/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,229 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-8) [7c4b08f7] Finish handling
CanDoAction for command number 1/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,234 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-14) [2239cfe1] Finish handling
CanDoAction for command number 2/3 (Command type: ShutdownVm)
2014-06-05 12:00:33,710 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
6709eaa1-163f-4dc5-9101-e46870438f38 Type: VM
2014-06-05 12:00:33,728 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Entered (VM Bench3).
2014-06-05 12:00:33,728 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Sending shutdown command
for VM Bench3.
2014-06-05 12:00:33,764 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] START,
DestroyVmVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb,
vmId=6709eaa1-163f-4dc5-9101-e46870438f38, force=false,
secondsToWait=30, gracefully=true), log id: 5eddc358
2014-06-05 12:00:33,790 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] START,
DestroyVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb,
vmId=6709eaa1-163f-4dc5-9101-e46870438f38, force=false,
secondsToWait=30, gracefully=true), log id: 19f6f7c8
2014-06-05 12:00:33,838 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] FINISH, DestroyVDSCommand,
log id: 19f6f7c8
2014-06-05 12:00:33,843 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 5eddc358
2014-06-05 12:00:33,855 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Correlation ID: 7c4b08f7,
Job ID: 406c481e-102b-4488-9286-c9b38197ef36, Call Stack: null, Custom
Event ID: -1, Message: VM shutdown initiated by admin on VM Bench3
(Host: node3.test.now).
2014-06-05 12:00:33,921 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
8b305f06-2c82-4a13-99f2-5beab5a056ea Type: VM
2014-06-05 12:00:33,967 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Entered (VM Bench4).
2014-06-05 12:00:33,969 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Sending shutdown command
for VM Bench4.
2014-06-05 12:00:34,001 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] START,
DestroyVmVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329,
vmId=8b305f06-2c82-4a13-99f2-5beab5a056ea, force=false,
secondsToWait=30, gracefully=true), log id: 23611c0e
2014-06-05 12:00:34,041 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] START,
DestroyVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329,
vmId=8b305f06-2c82-4a13-99f2-5beab5a056ea, force=false,
secondsToWait=30, gracefully=true), log id: 4f1663eb
2014-06-05 12:00:34,048 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] FINISH, DestroyVDSCommand,
log id: 4f1663eb
2014-06-05 12:00:34,257 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 23611c0e
2014-06-05 12:00:34,455 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Correlation ID: 2239cfe1,
Job ID: 00fb00a7-88b0-4fb1-b14a-fa099e3d6409, Call Stack: null, Custom
Event ID: -1, Message: VM shutdown initiated by admin on VM Bench4
(Host: node4.test.now).
2014-06-05 12:00:34,677 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
764b60fd-c255-479b-9082-3f4f04b95cb2 Type: VM
2014-06-05 12:00:34,695 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Entered (VM Bench2).
2014-06-05 12:00:34,696 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Sending shutdown command
for VM Bench2.
2014-06-05 12:00:34,718 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] START,
DestroyVmVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575,
vmId=764b60fd-c255-479b-9082-3f4f04b95cb2, force=false,
secondsToWait=30, gracefully=true), log id: 215e8d55
2014-06-05 12:00:34,740 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] START,
DestroyVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575,
vmId=764b60fd-c255-479b-9082-3f4f04b95cb2, force=false,
secondsToWait=30, gracefully=true), log id: 6a5599de
2014-06-05 12:00:34,755 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] FINISH, DestroyVDSCommand,
log id: 6a5599de
2014-06-05 12:00:34,777 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 215e8d55
2014-06-05 12:00:34,833 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Correlation ID: 17ece55,
Job ID: d92b0c9a-fb0f-4ca6-bc51-8d363aaf378c, Call Stack: null, Custom
Event ID: -1, Message: VM shutdown initiated by admin on VM Bench2
(Host: node2.test.now).
2014-06-05 12:01:00,648 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [22ac5fb6] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7045fa3b
2014-06-05 12:01:00,650 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-48) [4fbc37fa] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 306b54b0
2014-06-05 12:01:00,657 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-25) [760d647e] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7f7a769d
2014-06-05 12:01:00,661 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [6949bbc8] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 610dcd45
2014-06-05 12:01:00,663 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-94) [21aa560a] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:01 PM
2014-06-05 12:02:12,172 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-50) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:02:15,245 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:02:30,697 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-9) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:02:30,724 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-2) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:03:07,860 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-39) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:04:27,836 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-19) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:04:27,841 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-50) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:05:00,296 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-28) [10a45e4f] Autorecovering 1 storage
domains
2014-06-05 12:05:00,296 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-28) [10a45e4f] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:05:00,299 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:05:00,304 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:05 PM
2014-06-05 12:05:00,441 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:05:00,446 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:05:00,450 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:05:00,454 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:05:00,577 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 7ea7832a
2014-06-05 12:05:00,581 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 143f44d4
2014-06-05 12:05:00,580 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 4058bc78
2014-06-05 12:05:00,578 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 6eb8491e
2014-06-05 12:05:23,071 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-10) [77f10397] VM Bench1
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e moved from PoweringDown --> Up
2014-06-05 12:05:23,153 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-10) [77f10397] Correlation ID: null, Call
Stack: null, Custom Event ID: -1, Message: Shutdown of VM Bench1 failed.
2014-06-05 12:05:34,341 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] VM Bench3
6709eaa1-163f-4dc5-9101-e46870438f38 moved from PoweringDown --> Up
2014-06-05 12:05:34,411 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Correlation ID: null, Call
Stack: null, Custom Event ID: -1, Message: Shutdown of VM Bench3 failed.
2014-06-05 12:05:35,312 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-38) [3bbcf288] VM Bench4
8b305f06-2c82-4a13-99f2-5beab5a056ea moved from PoweringDown --> Up
2014-06-05 12:05:35,439 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-38) [3bbcf288] Correlation ID: null, Call
Stack: null, Custom Event ID: -1, Message: Shutdown of VM Bench4 failed.
2014-06-05 12:05:35,663 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-64) [40fb7b62] VM Bench2
764b60fd-c255-479b-9082-3f4f04b95cb2 moved from PoweringDown --> Up
2014-06-05 12:05:36,101 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-64) [40fb7b62] Correlation ID: null, Call
Stack: null, Custom Event ID: -1, Message: Shutdown of VM Bench2 failed.
2014-06-05 12:06:00,627 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4058bc78
2014-06-05 12:06:00,629 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 143f44d4
2014-06-05 12:06:00,632 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7ea7832a
2014-06-05 12:06:00,645 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 6eb8491e
2014-06-05 12:06:00,647 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:06 PM
2014-06-05 12:07:17,243 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-11) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:08:16,532 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-1) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:08:25,687 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-46) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:10:00,442 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-77) [7cace458] Autorecovering 1 storage
domains
2014-06-05 12:10:00,442 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-77) [7cace458] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:10:00,443 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:10:00,444 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:10 PM
2014-06-05 12:10:00,597 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 3 tasks in the
queue.
2014-06-05 12:10:00,598 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-10) Executing a command:
java.util.concurrent.FutureTask , but note that there are 2 tasks in the
queue.
2014-06-05 12:10:00,599 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-35) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:10:00,600 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:10:00,602 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:10:00,606 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:10:00,607 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:10:00,745 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 16b62d67
2014-06-05 12:10:00,748 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 2bc2c88
2014-06-05 12:10:00,748 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 4cb7e97c
2014-06-05 12:10:00,747 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 442ca304
2014-06-05 12:10:51,626 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-38) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:10:54,669 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-14) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:11:00,801 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4cb7e97c
2014-06-05 12:11:00,811 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 442ca304
2014-06-05 12:11:00,814 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 2bc2c88
2014-06-05 12:11:00,852 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 16b62d67
2014-06-05 12:11:00,854 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:11 PM
2014-06-05 12:11:40,654 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-25) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:12:14,657 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-31) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:12:27,462 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:13:59,781 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-22) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:15,341 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-10) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:21,429 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-35) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:21,455 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-4) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:40,159 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-17) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:52,383 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-15) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:52,412 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-27) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:14:55,658 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-42) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:15:00,031 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Autorecovering 1 storage
domains
2014-06-05 12:15:00,031 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:15:00,032 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:15:00,035 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:15 PM
2014-06-05 12:15:00,109 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:15:00,114 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:15:00,118 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:15:00,122 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:15:00,247 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 4bba2a9a
2014-06-05 12:15:00,255 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 4c27ad04
2014-06-05 12:15:00,249 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 1d382d00
2014-06-05 12:15:00,252 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 7daf7955
2014-06-05 12:16:00,301 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4c27ad04
2014-06-05 12:16:00,302 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4bba2a9a
2014-06-05 12:16:00,305 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7daf7955
2014-06-05 12:16:00,313 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 1d382d00
2014-06-05 12:16:00,424 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:16 PM
2014-06-05 12:18:27,725 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-30) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:18:43,043 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-48) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:18:43,064 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-9) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:18:55,383 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-40) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:18:58,450 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-31) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:19:02,314 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-6) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:20:00,088 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-58) [2e5174b2] Autorecovering 1 storage
domains
2014-06-05 12:20:00,088 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-58) [2e5174b2] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:20:00,089 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:20:00,090 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:20 PM
2014-06-05 12:20:00,175 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:20:00,180 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:20:00,184 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:20:00,188 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:20:00,317 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 2eb75130
2014-06-05 12:20:00,321 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 957b133
2014-06-05 12:20:00,313 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 28e7a51e
2014-06-05 12:20:00,320 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 1e3eb653
2014-06-05 12:21:00,371 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 2eb75130
2014-06-05 12:21:00,375 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 28e7a51e
2014-06-05 12:21:00,376 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 957b133
2014-06-05 12:21:00,400 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 1e3eb653
2014-06-05 12:21:00,403 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:21 PM
2014-06-05 12:23:10,334 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-23) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:23:34,686 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update
VMs/Templates Ovf.
2014-06-05 12:23:34,687 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update VM
OVFs in Data Center Default
2014-06-05 12:23:34,692 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully updated VM
OVFs in Data Center Default
2014-06-05 12:23:34,693 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update
template OVFs in Data Center Default
2014-06-05 12:23:34,695 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully updated
templates OVFs in Data Center Default
2014-06-05 12:23:34,695 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to remove
unneeded template/vm OVFs in Data Center Default
2014-06-05 12:23:34,697 INFO [org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully removed
unneeded template/vm OVFs in Data Center Default
2014-06-05 12:23:38,356 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-29) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:23:44,441 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-16) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:24:00,113 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] Running command:
MaintenanceNumberOfVdssCommand internal: false. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:24:00,146 INFO
[org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] START,
SetVdsStatusVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, status=PreparingForMaintenance,
nonOperationalReason=NONE, stopSpmFailureLogged=true), log id: 1d0d3a10
2014-06-05 12:24:00,171 INFO
[org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] FINISH,
SetVdsStatusVDSCommand, log id: 1d0d3a10
2014-06-05 12:24:00,460 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:24:00,539 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Lock Acquired to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:24:00,622 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(DefaultQuartzScheduler_Worker-69) [2b68c51d] Running command:
MaintenanceNumberOfVdssCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:24:00,779 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:00,808 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:00,809 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:00,813 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON
2014-06-05 12:24:00,825 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Lock freed to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:24:00,860 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1
2014-06-05 12:24:00,883 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Correlation ID: 43b031c4,
Job ID: 38a05481-0ced-4285-b3ee-6574ce39eb77, Call Stack: null, Custom
Event ID: -1, Message: Host node1.test.now cannot change into
maintenance mode - not all Vms have been migrated successfully. Consider
manual intervention: stopping/migrating Vms: Bench1 (User: admin).
2014-06-05 12:24:00,894 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-69) [2b68c51d] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:24:01,005 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Lock Acquired to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:24:01,147 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:01,155 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:01,166 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:01,174 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON
2014-06-05 12:24:01,187 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Lock freed to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:24:01,228 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1
2014-06-05 12:24:01,238 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Correlation ID: 2b68c51d,
Job ID: 3a956cf5-ec2e-45b5-a2bc-40632496fec6, Call Stack: null, Custom
Event ID: -1, Message: Failed to switch Host node1.test.now to
Maintenance mode.
2014-06-05 12:24:40,507 INFO [org.ovirt.engine.core.bll.StopVdsCommand]
(ajp--127.0.0.1-8702-4) [32d3da21] Lock Acquired to object EngineLock
[exclusiveLocks= key: e17a740c-47ba-4a81-99ab-6c386b572f26 value: VDS_FENCE
, sharedLocks= ]
2014-06-05 12:24:40,798 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(ajp--127.0.0.1-8702-4) Correlation ID: null, Call Stack: null, Custom
Event ID: -1, Message: Host node3.test.now from cluster Default was
chosen as a proxy to execute Status command on Host node1.test.now.
2014-06-05 12:24:40,799 INFO [org.ovirt.engine.core.bll.FenceExecutor]
(ajp--127.0.0.1-8702-4) Using Host node3.test.now from cluster Default
as proxy to execute Status command on Host node1.test.now
2014-06-05 12:24:40,800 WARN [org.ovirt.engine.core.bll.StopVdsCommand]
(ajp--127.0.0.1-8702-4) CanDoAction of action StopVds failed.
Reasons:VAR__ACTION__STOP,VDS_STATUS_NOT_VALID_FOR_STOP
2014-06-05 12:24:40,811 INFO [org.ovirt.engine.core.bll.StopVdsCommand]
(ajp--127.0.0.1-8702-4) Lock freed to object EngineLock [exclusiveLocks=
key: e17a740c-47ba-4a81-99ab-6c386b572f26 value: VDS_FENCE
, sharedLocks= ]
2014-06-05 12:25:00,021 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-27) [70486be2] Autorecovering 1 storage
domains
2014-06-05 12:25:00,021 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-27) [70486be2] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:25:00,022 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:25:00,023 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:25 PM
2014-06-05 12:25:00,088 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:25:00,093 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:25:00,100 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:25:00,196 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 4591e85a
2014-06-05 12:25:00,197 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 17ef6900
2014-06-05 12:25:00,197 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: db1af1a
2014-06-05 12:25:00,207 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-2) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:25:19,989 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-42) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:26:00,247 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: db1af1a
2014-06-05 12:26:00,256 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4591e85a
2014-06-05 12:26:00,263 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 17ef6900
2014-06-05 12:26:00,265 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:26 PM
2014-06-05 12:28:15,399 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-17) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:28:18,443 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-8) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:28:55,121 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-45) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in the
queue.
2014-06-05 12:29:04,927 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(DefaultQuartzScheduler_Worker-51) [51b730a8] Running command:
MaintenanceNumberOfVdssCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:29:05,029 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-51) [51b730a8] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS
2014-06-05 12:29:05,073 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] Lock Acquired to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:29:05,164 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:29:05,167 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:29:05,170 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered out
by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:29:05,173 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON
2014-06-05 12:29:05,181 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] Lock freed to object
EngineLock [exclusiveLocks= key: 63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e
value: VM
, sharedLocks= ]
2014-06-05 12:29:05,207 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-51) [63736865]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1
2014-06-05 12:29:05,218 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-51) [63736865] Correlation ID: 51b730a8,
Job ID: b8441378-051c-4af0-8802-e2efb9e5d4a4, Call Stack: null, Custom
Event ID: -1, Message: Failed to switch Host node1.test.now to
Maintenance mode.
2014-06-05 12:30:00,018 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-52) [2e3993e6] Autorecovering 1 storage
domains
2014-06-05 12:30:00,018 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-52) [2e3993e6] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53
2014-06-05 12:30:00,021 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-52) [1585ee27] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected : ID:
9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage
2014-06-05 12:30:00,024 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-52) [1585ee27] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:30 PM
2014-06-05 12:30:00,119 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-17) [6c0a8af9] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:30:00,121 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-26) [10056f73] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:30:00,134 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-24) [750ef0f4] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System
2014-06-05 12:30:00,248 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-17) [6c0a8af9] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 1c35901d
2014-06-05 12:30:00,250 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [750ef0f4] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 1a4d78a1
2014-06-05 12:30:00,250 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [10056f73] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS, connectionList
= [{ id: f47e9f99-9989-4297-a84f-4f75338eeced, connection:
10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType: null,
mountOptions: null, nfsVersion: null, nfsRetrans: null, nfsTimeo: null
};]), log id: 60464b56
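
The scheduler rejections buried in this log follow a fixed format ("Candidate host ... was filtered out by ... filter ..."). As a minimal sketch, here is one way to summarise them with awk; the excerpt file is a hypothetical sample copied from the format of the entries above, not a real log path:

```shell
# Minimal sketch: summarise "Candidate host ... was filtered out" scheduler
# lines from an oVirt engine.log. The excerpt below is a hypothetical sample
# matching the format of the entries quoted above.
cat > /tmp/engine-excerpt.log <<'EOF'
2014-06-05 12:24:00,779 INFO [org.ovirt.engine.core.bll.scheduling.SchedulingManager] (org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered out by VAR__FILTERTYPE__INTERNAL filter Memory
2014-06-05 12:24:00,808 INFO [org.ovirt.engine.core.bll.scheduling.SchedulingManager] (org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered out by VAR__FILTERTYPE__INTERNAL filter Memory
EOF

# Print one "host -> filter" line per rejection; the host name follows the
# literal word "host" and the filter name is the last field of the line.
awk '/was filtered out/ {
    for (i = 1; i <= NF; i++) if ($i == "host") host = $(i + 1)
    print host " -> " $NF
}' /tmp/engine-excerpt.log
```

With every candidate host rejected by the Memory filter, the migration has no target, which is why the MaintenanceVdsCommand above keeps failing with SCHEDULING_ALL_HOSTS_FILTERED_OUT.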
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
</head>
<body text="#000000" bgcolor="#FFFFFF">
Hi,<br>
<br>
    After upgrading to 3.4.2 RC I want to reinstall the nodes, but
    there are some benchmark VMs that I'm unable to shut down in order
    to put the nodes into maintenance. How can I kill these VMs without
    having to cut the power to the nodes manually?<br>
<br>
Kind regards, <br>
Jorick Astrego<br>
<br>
Netbulae B.V.<br>
<br>
    <table cellspacing="0">
      <tbody>
        <tr>
          <td>2014-Jun-05, 12:24</td>
          <td>Failed to switch Host node1.test.now to Maintenance mode.</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:24</td>
          <td>Host node1.test.now cannot change into maintenance mode -
            not all Vms have been migrated successfully. Consider manual
            intervention: stopping/migrating Vms: Bench1 (User: admin).</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:05</td>
          <td>Shutdown of VM Bench2 failed.</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:05</td>
          <td>Shutdown of VM Bench4 failed.</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:05</td>
          <td>Shutdown of VM Bench3 failed.</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:05</td>
          <td>Shutdown of VM Bench1 failed.</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:00</td>
          <td>VM shutdown initiated by admin on VM Bench2 (Host: node2.test.now).</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:00</td>
          <td>VM shutdown initiated by admin on VM Bench4 (Host: node4.test.now).</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:00</td>
          <td>VM shutdown initiated by admin on VM Bench3 (Host: node3.test.now).</td>
        </tr>
        <tr>
          <td>2014-Jun-05, 12:00</td>
          <td>VM shutdown initiated by admin on VM Bench1 (Host: node1.test.now).</td>
        </tr>
      </tbody>
    </table>
<br>
2014-06-05 12:00:33,127 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-8) [7c4b08f7] Start running
CanDoAction for command number 1/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,130 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-14) [2239cfe1] Start running
CanDoAction for command number 2/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,134 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-50) [17ece55] Start running
CanDoAction for command number 3/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,225 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-50) [17ece55] Finish handling
CanDoAction for command number 3/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,229 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-8) [7c4b08f7] Finish handling
CanDoAction for command number 1/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,234 INFO
[org.ovirt.engine.core.bll.MultipleActionsRunner]
(org.ovirt.thread.pool-6-thread-14) [2239cfe1] Finish handling
CanDoAction for command number 2/3 (Command type: ShutdownVm)<br>
2014-06-05 12:00:33,710 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
6709eaa1-163f-4dc5-9101-e46870438f38 Type: VM<br>
2014-06-05 12:00:33,728 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Entered (VM Bench3).<br>
2014-06-05 12:00:33,728 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Sending shutdown
command for VM Bench3.<br>
2014-06-05 12:00:33,764 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] START,
DestroyVmVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb,
vmId=6709eaa1-163f-4dc5-9101-e46870438f38, force=false,
secondsToWait=30, gracefully=true), log id: 5eddc358<br>
2014-06-05 12:00:33,790 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] START,
DestroyVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb,
vmId=6709eaa1-163f-4dc5-9101-e46870438f38, force=false,
secondsToWait=30, gracefully=true), log id: 19f6f7c8<br>
2014-06-05 12:00:33,838 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] FINISH,
DestroyVDSCommand, log id: 19f6f7c8<br>
2014-06-05 12:00:33,843 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 5eddc358<br>
2014-06-05 12:00:33,855 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [7c4b08f7] Correlation ID:
7c4b08f7, Job ID: 406c481e-102b-4488-9286-c9b38197ef36, Call Stack:
null, Custom Event ID: -1, Message: VM shutdown initiated by admin
on VM Bench3 (Host: node3.test.now).<br>
2014-06-05 12:00:33,921 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
8b305f06-2c82-4a13-99f2-5beab5a056ea Type: VM<br>
2014-06-05 12:00:33,967 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Entered (VM Bench4).<br>
2014-06-05 12:00:33,969 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Sending shutdown
command for VM Bench4.<br>
2014-06-05 12:00:34,001 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] START,
DestroyVmVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329,
vmId=8b305f06-2c82-4a13-99f2-5beab5a056ea, force=false,
secondsToWait=30, gracefully=true), log id: 23611c0e<br>
2014-06-05 12:00:34,041 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] START,
DestroyVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329,
vmId=8b305f06-2c82-4a13-99f2-5beab5a056ea, force=false,
secondsToWait=30, gracefully=true), log id: 4f1663eb<br>
2014-06-05 12:00:34,048 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] FINISH,
DestroyVDSCommand, log id: 4f1663eb<br>
2014-06-05 12:00:34,257 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 23611c0e<br>
2014-06-05 12:00:34,455 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [2239cfe1] Correlation ID:
2239cfe1, Job ID: 00fb00a7-88b0-4fb1-b14a-fa099e3d6409, Call Stack:
null, Custom Event ID: -1, Message: VM shutdown initiated by admin
on VM Bench4 (Host: node4.test.now).<br>
2014-06-05 12:00:34,677 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Running command:
ShutdownVmCommand internal: false. Entities affected : ID:
764b60fd-c255-479b-9082-3f4f04b95cb2 Type: VM<br>
2014-06-05 12:00:34,695 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Entered (VM Bench2).<br>
2014-06-05 12:00:34,696 INFO
[org.ovirt.engine.core.bll.ShutdownVmCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Sending shutdown
command for VM Bench2.<br>
2014-06-05 12:00:34,718 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] START,
DestroyVmVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575,
vmId=764b60fd-c255-479b-9082-3f4f04b95cb2, force=false,
secondsToWait=30, gracefully=true), log id: 215e8d55<br>
2014-06-05 12:00:34,740 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] START,
DestroyVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575,
vmId=764b60fd-c255-479b-9082-3f4f04b95cb2, force=false,
secondsToWait=30, gracefully=true), log id: 6a5599de<br>
2014-06-05 12:00:34,755 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.DestroyVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] FINISH,
DestroyVDSCommand, log id: 6a5599de<br>
2014-06-05 12:00:34,777 INFO
[org.ovirt.engine.core.vdsbroker.DestroyVmVDSCommand]
(org.ovirt.thread.pool-6-thread-3) [17ece55] FINISH,
DestroyVmVDSCommand, return: PoweringDown, log id: 215e8d55<br>
2014-06-05 12:00:34,833 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-3) [17ece55] Correlation ID:
17ece55, Job ID: d92b0c9a-fb0f-4ca6-bc51-8d363aaf378c, Call Stack:
null, Custom Event ID: -1, Message: VM shutdown initiated by admin
on VM Bench2 (Host: node2.test.now).<br>
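Each engine command above is tagged with a bracketed correlation ID ([7c4b08f7] for Bench3, [2239cfe1] for Bench4, [17ece55] for Bench2), so one shutdown flow can be isolated from the interleaved thread-pool output. A minimal grouping sketch; the sample lines are condensed from the excerpt above, and the hex-ID pattern is an assumption about the log format, not an oVirt-documented contract:<br>

```python
import re
from collections import defaultdict

# Sample lines condensed from the engine.log excerpt above
lines = [
    "12:00:33,710 INFO [ShutdownVmCommand] (thread-3) [7c4b08f7] Running command: ShutdownVmCommand",
    "12:00:33,728 INFO [ShutdownVmCommand] (thread-3) [7c4b08f7] Sending shutdown command for VM Bench3.",
    "12:00:33,921 INFO [ShutdownVmCommand] (thread-3) [2239cfe1] Running command: ShutdownVmCommand",
]

# Group lines by the bracketed lowercase-hex correlation ID, e.g. "[7c4b08f7]"
flows = defaultdict(list)
for line in lines:
    m = re.search(r"\[([0-9a-f]{6,8})\]", line)
    if m:
        flows[m.group(1)].append(line)

print(len(flows["7c4b08f7"]))  # the two Bench3 lines group together
```
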
2014-06-05 12:01:00,648 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [22ac5fb6] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7045fa3b<br>
2014-06-05 12:01:00,650 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-48) [4fbc37fa] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 306b54b0<br>
2014-06-05 12:01:00,657 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-25) [760d647e] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7f7a769d<br>
2014-06-05 12:01:00,661 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [6949bbc8] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 610dcd45<br>
2014-06-05 12:01:00,663 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-94) [21aa560a]
ConnectDomainToStorage. After Connect all hosts to pool. Time:6/5/14
12:01 PM<br>
2014-06-05 12:02:12,172 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-50) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:02:15,245 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:02:30,697 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-9) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:02:30,724 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-2) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:03:07,860 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-39) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:04:27,836 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-19) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:04:27,841 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-50) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:05:00,296 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-28) [10a45e4f] Autorecovering 1
storage domains<br>
2014-06-05 12:05:00,296 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-28) [10a45e4f] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:05:00,299 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:05:00,304 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:05 PM<br>
2014-06-05 12:05:00,441 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:05:00,446 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:05:00,450 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:05:00,454 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:05:00,577 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 7ea7832a<br>
2014-06-05 12:05:00,581 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 143f44d4<br>
2014-06-05 12:05:00,580 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 4058bc78<br>
2014-06-05 12:05:00,578 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 6eb8491e<br>
2014-06-05 12:05:23,071 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-10) [77f10397] VM Bench1
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e moved from PoweringDown -->
Up<br>
2014-06-05 12:05:23,153 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-10) [77f10397] Correlation ID: null,
Call Stack: null, Custom Event ID: -1, Message: Shutdown of VM
Bench1 failed.<br>
2014-06-05 12:05:34,341 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] VM Bench3
6709eaa1-163f-4dc5-9101-e46870438f38 moved from PoweringDown -->
Up<br>
2014-06-05 12:05:34,411 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Correlation ID: null,
Call Stack: null, Custom Event ID: -1, Message: Shutdown of VM
Bench3 failed.<br>
2014-06-05 12:05:35,312 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-38) [3bbcf288] VM Bench4
8b305f06-2c82-4a13-99f2-5beab5a056ea moved from PoweringDown -->
Up<br>
2014-06-05 12:05:35,439 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-38) [3bbcf288] Correlation ID: null,
Call Stack: null, Custom Event ID: -1, Message: Shutdown of VM
Bench4 failed.<br>
2014-06-05 12:05:35,663 INFO
[org.ovirt.engine.core.vdsbroker.VdsUpdateRunTimeInfo]
(DefaultQuartzScheduler_Worker-64) [40fb7b62] VM Bench2
764b60fd-c255-479b-9082-3f4f04b95cb2 moved from PoweringDown -->
Up<br>
2014-06-05 12:05:36,101 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-64) [40fb7b62] Correlation ID: null,
Call Stack: null, Custom Event ID: -1, Message: Shutdown of VM
Bench2 failed.<br>
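Note the gap: each "Sending shutdown command" entry lands at 12:00:33-34, and the matching "Shutdown of VM ... failed" event at 12:05:23-36 — almost exactly five minutes later, which suggests a graceful-shutdown timeout expired before the guests powered off (a guess from the timestamps, not something the log states). A quick sketch measuring that gap, using the Bench3 timestamps copied from above:<br>

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S,%f"

# Timestamps copied from the engine.log excerpt above
sent = datetime.strptime("2014-06-05 12:00:33,728", FMT)    # shutdown sent for Bench3
failed = datetime.strptime("2014-06-05 12:05:34,341", FMT)  # Bench3 back to Up, shutdown failed

elapsed = (failed - sent).total_seconds()
print(f"Bench3 shutdown gave up after {elapsed:.0f} seconds")  # just over 300 s, i.e. ~5 minutes
```
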
2014-06-05 12:06:00,627 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [3dd2802] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4058bc78<br>
2014-06-05 12:06:00,629 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-29) [1a51a7f5] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 143f44d4<br>
2014-06-05 12:06:00,632 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [3164009c] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7ea7832a<br>
2014-06-05 12:06:00,645 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [19f33713] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 6eb8491e<br>
2014-06-05 12:06:00,647 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-28) [9bf7039] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:06 PM<br>
2014-06-05 12:07:17,243 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-11) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:08:16,532 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-1) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:08:25,687 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-46) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:10:00,442 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-77) [7cace458] Autorecovering 1
storage domains<br>
2014-06-05 12:10:00,442 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-77) [7cace458] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:10:00,443 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:10:00,444 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3]
ConnectDomainToStorage. Before Connect all hosts to pool.
Time:6/5/14 12:10 PM<br>
2014-06-05 12:10:00,597 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 3 tasks in
the queue.<br>
2014-06-05 12:10:00,598 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-10) Executing a command:
java.util.concurrent.FutureTask , but note that there are 2 tasks in
the queue.<br>
2014-06-05 12:10:00,599 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-35) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:10:00,600 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:10:00,602 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:10:00,606 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:10:00,607 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:10:00,745 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 16b62d67<br>
2014-06-05 12:10:00,748 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 2bc2c88<br>
2014-06-05 12:10:00,748 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 4cb7e97c<br>
2014-06-05 12:10:00,747 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 442ca304<br>
2014-06-05 12:10:51,626 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-38) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:10:54,669 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-14) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:11:00,801 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-10) [8d05e48] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4cb7e97c<br>
2014-06-05 12:11:00,811 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-37) [175926fc] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 442ca304<br>
2014-06-05 12:11:00,814 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [7ecb4630] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 2bc2c88<br>
2014-06-05 12:11:00,852 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-35) [25233142] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 16b62d67<br>
2014-06-05 12:11:00,854 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-77) [77f9bcd3]
ConnectDomainToStorage. After Connect all hosts to pool. Time:6/5/14
12:11 PM<br>
2014-06-05 12:11:40,654 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-25) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:12:14,657 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-31) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:12:27,462 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-37) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:13:59,781 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-22) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:15,341 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-10) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:21,429 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-35) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:21,455 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-4) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:40,159 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-17) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:52,383 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-15) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:52,412 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-27) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:14:55,658 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-42) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:15:00,031 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Autorecovering 1
storage domains<br>
2014-06-05 12:15:00,031 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-65) [604c5b5d] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:15:00,032 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:15:00,035 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] ConnectDomainToStorage.
Before Connect all hosts to pool. Time:6/5/14 12:15 PM<br>
2014-06-05 12:15:00,109 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:15:00,114 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:15:00,118 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:15:00,122 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:15:00,247 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 4bba2a9a<br>
2014-06-05 12:15:00,255 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 4c27ad04<br>
2014-06-05 12:15:00,249 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 1d382d00<br>
2014-06-05 12:15:00,252 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 7daf7955<br>
2014-06-05 12:16:00,301 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-5) [d3ad154] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4c27ad04<br>
2014-06-05 12:16:00,302 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [718f19c7] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4bba2a9a<br>
2014-06-05 12:16:00,305 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-45) [6c18413] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 7daf7955<br>
2014-06-05 12:16:00,313 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-1) [32d6d63f] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 1d382d00<br>
2014-06-05 12:16:00,424 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-65) [7d4ef42] ConnectDomainToStorage.
After Connect all hosts to pool. Time:6/5/14 12:16 PM<br>
2014-06-05 12:18:27,725 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-30) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:18:43,043 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-48) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:18:43,064 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-9) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:18:55,383 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-40) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:18:58,450 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-31) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:19:02,314 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-6) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:20:00,088 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-58) [2e5174b2] Autorecovering 1
storage domains<br>
2014-06-05 12:20:00,088 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-58) [2e5174b2] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:20:00,089 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:20:00,090 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5]
ConnectDomainToStorage. Before Connect all hosts to pool.
Time:6/5/14 12:20 PM<br>
2014-06-05 12:20:00,175 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:20:00,180 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:20:00,184 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:20:00,188 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:20:00,317 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 2eb75130<br>
2014-06-05 12:20:00,321 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] START,
ConnectStorageServerVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 957b133<br>
2014-06-05 12:20:00,313 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 28e7a51e<br>
2014-06-05 12:20:00,320 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 1e3eb653<br>
2014-06-05 12:21:00,371 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-40) [48fedac6] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 2eb75130<br>
2014-06-05 12:21:00,375 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-11) [4aa18a96] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 28e7a51e<br>
2014-06-05 12:21:00,376 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [4b39df17] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 957b133<br>
2014-06-05 12:21:00,400 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-23) [4233af79] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 1e3eb653<br>
2014-06-05 12:21:00,403 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-58) [106f43a5]
ConnectDomainToStorage. After Connect all hosts to pool. Time:6/5/14
12:21 PM<br>
2014-06-05 12:23:10,334 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-23) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:23:34,686 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update
VMs/Templates Ovf.<br>
2014-06-05 12:23:34,687 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update VM
OVFs in Data Center Default<br>
2014-06-05 12:23:34,692 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully updated VM
OVFs in Data Center Default<br>
2014-06-05 12:23:34,693 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to update
template OVFs in Data Center Default<br>
2014-06-05 12:23:34,695 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully updated
templates OVFs in Data Center Default<br>
2014-06-05 12:23:34,695 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Attempting to remove
unneeded template/vm OVFs in Data Center Default<br>
2014-06-05 12:23:34,697 INFO
[org.ovirt.engine.core.bll.OvfDataUpdater]
(DefaultQuartzScheduler_Worker-7) [74449d64] Successfully removed
unneeded template/vm OVFs in Data Center Default<br>
2014-06-05 12:23:38,356 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-29) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:23:44,441 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-16) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:24:00,113 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] Running command:
MaintenanceNumberOfVdssCommand internal: false. Entities affected :
ID: e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:24:00,146 INFO
[org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] START,
SetVdsStatusVDSCommand(HostName = node1.test.now, HostId =
e17a740c-47ba-4a81-99ab-6c386b572f26,
status=PreparingForMaintenance, nonOperationalReason=NONE,
stopSpmFailureLogged=true), log id: 1d0d3a10<br>
2014-06-05 12:24:00,171 INFO
[org.ovirt.engine.core.vdsbroker.SetVdsStatusVDSCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] FINISH,
SetVdsStatusVDSCommand, log id: 1d0d3a10<br>
2014-06-05 12:24:00,460 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(org.ovirt.thread.pool-6-thread-44) [43b031c4] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:24:00,539 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Lock Acquired to
object EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:24:00,622 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(DefaultQuartzScheduler_Worker-69) [2b68c51d] Running command:
MaintenanceNumberOfVdssCommand internal: true. Entities affected :
ID: e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:24:00,779 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:00,808 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:00,809 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:00,813 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON<br>
2014-06-05 12:24:00,825 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Lock freed to object
EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:24:00,860 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(org.ovirt.thread.pool-6-thread-44) [155d3855]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1<br>
2014-06-05 12:24:00,883 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(org.ovirt.thread.pool-6-thread-44) [155d3855] Correlation ID:
43b031c4, Job ID: 38a05481-0ced-4285-b3ee-6574ce39eb77, Call Stack:
null, Custom Event ID: -1, Message: Host node1.test.now cannot
change into maintenance mode - not all Vms have been migrated
successfully. Consider manual intervention: stopping/migrating Vms:
Bench1 (User: admin).<br>
2014-06-05 12:24:00,894 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-69) [2b68c51d] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:24:01,005 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Lock Acquired to
object EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:24:01,147 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:01,155 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:01,166 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:24:01,174 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON<br>
2014-06-05 12:24:01,187 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Lock freed to object
EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:24:01,228 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1<br>
2014-06-05 12:24:01,238 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-69) [36b0c4c7] Correlation ID:
2b68c51d, Job ID: 3a956cf5-ec2e-45b5-a2bc-40632496fec6, Call Stack:
null, Custom Event ID: -1, Message: Failed to switch Host
node1.test.now to Maintenance mode.<br>
2014-06-05 12:24:40,507 INFO
[org.ovirt.engine.core.bll.StopVdsCommand] (ajp--127.0.0.1-8702-4)
[32d3da21] Lock Acquired to object EngineLock [exclusiveLocks= key:
e17a740c-47ba-4a81-99ab-6c386b572f26 value: VDS_FENCE<br>
, sharedLocks= ]<br>
2014-06-05 12:24:40,798 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(ajp--127.0.0.1-8702-4) Correlation ID: null, Call Stack: null,
Custom Event ID: -1, Message: Host node3.test.now from cluster
Default was chosen as a proxy to execute Status command on Host
node1.test.now.<br>
2014-06-05 12:24:40,799 INFO
[org.ovirt.engine.core.bll.FenceExecutor] (ajp--127.0.0.1-8702-4)
Using Host node3.test.now from cluster Default as proxy to execute
Status command on Host node1.test.now<br>
2014-06-05 12:24:40,800 WARN
[org.ovirt.engine.core.bll.StopVdsCommand] (ajp--127.0.0.1-8702-4)
CanDoAction of action StopVds failed.
Reasons:VAR__ACTION__STOP,VDS_STATUS_NOT_VALID_FOR_STOP<br>
2014-06-05 12:24:40,811 INFO
[org.ovirt.engine.core.bll.StopVdsCommand] (ajp--127.0.0.1-8702-4)
Lock freed to object EngineLock [exclusiveLocks= key:
e17a740c-47ba-4a81-99ab-6c386b572f26 value: VDS_FENCE<br>
, sharedLocks= ]<br>
2014-06-05 12:25:00,021 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-27) [70486be2] Autorecovering 1
storage domains<br>
2014-06-05 12:25:00,021 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-27) [70486be2] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:25:00,022 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:25:00,023 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101]
ConnectDomainToStorage. Before Connect all hosts to pool.
Time:6/5/14 12:25 PM<br>
2014-06-05 12:25:00,088 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:25:00,093 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:25:00,100 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:25:00,196 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 4591e85a<br>
2014-06-05 12:25:00,197 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 17ef6900<br>
2014-06-05 12:25:00,197 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: db1af1a<br>
2014-06-05 12:25:00,207 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-2) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:25:19,989 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-42) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:26:00,247 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-47) [683e82b0] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: db1af1a<br>
2014-06-05 12:26:00,256 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-6) [362a5fe3] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 4591e85a<br>
2014-06-05 12:26:00,263 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-14) [6ed26267] FINISH,
ConnectStorageServerVDSCommand, return:
{f47e9f99-9989-4297-a84f-4f75338eeced=0}, log id: 17ef6900<br>
2014-06-05 12:26:00,265 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-27) [70d58101]
ConnectDomainToStorage. After Connect all hosts to pool. Time:6/5/14
12:26 PM<br>
2014-06-05 12:28:15,399 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-17) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:28:18,443 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-8) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:28:55,121 WARN
[org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil]
(org.ovirt.thread.pool-6-thread-45) Executing a command:
java.util.concurrent.FutureTask , but note that there are 1 tasks in
the queue.<br>
2014-06-05 12:29:04,927 INFO
[org.ovirt.engine.core.bll.MaintenanceNumberOfVdssCommand]
(DefaultQuartzScheduler_Worker-51) [51b730a8] Running command:
MaintenanceNumberOfVdssCommand internal: true. Entities affected :
ID: e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:29:05,029 INFO
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-51) [51b730a8] Running command:
MaintenanceVdsCommand internal: true. Entities affected : ID:
e17a740c-47ba-4a81-99ab-6c386b572f26 Type: VDS<br>
2014-06-05 12:29:05,073 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] Lock Acquired to
object EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:29:05,164 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node2.test.now (28fdcb5d-7acd-410e-8b65-0b4f483cb575) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:29:05,167 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node3.test.now (7415506c-cda7-4018-804d-5f6d3beddbfb) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:29:05,170 INFO
[org.ovirt.engine.core.bll.scheduling.SchedulingManager]
(DefaultQuartzScheduler_Worker-51) [63736865] Candidate host
node4.test.now (bb13752e-85cb-4945-822b-48ab2a7b1329) was filtered
out by VAR__FILTERTYPE__INTERNAL filter Memory<br>
2014-06-05 12:29:05,173 WARN
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] CanDoAction of action
InternalMigrateVm failed.
Reasons:VAR__ACTION__MIGRATE,VAR__TYPE__VM,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,ACTION_TYPE_FAILED_VDS_VM_MEMORY,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
node2.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node4.test.now,$filterName
Memory,SCHEDULING_HOST_FILTERED_REASON,VAR__FILTERTYPE__INTERNAL,$hostName
node3.test.now,$filterName Memory,SCHEDULING_HOST_FILTERED_REASON<br>
2014-06-05 12:29:05,181 INFO
[org.ovirt.engine.core.bll.InternalMigrateVmCommand]
(DefaultQuartzScheduler_Worker-51) [63736865] Lock freed to object
EngineLock [exclusiveLocks= key:
63b1fd02-1fb5-44d1-b1cc-54cb5c0fdb0e value: VM<br>
, sharedLocks= ]<br>
2014-06-05 12:29:05,207 ERROR
[org.ovirt.engine.core.bll.MaintenanceVdsCommand]
(DefaultQuartzScheduler_Worker-51) [63736865]
ResourceManager::vdsMaintenance - Failed migrating desktop Bench1<br>
2014-06-05 12:29:05,218 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(DefaultQuartzScheduler_Worker-51) [63736865] Correlation ID:
51b730a8, Job ID: b8441378-051c-4af0-8802-e2efb9e5d4a4, Call Stack:
null, Custom Event ID: -1, Message: Failed to switch Host
node1.test.now to Maintenance mode.<br>
2014-06-05 12:30:00,018 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-52) [2e3993e6] Autorecovering 1
storage domains<br>
2014-06-05 12:30:00,018 INFO
[org.ovirt.engine.core.bll.AutoRecoveryManager]
(DefaultQuartzScheduler_Worker-52) [2e3993e6] Autorecovering storage
domains id: 9923a5a1-61e0-4edb-a04d-22c962190c53<br>
2014-06-05 12:30:00,021 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-52) [1585ee27] Running command:
ConnectDomainToStorageCommand internal: true. Entities affected :
ID: 9923a5a1-61e0-4edb-a04d-22c962190c53 Type: Storage<br>
2014-06-05 12:30:00,024 INFO
[org.ovirt.engine.core.bll.storage.ConnectDomainToStorageCommand]
(DefaultQuartzScheduler_Worker-52) [1585ee27]
ConnectDomainToStorage. Before Connect all hosts to pool.
Time:6/5/14 12:30 PM<br>
2014-06-05 12:30:00,119 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-17) [6c0a8af9] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:30:00,121 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-26) [10056f73] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:30:00,134 INFO
[org.ovirt.engine.core.bll.storage.ConnectStorageToVdsCommand]
(org.ovirt.thread.pool-6-thread-24) [750ef0f4] Running command:
ConnectStorageToVdsCommand internal: true. Entities affected : ID:
aaa00000-0000-0000-0000-123456789aaa Type: System<br>
2014-06-05 12:30:00,248 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-17) [6c0a8af9] START,
ConnectStorageServerVDSCommand(HostName = node4.test.now, HostId =
bb13752e-85cb-4945-822b-48ab2a7b1329, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 1c35901d<br>
2014-06-05 12:30:00,250 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-24) [750ef0f4] START,
ConnectStorageServerVDSCommand(HostName = node2.test.now, HostId =
28fdcb5d-7acd-410e-8b65-0b4f483cb575, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 1a4d78a1<br>
2014-06-05 12:30:00,250 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand]
(org.ovirt.thread.pool-6-thread-26) [10056f73] START,
ConnectStorageServerVDSCommand(HostName = node3.test.now, HostId =
7415506c-cda7-4018-804d-5f6d3beddbfb, storagePoolId =
00000000-0000-0000-0000-000000000000, storageType = NFS,
connectionList = [{ id: f47e9f99-9989-4297-a84f-4f75338eeced,
connection: 10.100.100.105:/var/lib/exports/iso, iqn: null, vfsType:
null, mountOptions: null, nfsVersion: null, nfsRetrans: null,
nfsTimeo: null };]), log id: 60464b56<br>
</body>
</html>
Re: [ovirt-users] [ ERROR ] Failed to execute stage 'Closing up': Command '/bin/systemctl' failed to execute
by Gianluca Cecchi
On Fri, Jun 13, 2014 at 12:23 AM, Todd <tdsan(a)yahoo.com> wrote:
> One thing I did, I added an additional 4 to make it 8GB to the machine.
>
>
>
> And it still does the same thing, but it is saying there is a problem with
> the systemctl “Closing up” error.
>
>
>
> How is that related to the amount of memory? At present, I just want to
> bring up the web interface and get it installed.
>
>
>
> Todd
>
>
>
Keep replies on list, so that others can help too if necessary.
I think the init service should have been installed anyway.
What do you get from the command
# systemctl status ovirt-engine
for example on my system I get this
[g.cecchi@tekkaman ~]$ sudo systemctl status ovirt-engine
ovirt-engine.service - oVirt Engine
Loaded: loaded (/usr/lib/systemd/system/ovirt-engine.service; enabled)
Active: active (running) since Fri 2014-06-13 00:24:55 CEST; 2min 49s ago
Main PID: 1899 (ovirt-engine.py)
CGroup: name=systemd:/system/ovirt-engine.service
├─1899 /usr/bin/python
/usr/share/ovirt-engine/services/ovirt-engine/ovirt-engine.py
--redirect-output --system...
└─2370 ovirt-engine -server -XX:+TieredCompilation -Xms1g -Xmx1g
-XX:PermSize=256m -XX:MaxPermSize=256m -Djava....
Jun 13 00:24:54 tekkaman.localdomain.local systemd[1]: Starting oVirt
Engine...
Jun 13 00:24:55 tekkaman.localdomain.local systemd[1]: Started oVirt Engine.
Also run the command
# journalctl -r -a -u ovirt-engine
For example, I had a failure in November, and at a certain point the command
gives:
Nov 24 20:27:29 tekkaman.localdomain.local systemd[1]: Unit
ovirt-engine.service entered failed state.
Nov 24 20:27:29 tekkaman.localdomain.local systemd[1]: Failed to start
oVirt Engine.
Nov 24 20:27:29 tekkaman.localdomain.local systemd[1]:
ovirt-engine.service: control process exited, code=exited status=1
Nov 24 20:27:29 tekkaman.localdomain.local engine-service[2598]: Starting
engine-service: [FAILED]
Nov 24 20:27:29 tekkaman.localdomain.local engine-service[2598]: The
directory "/usr/lib/jvm/jre-1.7.0-openjdk.x86_64" doesn'
Nov 24 20:27:29 tekkaman.localdomain.local systemd[1]: Starting oVirt
Engine...
so your output should contain something useful.
Also, did you run
engine-cleanup
before running engine-setup again after adding memory? It should help too.
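If it helps, the whole recovery sequence would look roughly like this (a sketch
only — the exact prompts and paths vary by oVirt version):

```shell
# Rough recovery sequence (assumes oVirt 3.x tooling; prompts vary by version)
engine-cleanup                     # remove the partial configuration left by the failed run
engine-setup                       # re-run the installer from a clean state
systemctl status ovirt-engine      # confirm the service is active (running)
journalctl -r -a -u ovirt-engine   # if it is not, read the newest log entries first
```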
Gianluca
problem engine-manage-domains add ldap domain
by lucas castro
I'm trying to add an LDAP domain to ovirt-engine,
but I'm running into problems with it.
I have attached three files: the engine-manage-domains log,
the krb5 config generated for testing,
and the tcpdump of port 53 from my DNS server.
Can anybody help me figure out what is happening?
--
contacts:
Mobile: (99) 9143-5954 - Vivo
skype: lucasd3castro
msn: lucascastroborges(a)hotmail.com