Hello,
I upgraded oVirt to 3.3, and when I try to launch the SPICE console I get error 500 in the web UI. I tried installing the websocket proxy package and enabled it with "engine-config -s WebSocketProxy="Engine:6100"", but it did not help. I don't know what I did wrong. Logs are below. Thank you for any advice.
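Not a fix, but one quick sanity check of the setting described above: verify that something is actually listening on the configured websocket-proxy port before blaming the engine config. A minimal sketch (the helper name is mine, and the host/port are just the values from the WebSocketProxy setting, not anything oVirt-specific):

```python
import socket

def proxy_listening(host, port, timeout=3.0):
    """Return True if a TCP listener answers on (host, port)."""
    try:
        # create_connection resolves the host and completes the TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, DNS failures
        return False

# e.g. proxy_listening("Engine", 6100)  -> should be True if the proxy runs
```

If this returns False for the host/port you configured, the proxy service itself is not running (or a firewall is in the way), independent of the error 500.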
==> ovirt-engine/engine.log <==
2013-07-25 15:22:31,717 INFO [org.ovirt.engine.core.bll.SetVmTicketCommand] (ajp--127.0.0.1-8702-8) Running command: SetVmTicketCommand internal: false. Entities affected : ID: 8c6ea349-902f-4457-8c3f-49ee5d4cf6b9 Type: VM
2013-07-25 15:22:31,726 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SetVmTicketVDSCommand] (ajp--127.0.0.1-8702-8) START, SetVmTicketVDSCommand(HostName = node19.nbu.cz, HostId = dc1fde46-66af-4f66-947a-12791cd6b9a0, vmId=8c6ea349-902f-4457-8c3f-49ee5d4cf6b9, ticket=7LP81DPq1CpR, validTime=120,m userName=admin@internal, userId=fdfc627c-d875-11e0-90f0-83df133b58cc), log id: 1b7e23bf
2013-07-25 15:22:31,764 INFO [org.ovirt.engine.core.vdsbroker.vdsbroker.SetVmTicketVDSCommand] (ajp--127.0.0.1-8702-8) FINISH, SetVmTicketVDSCommand, log id: 1b7e23bf
==> ovirt-engine/server.log <==
2013-07-25 15:22:31,865 ERROR [org.apache.catalina.core.ContainerBase.[jboss.web].[default-host].[/webadmin]] (ajp--127.0.0.1-8702-8) Exception while dispatching incoming RPC call: java.lang.NullPointerException
    at com.google.gwt.rpc.server.WebModePayloadSink.getBytes(WebModePayloadSink.java:860) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.push(WebModePayloadSink.java:767) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.constructorFunction(WebModePayloadSink.java:636) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:259) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InstantiateCommand.traverse(InstantiateCommand.java:54) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:236) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.ArrayValueCommand.traverse(ArrayValueCommand.java:53) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:291) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InstantiateCommand.traverse(InstantiateCommand.java:54) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:375) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InvokeCustomFieldSerializerCommand.traverse(InvokeCustomFieldSerializerCommand.java:76) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:236) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.ArrayValueCommand.traverse(ArrayValueCommand.java:53) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:291) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InstantiateCommand.traverse(InstantiateCommand.java:54) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:362) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InvokeCustomFieldSerializerCommand.traverse(InvokeCustomFieldSerializerCommand.java:76) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:236) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.ArrayValueCommand.traverse(ArrayValueCommand.java:53) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:291) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InstantiateCommand.traverse(InstantiateCommand.java:54) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:375) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.InvokeCustomFieldSerializerCommand.traverse(InvokeCustomFieldSerializerCommand.java:76) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink$PayloadVisitor.visit(WebModePayloadSink.java:406) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.ReturnCommand.traverse(ReturnCommand.java:44) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.doAccept(RpcCommandVisitor.java:320) [gwt-servlet.jar:]
    at com.google.gwt.rpc.client.ast.RpcCommandVisitor.accept(RpcCommandVisitor.java:42) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.WebModePayloadSink.accept(WebModePayloadSink.java:890) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.RPC.streamResponse(RPC.java:472) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.RPC.invokeAndStreamResponse(RPC.java:198) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.RpcServlet.processCall(RpcServlet.java:172) [gwt-servlet.jar:]
    at com.google.gwt.rpc.server.RpcServlet.processPost(RpcServlet.java:233) [gwt-servlet.jar:]
    at com.google.gwt.user.server.rpc.AbstractRemoteServiceServlet.doPost(AbstractRemoteServiceServlet.java:62) [gwt-servlet.jar:]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:754) [jboss-servlet-3.0-api.jar:1.0.1.Final]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:847) [jboss-servlet-3.0-api.jar:1.0.1.Final]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:329) [jboss-web.jar:]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:248) [jboss-web.jar:]
    at org.ovirt.engine.ui.frontend.server.gwt.GwtCachingFilter.doFilter(GwtCachingFilter.java:132) [frontend.jar:]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:280) [jboss-web.jar:]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:248) [jboss-web.jar:]
    at org.ovirt.engine.core.utils.servlet.LocaleFilter.doFilter(LocaleFilter.java:59) [utils.jar:]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:280) [jboss-web.jar:]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:248) [jboss-web.jar:]
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:275) [jboss-web.jar:]
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:161) [jboss-web.jar:]
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:489) [jboss-web.jar:]
    at org.jboss.as.web.security.SecurityContextAssociationValve.invoke(SecurityContextAssociationValve.java:153) [jboss-as-web-7.1.1.Final.jar:7.1.1.Final]
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:155) [jboss-web.jar:]
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) [jboss-web.jar:]
    at org.jboss.web.rewrite.RewriteValve.invoke(RewriteValve.java:466) [jboss-web.jar:]
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) [jboss-web.jar:]
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:368) [jboss-web.jar:]
    at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:505) [jboss-web.jar:]
    at org.apache.coyote.ajp.AjpProtocol$AjpConnectionHandler.process(AjpProtocol.java:445) [jboss-web.jar:]
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:930) [jboss-web.jar:]
    at java.lang.Thread.run(Thread.java:724) [rt.jar:1.7.0_25]
After the download and the test transaction, the transaction itself fails. I think it is because of the language (a Czech locale).
2013-07-25 14:17:32 INFO otopi.plugins.otopi.packagers.yumpackager yumpackager.info:88 Yum Status: Check Package Signatures
2013-07-25 14:17:32 INFO otopi.plugins.otopi.packagers.yumpackager yumpackager.info:88 Yum Status: Running Test Transaction
Spuštěna kontrola transakce  [Czech: "Running transaction check"]
2013-07-25 14:17:32 INFO otopi.plugins.otopi.packagers.yumpackager yumpackager.info:88 Yum Status: Running Transaction
2013-07-25 14:17:32 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum Varování: RPMDB byla změněna mimo yum.  [Czech: "Warning: the RPMDB was modified outside of yum."]
2013-07-25 14:17:33 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum ** Nalezeny 4 existující problémy v rpmdb, následuje výstup "yum check":  [Czech: "Found 4 existing problems in the rpmdb; the output of 'yum check' follows:"]
2013-07-25 14:17:33 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum ovirt-host-deploy-1.1.0-0.2.master.20130723.gita991545.fc19.noarch je duplicitní s ovirt-host-deploy-1.0.2-1.fc18.noarch  [Czech: "je duplicitní s" = "is a duplicate of"]
2013-07-25 14:17:33 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum ovirt-host-deploy-java-1.1.0-0.2.master.20130723.gita991545.fc19.noarch je duplicitní s ovirt-host-deploy-java-1.0.2-1.fc18.noarch
2013-07-25 14:17:33 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum ovirt-iso-uploader-3.3.0-0.1.beta1.fc19.noarch je duplicitní s ovirt-iso-uploader-3.2.2-1.fc18.noarch
2013-07-25 14:17:33 DEBUG otopi.plugins.otopi.packagers.yumpackager yumpackager.verbose:84 Yum ovirt-log-collector-3.3.0-0.1.beta1.fc19.noarch je duplicitní s ovirt-log-collector-3.2.2-1.fc18.noarch
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/yum/rpmtrans.py", line 458, in callback
    self._instProgress( bytes, total, h )
  File "/usr/lib/python2.7/site-packages/yum/rpmtrans.py", line 541, in _instProgress
    self.complete_actions, self.total_actions)
  File "/usr/lib/python2.7/site-packages/otopi/miniyum.py", line 184, in event
    package=package
UnicodeEncodeError: 'ascii' codec can't encode character u'\xe1' in position 8: ordinal not in range(128)
FATAL ERROR: python callback <bound method RPMTransaction.callback of <yum.rpmtrans.RPMTransaction instance at 0x4687f80>> failed, aborting!
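For what it's worth, the UnicodeEncodeError above is the classic Python 2 failure mode: non-ASCII localized yum output is pushed through an implicit ASCII encode. A minimal sketch of the mechanism (the sample word is illustrative, not taken from the failing transaction; the suggested workaround is an assumption, not a confirmed fix):

```python
# yum's progress callback formatted a localized message (like the Czech
# "... je duplicitní s ..." lines above) into an ASCII-only byte string;
# on Python 2 that forces an implicit ascii encode, which any non-ASCII
# character cannot survive.
word = 'duplicitn\u00ed'  # "duplicitní", contains í (U+00ED)

try:
    word.encode('ascii')  # the step Python 2 performed implicitly
except UnicodeEncodeError as exc:
    # e.g. "'ascii' codec can't encode character '\xed' in position 9 ..."
    print(exc)

# A plausible workaround for this class of bug is to run the installer
# under an ASCII/English locale (e.g. with LANG=C), so yum never emits
# non-ASCII messages in the first place.
```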
Hi.
I have a few questions about data domains...
I'm not sure that I understand what the "Use Host" field is for when adding
a new NFS data domain.
From the RHEV documentation - "All communication to the storage domain
is from the selected host and not directly from the Red Hat Enterprise
Virtualization Manager. At least one active host must be attached to the
chosen Data Center before the storage is configured. "
.. but I'm puzzled: don't all the nodes mount the NFS storage directly
from the NFS storage server?
Is this saying that if I have two nodes, v1 and v2, and I say "Use Host"
v1 then v2 gets at storage through v1? What if v1 is down?
Don't all nodes need a connection to the "logical" storage network?
---
On the topic of local storage...
Right now, I have one node with 1 disk (until some ordered equipment
arrives)...
/data/images is /dev/mapper/HostVG-Data
I want two of my nodes to store local data. The majority of VMs will
use the NFS datastore, but a few VMs need local storage, and I'd like to
split these VMs across two nodes, so two nodes will have their own local
storage...
If I was going to install local data on the node, I wouldn't install it
on the OS disk - I'd want another disk, or maybe even a few disks! If
I added another disk to this system, how would I go about making *this*
disk "/data/images" instead of the root disk? Do I have to reinstall the
node?
I'm also puzzled by this statement: "A local storage domain can be set
up on a host. When you set up host to use local storage, the host
automatically gets added to a new data center and cluster that no other
hosts can be added to. Multiple host clusters require that all hosts
have access to all storage domains, which is not possible with local
storage. Virtual machines created in a single host cluster cannot be
migrated, fenced or scheduled. "
So .. let's say I have two nodes, both of which have some local disk and
use the NFS data store. I can see why I wouldn't be able to migrate a VM
from one node to the other IF that VM was using local data storage. On the
other hand, if it's a VM that is NOT using local storage, and everything is
in the NFS datastore, does this mean I still can't migrate it, because each
host would have to be in its own cluster merely for having local storage
for *some* of the VMs!?
Finally - I had previously asked about using MD RAID1 redundancy on the
root drive, which isn't available yet on the node. Are there any options
for creating redundant local storage using MD RAID1, or is it the same --
no redundancy on local storage unless you're using a RAID card whose
driver has been integrated into the node?
Jason.
Is it now possible to install an oVirt node without access to the Internet?
Is there an option for this?
Hello,
we have one node based on CentOS 6.4 using the ovirt-stable channel, and it
cannot start a VM or accept a migration. (May I try updating to the oVirt
3.3 beta?)
Here is the log:
Thread-2973::DEBUG::2013-07-24
18:37:20,415::task::568::TaskManager.Task::(_updateState)
Task=`9b957784-995f-48bd-b181-a23b991787e1`::moving from state init ->
state preparing
Thread-2973::INFO::2013-07-24
18:37:20,416::logUtils::41::dispatcher::(wrapper) Run and protect:
repoStats(options=None)
Thread-2973::INFO::2013-07-24
18:37:20,416::logUtils::44::dispatcher::(wrapper) Run and protect:
repoStats, Return response: {u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec':
{'delay': '0.0137450695038', 'lastCheck': '7.3', 'code': 0, 'valid':
True}, u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay':
'0.00467395782471', 'lastCheck': '5.4', 'code': 0, 'valid': True}}
Thread-2973::DEBUG::2013-07-24
18:37:20,416::task::1151::TaskManager.Task::(prepare)
Task=`9b957784-995f-48bd-b181-a23b991787e1`::finished:
{u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec': {'delay': '0.0137450695038',
'lastCheck': '7.3', 'code': 0, 'valid': True},
u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay': '0.00467395782471',
'lastCheck': '5.4', 'code': 0, 'valid': True}}
Thread-2973::DEBUG::2013-07-24
18:37:20,416::task::568::TaskManager.Task::(_updateState)
Task=`9b957784-995f-48bd-b181-a23b991787e1`::moving from state preparing
-> state finished
Thread-2973::DEBUG::2013-07-24
18:37:20,417::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}
Thread-2973::DEBUG::2013-07-24
18:37:20,417::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2973::DEBUG::2013-07-24
18:37:20,417::task::957::TaskManager.Task::(_decref)
Task=`9b957784-995f-48bd-b181-a23b991787e1`::ref 0 aborting False
Thread-68::DEBUG::2013-07-24
18:37:23,123::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd
iflag=direct if=/dev/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/metadata
bs=4096 count=1' (cwd None)
Thread-68::DEBUG::2013-07-24
18:37:23,135::misc::84::Storage.Misc.excCmd::(<lambda>) SUCCESS: <err> =
'1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied,
0.000485541 s, 8.4 MB/s\n'; <rc> = 0
Thread-2979::DEBUG::2013-07-24
18:37:30,600::task::568::TaskManager.Task::(_updateState)
Task=`12b753a3-ca6e-48e2-9888-2e9c71f0c490`::moving from state init ->
state preparing
Thread-2979::INFO::2013-07-24
18:37:30,601::logUtils::41::dispatcher::(wrapper) Run and protect:
repoStats(options=None)
Thread-2979::INFO::2013-07-24
18:37:30,601::logUtils::44::dispatcher::(wrapper) Run and protect:
repoStats, Return response: {u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec':
{'delay': '0.0136978626251', 'lastCheck': '7.5', 'code': 0, 'valid':
True}, u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay':
'0.00459504127502', 'lastCheck': '5.6', 'code': 0, 'valid': True}}
Thread-2979::DEBUG::2013-07-24
18:37:30,601::task::1151::TaskManager.Task::(prepare)
Task=`12b753a3-ca6e-48e2-9888-2e9c71f0c490`::finished:
{u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec': {'delay': '0.0136978626251',
'lastCheck': '7.5', 'code': 0, 'valid': True},
u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay': '0.00459504127502',
'lastCheck': '5.6', 'code': 0, 'valid': True}}
Thread-2979::DEBUG::2013-07-24
18:37:30,601::task::568::TaskManager.Task::(_updateState)
Task=`12b753a3-ca6e-48e2-9888-2e9c71f0c490`::moving from state preparing
-> state finished
Thread-2979::DEBUG::2013-07-24
18:37:30,601::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}
Thread-2979::DEBUG::2013-07-24
18:37:30,601::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2979::DEBUG::2013-07-24
18:37:30,602::task::957::TaskManager.Task::(_decref)
Task=`12b753a3-ca6e-48e2-9888-2e9c71f0c490`::ref 0 aborting False
Thread-68::DEBUG::2013-07-24
18:37:33,140::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd
iflag=direct if=/dev/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/metadata
bs=4096 count=1' (cwd None)
Thread-68::DEBUG::2013-07-24
18:37:33,151::misc::84::Storage.Misc.excCmd::(<lambda>) SUCCESS: <err> =
'1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.0004116
s, 10.0 MB/s\n'; <rc> = 0
Thread-2983::DEBUG::2013-07-24
18:37:35,734::BindingXMLRPC::913::vds::(wrapper) client
[192.168.3.207]::call vmGetStats with
('03ac5be8-75fc-43df-9fb8-c8e8af30ae84',) {}
Thread-2983::DEBUG::2013-07-24
18:37:35,734::BindingXMLRPC::920::vds::(wrapper) return vmGetStats with
{'status': {'message': 'Virtual machine does not exist', 'code': 1}}
Thread-2984::DEBUG::2013-07-24
18:37:35,880::BindingXMLRPC::913::vds::(wrapper) client
[192.168.3.207]::call vmMigrationCreate with ({'username': 'Unknown',
'acpiEnable': 'true', 'emulatedMachine': 'pc-0.14',
'afterMigrationStatus': 'Up', 'vmId':
'03ac5be8-75fc-43df-9fb8-c8e8af30ae84', 'transparentHugePages': 'true',
'displaySecurePort': '5913', 'timeOffset': '-43200', 'cpuType':
'Opteron_G3', 'custom': {'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8aca':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=d7b86bcd-13fe-4259-b7a1-0e6243bf8aca, device=ide,
type=controller, bootOrder=0, specParams={}, address={bus=0x00,
domain=0x0000, type=pci, slot=0x01, function=0x1}, managed=false,
plugged=true, readOnly=false, deviceAlias=ide0}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568device_e9769f4d-b137-4560-8eed-230655922bfadevice_4b6fd867-b4f3-47f6-b6ef-1f3cd0ed9c34':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=4b6fd867-b4f3-47f6-b6ef-1f3cd0ed9c34, device=spicevmc,
type=channel, bootOrder=0, specParams={}, address={port=3, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel2}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568device_e9769f4d-b137-4560-8eed-230655922bfa':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=e9769f4d-b137-4560-8eed-230655922bfa, device=unix,
type=channel, bootOrder=0, specParams={}, address={port=2, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel1}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=f68176fc-0731-4d98-9f35-b31140dcf568, device=unix,
type=channel, bootOrder=0, specParams={}, address={port=1, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel0}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=d15dafa0-a1ae-4637-bd9b-6708a31f7e41, device=virtio-serial,
type=controller, bootOrder=0, specParams={}, address={bus=0x00,
domain=0x0000, type=pci, slot=0x04, function=0x0}, managed=false,
plugged=true, readOnly=false, deviceAlias=virtio-serial0}'},
'migrationDest': 'libvirt', 'smp': '2', 'vmType': 'kvm',
'spiceSslCipherSuite': 'DEFAULT', '_srcDomXML': "<domain type='kvm'
id='18'>\n <name>ipa2.nbu.cz</name>\n
<uuid>03ac5be8-75fc-43df-9fb8-c8e8af30ae84</uuid>\n <memory
unit='KiB'>2097152</memory>\n <currentMemory
unit='KiB'>2097152</currentMemory>\n <vcpu
placement='static'>2</vcpu>\n <cputune>\n <shares>1020</shares>\n
</cputune>\n <sysinfo type='smbios'>\n <system>\n <entry
name='manufacturer'>oVirt</entry>\n <entry name='product'>oVirt
Node</entry>\n <entry
name='version'>2.6.1-20120228.fc18</entry>\n <entry
name='serial'>35373031-3032-435A-4339-343331444B46</entry>\n <entry
name='uuid'>03ac5be8-75fc-43df-9fb8-c8e8af30ae84</entry>\n </system>\n
</sysinfo>\n <os>\n <type arch='x86_64'
machine='pc-0.14'>hvm</type>\n <boot dev='hd'/>\n <smbios
mode='sysinfo'/>\n </os>\n <features>\n <acpi/>\n </features>\n
<cpu mode='custom' match='exact'>\n <model
fallback='allow'>Opteron_G3</model>\n <topology sockets='2' cores='1'
threads='1'/>\n </cpu>\n <clock offset='variable' adjustment='-43200'
basis='utc'>\n <timer name='rtc' tickpolicy='catchup'/>\n </clock>\n
<on_poweroff>destroy</on_poweroff>\n <on_reboot>restart</on_reboot>\n
<on_crash>destroy</on_crash>\n <devices>\n
<emulator>/usr/bin/qemu-kvm</emulator>\n <disk type='file'
device='cdrom'>\n <driver name='qemu' type='raw'/>\n <source
startupPolicy='optional'/>\n <target dev='hdc' bus='ide'/>\n
<readonly/>\n <serial></serial>\n <alias name='ide0-1-0'/>\n
<address type='drive' controller='0' bus='1' target='0' unit='0'/>\n
</disk>\n <disk type='block' device='disk' snapshot='no'>\n
<driver name='qemu' type='qcow2' cache='none' error_policy='stop'
io='native'/>\n <source
dev='/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983'/>\n
<target dev='vda' bus='virtio'/>\n
<serial>975f7398-866e-43f6-8579-1552be81519d</serial>\n <alias
name='virtio-disk0'/>\n <address type='pci' domain='0x0000'
bus='0x00' slot='0x05' function='0x0'/>\n </disk>\n <controller
type='usb' index='0'>\n <alias name='usb0'/>\n <address type='pci'
domain='0x0000' bus='0x00' slot='0x01' function='0x2'/>\n
</controller>\n <controller type='ide' index='0'>\n <alias
name='ide0'/>\n <address type='pci' domain='0x0000' bus='0x00'
slot='0x01' function='0x1'/>\n </controller>\n <controller
type='virtio-serial' index='0'>\n <alias
name='virtio-serial0'/>\n <address type='pci' domain='0x0000'
bus='0x00' slot='0x04' function='0x0'/>\n </controller>\n <interface
type='bridge'>\n <mac address='00:1a:4a:a8:03:9e'/>\n <source
bridge='ovirtmgmt'/>\n <target dev='vnet7'/>\n <model
type='virtio'/>\n <filterref filter='vdsm-no-mac-spoofing'/>\n
<link state='up'/>\n <alias name='net0'/>\n <address
type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0'/>\n
</interface>\n <channel type='unix'>\n <source mode='bind'
path='/var/lib/libvirt/qemu/channels/ipa2.nbu.cz.com.redhat.rhevm.vdsm'/>\n
<target type='virtio' name='com.redhat.rhevm.vdsm'/>\n <alias
name='channel0'/>\n <address type='virtio-serial' controller='0'
bus='0' port='1'/>\n </channel>\n <channel type='unix'>\n
<source mode='bind'
path='/var/lib/libvirt/qemu/channels/ipa2.nbu.cz.org.qemu.guest_agent.0'/>\n
<target type='virtio' name='org.qemu.guest_agent.0'/>\n <alias
name='channel1'/>\n <address type='virtio-serial' controller='0'
bus='0' port='2'/>\n </channel>\n <channel type='spicevmc'>\n <target
type='virtio' name='com.redhat.spice.0'/>\n <alias
name='channel2'/>\n <address type='virtio-serial' controller='0'
bus='0' port='3'/>\n </channel>\n <input type='mouse' bus='ps2'/>\n
<graphics type='spice' port='5912' tlsPort='5913' autoport='yes'
listen='0' keymap='en-us' passwdValidTo='1970-01-01T00:00:01'>\n
<listen type='address' address='0'/>\n <channel name='main'
mode='secure'/>\n <channel name='display' mode='secure'/>\n
<channel name='inputs' mode='secure'/>\n <channel name='cursor'
mode='secure'/>\n <channel name='playback' mode='secure'/>\n
<channel name='record' mode='secure'/>\n <channel name='smartcard'
mode='secure'/>\n <channel name='usbredir' mode='secure'/>\n
</graphics>\n <video>\n <model type='qxl' vram='65536'
heads='1'/>\n <alias name='video0'/>\n <address type='pci'
domain='0x0000' bus='0x00' slot='0x02' function='0x0'/>\n </video>\n
<memballoon model='virtio'>\n <alias name='balloon0'/>\n
<address type='pci' domain='0x0000' bus='0x00' slot='0x06'
function='0x0'/>\n </memballoon>\n </devices>\n <seclabel
type='dynamic' model='selinux' relabel='yes'>\n
<label>system_u:system_r:svirt_t:s0:c333,c972</label>\n
<imagelabel>system_u:object_r:svirt_image_t:s0:c333,c972</imagelabel>\n
</seclabel>\n</domain>\n", 'memSize': 2048, 'elapsedTimeOffset':
23573.797423124313, 'vmName': 'ipa2.nbu.cz', 'nice': '0', 'status':
'Up', 'clientIp': '', 'displayIp': '0', 'displayPort': '5912',
'smpCoresPerSocket': '1', 'smartcardEnable': 'false', 'guestIPs': '',
'nicModel': 'rtl8139,pv', 'keyboardLayout': 'en-us', 'kvmEnable':
'true', 'pitReinjection': 'false', 'devices': [{'specParams': {'vram':
'65536'}, 'alias': 'video0', 'deviceId':
'a74682a1-e49b-4c3c-a65a-679036bbd6f6', 'address': {'slot': '0x02',
'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function': '0x0'},
'device': 'qxl', 'type': 'video'}, {'nicModel': 'pv', 'macAddr':
'00:1a:4a:a8:03:9e', 'linkActive': True, 'network': 'ovirtmgmt',
'specParams': {}, 'filter': 'vdsm-no-mac-spoofing', 'alias': 'net0',
'deviceId': '405d2d7f-cbe4-4a8c-aeaa-b6d11c0739fd', 'address': {'slot':
'0x03', 'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function':
'0x0'}, 'device': 'bridge', 'type': 'interface', 'name': 'vnet7'},
{'index': '2', 'iface': 'ide', 'name': 'hdc', 'alias': 'ide0-1-0',
'shared': 'false', 'specParams': {'path': ''}, 'readonly': 'True',
'deviceId': 'f3399625-142a-44ed-897d-01b2fad56a89', 'address': {'bus':
'1', 'controller': '0', 'type': 'drive', 'target': '0', 'unit': '0'},
'device': 'cdrom', 'path': '', 'type': 'disk'}, {'address': {'slot':
'0x05', 'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function':
'0x0'}, 'index': 0, 'iface': 'virtio', 'apparentsize': '7516192768',
'alias': 'virtio-disk0', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d', 'readonly': 'False', 'shared':
'false', 'truesize': '7516192768', 'type': 'disk', 'domainID':
'0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'reqsize': '0', 'format': 'cow',
'deviceId': '975f7398-866e-43f6-8579-1552be81519d', 'poolID':
'5849b030-626e-47cb-ad90-3ce782d831b3', 'device': 'disk', 'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'propagateErrors': 'off', 'optional': 'false', 'name': 'vda',
'volumeID': 'ebce7305-2586-4c5d-bc02-08cc4743a983', 'specParams': {},
'volumeChain': [{'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'domainID': '0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'volumeID':
'ebce7305-2586-4c5d-bc02-08cc4743a983', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d'}]}, {'target': 2097152,
'specParams': {'model': 'virtio'}, 'alias': 'balloon0', 'deviceId':
'c25a298c-698f-41ff-8968-12e6ed9b14fe', 'address': {'slot': '0x06',
'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function': '0x0'},
'device': 'memballoon', 'type': 'balloon'}, {'device': 'usb', 'alias':
'usb0', 'type': 'controller', 'address': {'slot': '0x01', 'bus': '0x00',
'domain': '0x0000', 'type': 'pci', 'function': '0x2'}}, {'device':
'ide', 'alias': 'ide0', 'type': 'controller', 'address': {'slot':
'0x01', 'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function':
'0x1'}}, {'device': 'virtio-serial', 'alias': 'virtio-serial0', 'type':
'controller', 'address': {'slot': '0x04', 'bus': '0x00', 'domain':
'0x0000', 'type': 'pci', 'function': '0x0'}}, {'device': 'unix',
'alias': 'channel0', 'type': 'channel', 'address': {'bus': '0',
'controller': '0', 'type': 'virtio-serial', 'port': '1'}}, {'device':
'unix', 'alias': 'channel1', 'type': 'channel', 'address': {'bus': '0',
'controller': '0', 'type': 'virtio-serial', 'port': '2'}}, {'device':
'spicevmc', 'alias': 'channel2', 'type': 'channel', 'address': {'bus':
'0', 'controller': '0', 'type': 'virtio-serial', 'port': '3'}}],
'spiceSecureChannels':
'smain,sinputs,scursor,splayback,srecord,sdisplay,susbredir,ssmartcard',
'display': 'qxl'},) {}
Thread-2984::DEBUG::2013-07-24
18:37:35,880::API::489::vds::(migrationCreate) Migration create
Thread-2984::INFO::2013-07-24
18:37:35,880::API::626::vds::(_getNetworkIp) network None: using 0
Thread-2984::INFO::2013-07-24
18:37:35,881::clientIF::334::vds::(createVm) vmContainerLock acquired by
vm 03ac5be8-75fc-43df-9fb8-c8e8af30ae84
Thread-2984::DEBUG::2013-07-24
18:37:35,885::clientIF::348::vds::(createVm) Total desktops after
creation of 03ac5be8-75fc-43df-9fb8-c8e8af30ae84 is 1
Thread-2985::DEBUG::2013-07-24
18:37:35,885::vm::671::vm.Vm::(_startUnderlyingVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Start
Thread-2984::DEBUG::2013-07-24
18:37:35,886::libvirtvm::3107::vm.Vm::(waitForMigrationDestinationPrepare)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::migration destination:
waiting for VM creation
Thread-2985::DEBUG::2013-07-24
18:37:35,886::vm::675::vm.Vm::(_startUnderlyingVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::_ongoingCreations acquired
Thread-2985::INFO::2013-07-24
18:37:35,887::libvirtvm::1463::vm.Vm::(_run)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::VM wrapper has started
Thread-2984::DEBUG::2013-07-24
18:37:35,887::libvirtvm::3113::vm.Vm::(waitForMigrationDestinationPrepare)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::migration destination:
waiting 36s for path preparation
Thread-2985::WARNING::2013-07-24
18:37:35,887::vm::459::vm.Vm::(getConfDevices)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Unknown type found, device:
'{'device': 'unix', 'alias': 'channel0', 'type': 'channel', 'address':
{'bus': '0', 'controller': '0', 'type': 'virtio-serial', 'port': '1'}}'
found
Thread-2985::WARNING::2013-07-24
18:37:35,888::vm::459::vm.Vm::(getConfDevices)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Unknown type found, device:
'{'device': 'unix', 'alias': 'channel1', 'type': 'channel', 'address':
{'bus': '0', 'controller': '0', 'type': 'virtio-serial', 'port': '2'}}'
found
Thread-2985::WARNING::2013-07-24
18:37:35,888::vm::459::vm.Vm::(getConfDevices)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Unknown type found, device:
'{'device': 'spicevmc', 'alias': 'channel2', 'type': 'channel',
'address': {'bus': '0', 'controller': '0', 'type': 'virtio-serial',
'port': '3'}}' found
Thread-2985::DEBUG::2013-07-24
18:37:35,889::task::568::TaskManager.Task::(_updateState)
Task=`925d13e4-6995-4f8e-9359-6fdb1be2873f`::moving from state init ->
state preparing
Thread-2985::INFO::2013-07-24
18:37:35,889::logUtils::41::dispatcher::(wrapper) Run and protect:
getVolumeSize(sdUUID='0ee30f68-c222-44c0-85e6-2ae246f4c1ec',
spUUID='5849b030-626e-47cb-ad90-3ce782d831b3',
imgUUID='975f7398-866e-43f6-8579-1552be81519d',
volUUID='ebce7305-2586-4c5d-bc02-08cc4743a983', options=None)
Thread-2985::INFO::2013-07-24
18:37:35,889::logUtils::44::dispatcher::(wrapper) Run and protect:
getVolumeSize, Return response: {'truesize': '7516192768',
'apparentsize': '7516192768'}
Thread-2985::DEBUG::2013-07-24
18:37:35,889::task::1151::TaskManager.Task::(prepare)
Task=`925d13e4-6995-4f8e-9359-6fdb1be2873f`::finished: {'truesize':
'7516192768', 'apparentsize': '7516192768'}
Thread-2985::DEBUG::2013-07-24
18:37:35,890::task::568::TaskManager.Task::(_updateState)
Task=`925d13e4-6995-4f8e-9359-6fdb1be2873f`::moving from state preparing
-> state finished
Thread-2985::DEBUG::2013-07-24
18:37:35,890::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}
Thread-2985::DEBUG::2013-07-24
18:37:35,890::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2985::DEBUG::2013-07-24
18:37:35,890::task::957::TaskManager.Task::(_decref)
Task=`925d13e4-6995-4f8e-9359-6fdb1be2873f`::ref 0 aborting False
Thread-2985::INFO::2013-07-24
18:37:35,890::clientIF::316::vds::(prepareVolumePath) prepared volume path:
Thread-2985::DEBUG::2013-07-24
18:37:35,890::task::568::TaskManager.Task::(_updateState)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::moving from state init ->
state preparing
Thread-2985::INFO::2013-07-24
18:37:35,891::logUtils::41::dispatcher::(wrapper) Run and protect:
prepareImage(sdUUID='0ee30f68-c222-44c0-85e6-2ae246f4c1ec',
spUUID='5849b030-626e-47cb-ad90-3ce782d831b3',
imgUUID='975f7398-866e-43f6-8579-1552be81519d',
volUUID='ebce7305-2586-4c5d-bc02-08cc4743a983')
Thread-2985::DEBUG::2013-07-24
18:37:35,891::resourceManager::190::ResourceManager.Request::(__init__)
ResName=`Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec`ReqID=`39a62be2-0ecd-4be8-a5e9-00824ea4fa50`::Request
was made in '/usr/share/vdsm/storage/resourceManager.py' line '189' at
'__init__'
Thread-2985::DEBUG::2013-07-24
18:37:35,891::resourceManager::504::ResourceManager::(registerResource)
Trying to register resource
'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec' for lock type 'shared'
Thread-2985::DEBUG::2013-07-24
18:37:35,891::resourceManager::547::ResourceManager::(registerResource)
Resource 'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec' is free. Now
locking as 'shared' (1 active user)
Thread-2985::DEBUG::2013-07-24
18:37:35,892::resourceManager::227::ResourceManager.Request::(grant)
ResName=`Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec`ReqID=`39a62be2-0ecd-4be8-a5e9-00824ea4fa50`::Granted
request
Thread-2985::DEBUG::2013-07-24
18:37:35,892::task::794::TaskManager.Task::(resourceAcquired)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::_resourcesAcquired:
Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec (shared)
Thread-2985::DEBUG::2013-07-24
18:37:35,892::task::957::TaskManager.Task::(_decref)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::ref 1 aborting False
Thread-2985::DEBUG::2013-07-24
18:37:35,895::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd
iflag=direct skip=8 bs=512
if=/dev/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/metadata count=1' (cwd None)
Thread-2985::DEBUG::2013-07-24
18:37:35,904::misc::84::Storage.Misc.excCmd::(<lambda>) SUCCESS: <err> =
'1+0 records in\n1+0 records out\n512 bytes (512 B) copied, 0.000358208
s, 1.4 MB/s\n'; <rc> = 0
Thread-2985::DEBUG::2013-07-24
18:37:35,905::misc::325::Storage.Misc::(validateDDBytes) err: ['1+0
records in', '1+0 records out', '512 bytes (512 B) copied, 0.000358208
s, 1.4 MB/s'], size: 512
Thread-2985::INFO::2013-07-24
18:37:35,905::image::344::Storage.Image::(getChain)
sdUUID=0ee30f68-c222-44c0-85e6-2ae246f4c1ec
imgUUID=975f7398-866e-43f6-8579-1552be81519d
chain=[<storage.blockVolume.BlockVolume object at 0x7fa284398950>]
Thread-2985::DEBUG::2013-07-24
18:37:35,906::misc::84::Storage.Misc.excCmd::(<lambda>) '/usr/bin/sudo
-n /sbin/lvm lvchange --config " devices { preferred_names =
[\\"^/dev/mapper/\\"] ignore_suspended_devices=1 write_cache_state=0
disable_after_error_count=3 filter = [
\\"a%360a9800042415569305d434565795a44|36782bcb045a7ef0015bbfa3e18b4758e%\\",
\\"r%.*%\\" ] } global { locking_type=1 prioritise_write_locks=1
wait_for_locks=1 } backup { retain_min = 50 retain_days = 0 } "
--autobackup n --available y
0ee30f68-c222-44c0-85e6-2ae246f4c1ec/ebce7305-2586-4c5d-bc02-08cc4743a983'
(cwd None)
Thread-2985::DEBUG::2013-07-24
18:37:36,271::misc::84::Storage.Misc.excCmd::(<lambda>) SUCCESS: <err> =
''; <rc> = 0
Thread-2985::DEBUG::2013-07-24
18:37:36,272::lvm::493::OperationMutex::(_invalidatelvs) Operation 'lvm
invalidate operation' got the operation mutex
Thread-2985::DEBUG::2013-07-24
18:37:36,272::lvm::505::OperationMutex::(_invalidatelvs) Operation 'lvm
invalidate operation' released the operation mutex
Thread-2985::INFO::2013-07-24
18:37:36,273::logUtils::44::dispatcher::(wrapper) Run and protect:
prepareImage, Return response: {'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'chain': [{'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'domainID': '0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'volumeID':
'ebce7305-2586-4c5d-bc02-08cc4743a983', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d'}]}
Thread-2985::DEBUG::2013-07-24
18:37:36,273::task::1151::TaskManager.Task::(prepare)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::finished: {'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'chain': [{'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'domainID': '0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'volumeID':
'ebce7305-2586-4c5d-bc02-08cc4743a983', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d'}]}
Thread-2985::DEBUG::2013-07-24
18:37:36,273::task::568::TaskManager.Task::(_updateState)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::moving from state preparing
-> state finished
Thread-2985::DEBUG::2013-07-24
18:37:36,273::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources
{'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec': < ResourceRef
'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec', isValid: 'True' obj:
'None'>}
Thread-2985::DEBUG::2013-07-24
18:37:36,274::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2985::DEBUG::2013-07-24
18:37:36,274::resourceManager::557::ResourceManager::(releaseResource)
Trying to release resource 'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec'
Thread-2985::DEBUG::2013-07-24
18:37:36,274::resourceManager::573::ResourceManager::(releaseResource)
Released resource 'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec' (0
active users)
Thread-2985::DEBUG::2013-07-24
18:37:36,274::resourceManager::578::ResourceManager::(releaseResource)
Resource 'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec' is free, finding
out if anyone is waiting for it.
Thread-2985::DEBUG::2013-07-24
18:37:36,274::resourceManager::585::ResourceManager::(releaseResource)
No one is waiting for resource
'Storage.0ee30f68-c222-44c0-85e6-2ae246f4c1ec', Clearing records.
Thread-2985::DEBUG::2013-07-24
18:37:36,275::task::957::TaskManager.Task::(_decref)
Task=`5011aeeb-7b97-474b-b559-d29f3f6e7a22`::ref 0 aborting False
Thread-2985::INFO::2013-07-24
18:37:36,275::clientIF::316::vds::(prepareVolumePath) prepared volume
path:
/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983
Thread-2985::DEBUG::2013-07-24
18:37:36,282::vm::692::vm.Vm::(_startUnderlyingVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::_ongoingCreations released
Thread-2984::DEBUG::2013-07-24
18:37:36,282::API::502::vds::(migrationCreate) Destination VM creation
succeeded
Thread-2985::DEBUG::2013-07-24
18:37:36,284::libvirtvm::1901::vm.Vm::(_waitForIncomingMigrationFinish)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Waiting 300 seconds for end
of migration
Thread-2984::DEBUG::2013-07-24
18:37:36,285::BindingXMLRPC::920::vds::(wrapper) return
vmMigrationCreate with {'status': {'message': 'Done', 'code': 0},
'migrationPort': 0, 'params': {'status': 'Migration Destination',
'acpiEnable': 'true', 'emulatedMachine': 'pc-0.14',
'afterMigrationStatus': 'Up', 'spiceSecureChannels':
'smain,sinputs,scursor,splayback,srecord,sdisplay,susbredir,ssmartcard',
'pid': '0', 'transparentHugePages': 'true', 'displaySecurePort': '-1',
'timeOffset': '-43200', 'cpuType': 'Opteron_G3', 'smp': '2',
'migrationDest': 'libvirt', 'custom':
{'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8aca': 'VmDevice
{vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=d7b86bcd-13fe-4259-b7a1-0e6243bf8aca, device=ide,
type=controller, bootOrder=0, specParams={}, address={bus=0x00,
domain=0x0000, type=pci, slot=0x01, function=0x1}, managed=false,
plugged=true, readOnly=false, deviceAlias=ide0}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568device_e9769f4d-b137-4560-8eed-230655922bfadevice_4b6fd867-b4f3-47f6-b6ef-1f3cd0ed9c34':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=4b6fd867-b4f3-47f6-b6ef-1f3cd0ed9c34, device=spicevmc,
type=channel, bootOrder=0, specParams={}, address={port=3, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel2}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568device_e9769f4d-b137-4560-8eed-230655922bfa':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=e9769f4d-b137-4560-8eed-230655922bfa, device=unix,
type=channel, bootOrder=0, specParams={}, address={port=2, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel1}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41device_f68176fc-0731-4d98-9f35-b31140dcf568':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=f68176fc-0731-4d98-9f35-b31140dcf568, device=unix,
type=channel, bootOrder=0, specParams={}, address={port=1, bus=0,
controller=0, type=virtio-serial}, managed=false, plugged=true,
readOnly=false, deviceAlias=channel0}',
'device_d7b86bcd-13fe-4259-b7a1-0e6243bf8acadevice_d15dafa0-a1ae-4637-bd9b-6708a31f7e41':
'VmDevice {vmId=03ac5be8-75fc-43df-9fb8-c8e8af30ae84,
deviceId=d15dafa0-a1ae-4637-bd9b-6708a31f7e41, device=virtio-serial,
type=controller, bootOrder=0, specParams={}, address={bus=0x00,
domain=0x0000, type=pci, slot=0x04, function=0x0}, managed=false,
plugged=true, readOnly=false, deviceAlias=virtio-serial0}'}, 'vmType':
'kvm', 'spiceSslCipherSuite': 'DEFAULT', 'memSize': 2048, 'vmName':
'ipa2.nbu.cz', 'nice': '0', 'username': 'Unknown', 'vmId':
'03ac5be8-75fc-43df-9fb8-c8e8af30ae84', 'displayIp': '0',
'keyboardLayout': 'en-us', 'displayPort': '-1', 'smartcardEnable':
'false', 'guestIPs': '', 'nicModel': 'rtl8139,pv', 'smpCoresPerSocket':
'1', 'kvmEnable': 'true', 'pitReinjection': 'false', 'devices':
[{'device': 'unix', 'alias': 'channel0', 'type': 'channel', 'address':
{'bus': '0', 'controller': '0', 'type': 'virtio-serial', 'port': '1'}},
{'device': 'unix', 'alias': 'channel1', 'type': 'channel', 'address':
{'bus': '0', 'controller': '0', 'type': 'virtio-serial', 'port': '2'}},
{'device': 'spicevmc', 'alias': 'channel2', 'type': 'channel',
'address': {'bus': '0', 'controller': '0', 'type': 'virtio-serial',
'port': '3'}}, {'device': 'usb', 'alias': 'usb0', 'type': 'controller',
'address': {'slot': '0x01', 'bus': '0x00', 'domain': '0x0000', 'type':
'pci', 'function': '0x2'}}, {'device': 'ide', 'alias': 'ide0', 'type':
'controller', 'address': {'slot': '0x01', 'bus': '0x00', 'domain':
'0x0000', 'type': 'pci', 'function': '0x1'}}, {'device':
'virtio-serial', 'alias': 'virtio-serial0', 'type': 'controller',
'address': {'slot': '0x04', 'bus': '0x00', 'domain': '0x0000', 'type':
'pci', 'function': '0x0'}}, {'specParams': {'vram': '65536'}, 'alias':
'video0', 'deviceId': 'a74682a1-e49b-4c3c-a65a-679036bbd6f6', 'address':
{'slot': '0x02', 'bus': '0x00', 'domain': '0x0000', 'type': 'pci',
'function': '0x0'}, 'device': 'qxl', 'type': 'video'}, {'nicModel':
'pv', 'macAddr': '00:1a:4a:a8:03:9e', 'linkActive': True, 'network':
'ovirtmgmt', 'specParams': {}, 'filter': 'vdsm-no-mac-spoofing',
'alias': 'net0', 'deviceId': '405d2d7f-cbe4-4a8c-aeaa-b6d11c0739fd',
'address': {'slot': '0x03', 'bus': '0x00', 'domain': '0x0000', 'type':
'pci', 'function': '0x0'}, 'device': 'bridge', 'type': 'interface',
'name': 'vnet7'}, {'index': '2', 'iface': 'ide', 'name': 'hdc', 'alias':
'ide0-1-0', 'shared': 'false', 'specParams': {'path': ''}, 'readonly':
'True', 'deviceId': 'f3399625-142a-44ed-897d-01b2fad56a89', 'address':
{'bus': '1', 'controller': '0', 'type': 'drive', 'target': '0', 'unit':
'0'}, 'device': 'cdrom', 'path': '', 'type': 'disk'}, {'address':
{'slot': '0x05', 'bus': '0x00', 'domain': '0x0000', 'type': 'pci',
'function': '0x0'}, 'index': 0, 'iface': 'virtio', 'apparentsize':
'7516192768', 'alias': 'virtio-disk0', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d', 'readonly': 'False', 'shared':
'false', 'truesize': '7516192768', 'type': 'disk', 'domainID':
'0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'reqsize': '0', 'format': 'cow',
'deviceId': '975f7398-866e-43f6-8579-1552be81519d', 'poolID':
'5849b030-626e-47cb-ad90-3ce782d831b3', 'device': 'disk', 'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'propagateErrors': 'off', 'optional': 'false', 'name': 'vda',
'volumeID': 'ebce7305-2586-4c5d-bc02-08cc4743a983', 'specParams': {},
'volumeChain': [{'path':
'/rhev/data-center/5849b030-626e-47cb-ad90-3ce782d831b3/0ee30f68-c222-44c0-85e6-2ae246f4c1ec/images/975f7398-866e-43f6-8579-1552be81519d/ebce7305-2586-4c5d-bc02-08cc4743a983',
'domainID': '0ee30f68-c222-44c0-85e6-2ae246f4c1ec', 'volumeID':
'ebce7305-2586-4c5d-bc02-08cc4743a983', 'imageID':
'975f7398-866e-43f6-8579-1552be81519d'}]}, {'target': 2097152,
'specParams': {'model': 'virtio'}, 'alias': 'balloon0', 'deviceId':
'c25a298c-698f-41ff-8968-12e6ed9b14fe', 'address': {'slot': '0x06',
'bus': '0x00', 'domain': '0x0000', 'type': 'pci', 'function': '0x0'},
'device': 'memballoon', 'type': 'balloon'}], 'clientIp': '', 'display':
'qxl'}}
Thread-2987::DEBUG::2013-07-24
18:37:36,505::BindingXMLRPC::913::vds::(wrapper) client
[192.168.3.207]::call vmDestroy with
('03ac5be8-75fc-43df-9fb8-c8e8af30ae84',) {}
Thread-2987::INFO::2013-07-24 18:37:36,505::API::310::vds::(destroy)
vmContainerLock acquired by vm 03ac5be8-75fc-43df-9fb8-c8e8af30ae84
Thread-2987::DEBUG::2013-07-24
18:37:36,505::libvirtvm::2639::vm.Vm::(destroy)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::destroy Called
Thread-2987::INFO::2013-07-24
18:37:36,505::libvirtvm::2588::vm.Vm::(releaseVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Release VM resources
Thread-2985::ERROR::2013-07-24
18:37:36,507::vm::716::vm.Vm::(_startUnderlyingVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::The vm start process failed
Traceback (most recent call last):
File "/usr/share/vdsm/vm.py", line 696, in _startUnderlyingVm
self._waitForIncomingMigrationFinish()
File "/usr/share/vdsm/libvirtvm.py", line 1907, in
_waitForIncomingMigrationFinish
self._connection.lookupByUUIDString(self.id),
File "/usr/lib64/python2.6/site-packages/vdsm/libvirtconnection.py",
line 111, in wrapper
ret = f(*args, **kwargs)
File "/usr/lib64/python2.6/site-packages/libvirt.py", line 2838, in
lookupByUUIDString
if ret is None:raise libvirtError('virDomainLookupByUUIDString()
failed', conn=self)
libvirtError: Domain not found: no domain with matching uuid
'03ac5be8-75fc-43df-9fb8-c8e8af30ae84'
Thread-2985::DEBUG::2013-07-24
18:37:36,508::vm::1065::vm.Vm::(setDownStatus)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Changed state to Down:
Domain not found: no domain with matching uuid
'03ac5be8-75fc-43df-9fb8-c8e8af30ae84'
Thread-2987::DEBUG::2013-07-24
18:37:36,509::task::568::TaskManager.Task::(_updateState)
Task=`43d36e49-25bf-4280-a469-aa944d7f8939`::moving from state init ->
state preparing
Thread-2987::INFO::2013-07-24
18:37:36,510::logUtils::41::dispatcher::(wrapper) Run and protect:
inappropriateDevices(thiefId='03ac5be8-75fc-43df-9fb8-c8e8af30ae84')
Thread-2987::INFO::2013-07-24
18:37:36,511::logUtils::44::dispatcher::(wrapper) Run and protect:
inappropriateDevices, Return response: None
Thread-2987::DEBUG::2013-07-24
18:37:36,512::task::1151::TaskManager.Task::(prepare)
Task=`43d36e49-25bf-4280-a469-aa944d7f8939`::finished: None
Thread-2987::DEBUG::2013-07-24
18:37:36,512::task::568::TaskManager.Task::(_updateState)
Task=`43d36e49-25bf-4280-a469-aa944d7f8939`::moving from state preparing
-> state finished
Thread-2987::DEBUG::2013-07-24
18:37:36,512::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}
Thread-2987::DEBUG::2013-07-24
18:37:36,512::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2987::DEBUG::2013-07-24
18:37:36,512::task::957::TaskManager.Task::(_decref)
Task=`43d36e49-25bf-4280-a469-aa944d7f8939`::ref 0 aborting False
Thread-2987::DEBUG::2013-07-24
18:37:36,513::libvirtvm::2633::vm.Vm::(deleteVm)
vmId=`03ac5be8-75fc-43df-9fb8-c8e8af30ae84`::Total desktops after
destroy of 03ac5be8-75fc-43df-9fb8-c8e8af30ae84 is 0
Thread-2987::DEBUG::2013-07-24
18:37:36,513::BindingXMLRPC::920::vds::(wrapper) return vmDestroy with
{'status': {'message': 'Machine destroyed', 'code': 0}}
Thread-2990::DEBUG::2013-07-24
18:37:40,815::task::568::TaskManager.Task::(_updateState)
Task=`27887d84-ac21-4eda-b04d-d5e23ab9c183`::moving from state init ->
state preparing
Thread-2990::INFO::2013-07-24
18:37:40,815::logUtils::41::dispatcher::(wrapper) Run and protect:
repoStats(options=None)
Thread-2990::INFO::2013-07-24
18:37:40,815::logUtils::44::dispatcher::(wrapper) Run and protect:
repoStats, Return response: {u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec':
{'delay': '0.0135831832886', 'lastCheck': '7.7', 'code': 0, 'valid':
True}, u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay':
'0.00499796867371', 'lastCheck': '5.8', 'code': 0, 'valid': True}}
Thread-2990::DEBUG::2013-07-24
18:37:40,815::task::1151::TaskManager.Task::(prepare)
Task=`27887d84-ac21-4eda-b04d-d5e23ab9c183`::finished:
{u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec': {'delay': '0.0135831832886',
'lastCheck': '7.7', 'code': 0, 'valid': True},
u'a8c13187-d9d1-46b8-abe3-c322970d9d4d': {'delay': '0.00499796867371',
'lastCheck': '5.8', 'code': 0, 'valid': True}}
Thread-2990::DEBUG::2013-07-24
18:37:40,815::task::568::TaskManager.Task::(_updateState)
Task=`27887d84-ac21-4eda-b04d-d5e23ab9c183`::moving from state preparing
-> state finished
Thread-2990::DEBUG::2013-07-24
18:37:40,816::resourceManager::830::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}
Thread-2990::DEBUG::2013-07-24
18:37:40,816::resourceManager::864::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}
Thread-2990::DEBUG::2013-07-24
18:37:40,816::task::957::TaskManager.Task::(_decref)
Task=`27887d84-ac21-4eda-b04d-d5e23ab9c183`::ref 0 aborting False
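The repoStats response logged above is a plain dict keyed by storage-domain UUID, each value carrying `delay`, `lastCheck`, `code`, and `valid`. A minimal sketch of how a caller might evaluate such a payload; the helper name and the 30-second staleness threshold are illustrative, not part of VDSM:

```python
# Sketch: evaluate a repoStats-style response (domain UUID -> health info).
# Helper name and threshold are assumptions, not VDSM API.
def unhealthy_domains(stats, max_last_check=30.0):
    """Return UUIDs of domains that are invalid, errored, or stale."""
    bad = []
    for uuid, info in stats.items():
        stale = float(info['lastCheck']) > max_last_check
        if not info['valid'] or info['code'] != 0 or stale:
            bad.append(uuid)
    return bad

# The two domains from the log above are healthy:
stats = {
    u'0ee30f68-c222-44c0-85e6-2ae246f4c1ec':
        {'delay': '0.0135831832886', 'lastCheck': '7.7', 'code': 0, 'valid': True},
    u'a8c13187-d9d1-46b8-abe3-c322970d9d4d':
        {'delay': '0.00499796867371', 'lastCheck': '5.8', 'code': 0, 'valid': True},
}
```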
25 Jul '13
----- Original Message -----
> From: "Cheryn Tan" <cheryntan(a)redhat.com>
> To: "Eli Mesika" <emesika(a)redhat.com>
> Cc: "users(a)oVirt.org" <users(a)ovirt.org>, "rhev-devel" <rhev-devel(a)redhat.com>
> Sent: Thursday, July 25, 2013 3:48:16 AM
> Subject: Re: [rhev-devel] Deep Dive into Host Power Management presentation
>
> Hi Eli,
>
> Will a recording of your presentation be available to those of us in less
> convenient time zones? :)
Hi
Yes, the Elluminate session includes a recording option ...
>
> Thank you,
> Cheryn
>
> ----- Original Message -----
> > From: "Eli Mesika" <emesika(a)redhat.com>
> > To: "users(a)oVirt.org" <users(a)ovirt.org>, "rhev-devel"
> > <rhev-devel(a)redhat.com>
> > Sent: Thursday, 25 July, 2013 6:08:35 AM
> > Subject: [rhev-devel] Deep Dive into Host Power Management presentation
> >
> > Hi
> > I will give a presentation: Deep Dive into Host Power Management
> >
> > Details:
> >
> > When:
> > MON JUL 29 16:00 - 17:00 (IST)
> >
> > Elluminate session :
> > https://sas.elluminate.com/m.jnlp?sid=819&password=M.2B4BD5BEB64C670EE9D53F…
> >
> > Conf Code : 3375501449
> >
> > Global Numbers
> > Argentina 08004441016
> > Australia 1800337169
> > Austria 0800005898
> > Bahamas 18002054778
> > Bahrain 80004377
> > Barbados 18668556594
> > Belarus 882000110160
> > Belgium 080048325
> > Bolivia 800100768
> > Brazil 08008921002
> > Bulgaria 008001100236
> > Chile 800370228
> > Colombia 018005182186
> > Costa Rica 08000131048
> > Croatia (Hrvatska) 0800222320
> > Cyprus 80095297
> > Czech Republic 800701035
> > Denmark 80887114
> > Dominican Republic 18007519076
> > Ecuador 1800020545
> > Egypt, *SITF* 08000000188
> > El Salvador 8006699
> > Estonia 8000100232
> > Fiji 008002539
> > Finland 0800117116
> > France 0805632867
> > Germany 08006647541
> > Greece 00800127562
> > Guam 18773010136
> > Hong Kong 800930349
> > Hungary 0680014726
> > Iceland 8008967
> > India 180030104350
> > Indonesia, PT Telkom only 0078030179162
> > Indonesia, PT Indosat only 0018030179162
> > Ireland 1800932401
> > Israel 1809462557
> > Italy 800985897
> > Jamaica 18002050328
> > Japan 00531250120
> > Japan 0120934453
> > Kazakhstan 88003337376
> > Korea (South) 007986517393
> > Latvia 80003339
> > Lithuania 880031223
> > Luxembourg 80026595
> > Malaysia 1800814451
> > Malta 80062176
> > Mexico 018009269658
> > Monaco 80093642
> > Netherlands 08000222329
> > New Zealand 0800888167
> > Nicaragua 0018002202067
> > Norway 80013504
> > Panama 0018002043574
> > Peru 080052972
> > Philippines 180011100991
> > Poland 008001210187
> > Portugal 800814625
> > Romania 0800895537
> > Russian Federation 81080028341012
> > Saint Kitts and Nevis 18002059252
> > Saudi Arabia 8008445917
> > Singapore 8006162235
> > Slovak Republic 0800001441
> > Slovenia 080080471
> > South Africa 0800982957
> > Spain 800300524
> > Sweden 0200896860
> > Switzerland 0800650077
> > Taiwan 00801127141
> > Thailand 001800656966
> > Trinidad and Tobago 18002024615
> > Turkey 0080044632093
> > Turks and Caicos Islands 18772780472
> > Ukraine 0800500152
> > United Arab Emirates 8000440163
> > United Kingdom 08006948057
> > United States 8004518679
> > Uruguay 00040190315
> > Venezuela 8001627182
> > Vietnam 12011346
> > Virgin Islands (U.S.) 8773007428
> > Global Numbers
> > Australia, Adelaide 61870020130
> > Australia, Brisbane 61730870178
> > Australia, Melbourne 0382561740
> > Australia, Perth 61861884572
> > Australia, Sydney 0289852326
> > Austria, Vienna 012534978196
> > Belgium, Brussels 027920405
> > China, All Cities Domestic 4006205013
> > China, All Cities Domestic 8008190132
> > Czech Republic, Prague 239014984
> > Denmark, Copenhagen 32729215
> > Finland, Helsinki 0923194436
> > France, Paris 0170377140
> > Germany, Berlin 030300190579
> > Germany, Frankfurt 06922222594
> > Hong Kong, Hong Kong 85230730429
> > Hungary, Budapest 7789030
> > India, Bangalore 08039417180
> > India, Chennai 04430061276
> > India, Hyderabad 04030644055
> > India, Mumbai 02230985358
> > India, New Delhi 01139417180
> > Ireland, Dublin 014367793
> > Italy, Milan 0236269529
> > Japan, Tokyo 0345807897
> > Korea (South), Seoul 0234837408
> > Lithuania, Vilnius 52054226
> > Luxembourg, Luxembourg 24871157
> > Malaysia, Kuala Lumpur 0348190012
> > Netherlands, Amsterdam 0207975872
> > Norway, Oslo 21033188
> > Poland, Warsaw 222120148
> > Romania, Bucharest 0318103711
> > Russian Federation, Moscow 4999221989
> > Singapore, All Cities 64840858
> > Singapore, All Cities 64840858
> > Slovak Republic, Bratislava 0233456338
> > Slovenia, Ljubljana 016003991
> > Spain, Barcelona 935452328
> > Spain, Madrid 914146284
> > Sweden, Stockholm 0850513770
> > Switzerland, Geneva 0225927881
> > Switzerland, Zurich 0445803463
> > United Kingdom, All Cities 08445790678
> > United Kingdom, All Cities 02035746870
> > United States, All Cities 2127295016
> >
> >
>
> --
> Cheryn Tan, RHCSA, RHCVA
> Content Author
> Engineering Content Services
>
> Red Hat Asia Pacific
> Brisbane, Australia
> Phone: +61735148326
> Mobile: +61401562796
> cheryntan(a)redhat.com
>
Hi
I will give a presentation: Deep Dive into Host Power Management
Details:
When:
MON JUL 29 16:00 - 17:00 (IST)
Elluminate session :
https://sas.elluminate.com/m.jnlp?sid=819&password=M.2B4BD5BEB64C670EE9D53F…
Conf Code : 3375501449
1
0
i wanted to do something "easy", so i decided to test ovirt-live.
but there wasn't one for 3.3, so i'll just build it and test it.
this is where i got today:
1. finding: no ovirt-live in repos for test day
ok, i'll build it myself.
2. finding: the ovirt-live page[1] has no data on how to build it (link
to ovirt-live git repo, readme in git repo, etc.)
su -c "yum install git"[2]
git clone git://github.com/oVirt/ovirt-live
cd ovirt-live/centos
./build.sh
3. finding: build.sh is modifying a git managed file[3] - this shouldn't
happen.
4. finding: i didn't notice that build.sh warned me:
4.1 "You need to be root to perform this command."
since it continued downloading tinycore rather than exiting with an error
4.2 yum install -y livecd-tools
probably should be in readme to pre-install it, and ./build.sh should
check and exit with error, but not do the installation itself
5. yum install -y livecd-tools
finding: no livecd-tools in rhel - an rpm reference or instructions to
build from source are needed
cd ~
git clone git://git.fedorahosted.org/livecd
cd livecd
su -c "make install"
cd ~/ovirt-live/centos
6. finding: re-running ./build.sh downloads tinycore and yad each time
instead of checking if-modified-since
7. finding: missing check that selinux is disabled (required by
livecd-creator according to [4])
(again looked in log, no info on screen):
Traceback (most recent call last):
File "/usr/bin/livecd-creator", line 26, in <module>
import selinux
ImportError: No module named selinux
disabled selinux.
su -c "setenforce 0"
(didn't resolve the selinux import error...)
8. finding: no validation libselinux-python is installed
su -c "yum install -y libselinux-python"
9. finding: missing pykickstart
from log: ImportError: No module named pykickstart.parser
su -c "yum localinstall -y
http://mirror.centos.org/centos/6/os/i386/Packages/pykickstart-1.74.12-1.el…"
(mirror link is from [6])
10. finding: missing rhpl
from log: ImportError: No module named rhpl.keyboard
su -c "yum localinstall -y
http://apt.sw.be/redhat/el6/en/x86_64/rpmforge/RPMS/rhpl-0.221-2.el6.rf.x86…"
(mirror link is from[5])
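Findings 4.1 and 7-10 above are all environment validation that build.sh could do up front instead of failing mid-build. A sketch of that kind of preflight in Python; the helper name and error texts are illustrative, only the module names come from the report:

```python
import os

# Preflight sketch for findings 4.1 and 7-10: collect every missing
# prerequisite, then fail fast. Names are illustrative, not build.sh code.
REQUIRED_MODULES = ['selinux', 'pykickstart.parser', 'rhpl.keyboard']

def preflight(modules=REQUIRED_MODULES, need_root=True):
    """Return a list of human-readable problems; empty means good to go."""
    errors = []
    if need_root and os.geteuid() != 0:
        errors.append('You need to be root to perform this command.')
    for name in modules:
        try:
            __import__(name)  # import check, module itself is not used here
        except ImportError:
            errors.append('missing python module: %s' % name)
    return errors

# A build script would call preflight() first and exit if the list is non-empty.
```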
11. finding: (again) need to run as root
"You must run /usr/bin/livecd-creator as root"
12. finding: noise output when re-running ./build.sh
mkdir: cannot create directory `oVirtLiveFiles/rpms/': File exists
mkdir: cannot create directory `oVirtLiveFiles/iso/': File exists
mv: cannot stat `*.rpm': No such file or directory
mv: cannot stat `*.iso': No such file or directory
13. finding: awk warnings when running ./build.sh
awk: warning: escape sequence `\-' treated as plain `-'
awk: warning: escape sequence `\/' treated as plain `/'
14. finding: ovirt not spelled oVirt
Using title 'Ovirt Live' and product 'Ovirt Live'
15. finding: YumRepo Error: All mirror URLs are not using ftp, http[s]
or file.
Eg. 6Server is not a valid release or hasnt been released yet/
removing mirrorlist with no valid mirrors:
/var/tmp/imgcreate-W8OJmj/install_root/var/cache/yum/base/mirrorlist.txt
...
yum.Errors.RepoError: Cannot retrieve repository metadata (repomd.xml)
for repository: base. Please verify its path and try again
16. repo issues
(not all are bugs)
16.1. (not a bug) - i'm running on rhel, not centos, so the base repo is
different for me than on centos
16.2 (not a bug) - had to comment out yum-plugin-fastestmirror, not in
rhel (should be optional...)
16.3 finding: should be configurable - ovirt repo version - git is
static to 3.2, i needed beta. should be a parameter
16.4 finding: missing jboss-as package?
added to kickstart link to stable repo
16.5 finding: livecd-tools-13.4.4-2.el6.x86_64 requires /sbin/extlinux -
installing it doesn't help, it needs to be in the kickstart list of repos
i did some more fiddling with repos here, trying to run this on RHEL
rather than centos.
17. finding: Error creating Live CD : Unable to run
['/usr/bin/firewall-offline-cmd', '--enabled', '--service=mdns']!
hmmm - there is no firewalld on .el6?
so far for today...
[1] http://www.ovirt.org/OVirt_Live
[2] i always start from the cleanest system when trying to test something.
[3] kickstart/ovirt-live-base.ks
[4]
https://fedoraproject.org/wiki/How_to_create_and_use_a_Live_CD?rd=How_to_cr…
"SELinux should be in permissive mode for livecd-creator to work. Run
the following as root user first before attempting to create a live CD
or DVD"
[5]
http://pkgs.org/centos-6-rhel-6/repoforge-x86_64/rhpl-0.221-2.el6.rf.x86_64…
[6]
http://pkgs.org/centos-6-rhel-6/centos-rhel-i386/pykickstart-1.74.12-1.el6.…
[7]
http://pkgs.org/centos-6-rhel-6/centos-rhel-x86_64/syslinux-extlinux-4.02-8…
Hello,
I am trying to set up a fedora 19 server as an ovirt node, but it is still
failing on the network step. Here is the log:
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.plugins.ovirt_host_deploy.vdsm.bridge
bridge._rhel_getInterfaceConfigParameters:406 Readig interface
parameters of em1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.plugins.ovirt_host_deploy.vdsm.bridge plugin.executeRaw:347
execute: ('/bin/nmcli', 'dev', 'list', 'iface', u'em1'),
executable='None', cwd='None', env=None
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.plugins.ovirt_host_deploy.vdsm.bridge plugin.executeRaw:364
execute-result: ('/bin/nmcli', 'dev', 'list', 'iface', u'em1'), rc=0
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.plugins.ovirt_host_deploy.vdsm.bridge plugin.execute:412
execute-output: ('/bin/nmcli', 'dev', 'list', 'iface', u'em1') stdout:
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.DEVICE: em1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.TYPE: 802-3-ethernet
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.VENDOR: Broadcom Corporation
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.PRODUCT: --
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.DRIVER: bnx2
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.DRIVER-VERSION: 2.2.3
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.FIRMWARE-VERSION: 6.2.12 bc 5.2.3 NCSI 2.0.11
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.HWADDR: E8:9A:8F:13:AF:51
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.STATE: 100 (connected)
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.REASON: 0 (No reason given)
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND GENERAL.UDI:
/sys/devices/pci0000:00/0000:00:09.0/0000:07:00.0/net/em1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.IP-IFACE: em1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.NM-MANAGED: yes
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.AUTOCONNECT: yes
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
GENERAL.FIRMWARE-MISSING: no
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND GENERAL.CONNECTION:
/org/freedesktop/NetworkManager/ActiveConnection/0
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
CAPABILITIES.CARRIER-DETECT: yes
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
CAPABILITIES.SPEED: 1000 Mb/s
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
CONNECTIONS.AVAILABLE-CONNECTION-PATHS:
/org/freedesktop/NetworkManager/Settings/{7}
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND CONNECTIONS.AVAILABLE-CONNECTIONS[1]:
4fa38552-a2e6-41f7-abf9-5257f5e80847 | eno1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
WIRED-PROPERTIES.CARRIER: on
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
IP4.ADDRESS[1]: ip = 192.168.3.212/24, gw =
192.168.3.1
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
IP4.DNS[1]: 192.168.3.214
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.plugins.ovirt_host_deploy.vdsm.bridge plugin.execute:417
execute-output: ('/bin/nmcli', 'dev', 'list', 'iface', u'em1') stderr:
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.context context._executeMethod:130 method exception
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND Traceback (most recent call last):
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND File
"/tmp/ovirt-lKUu6m5wHT/pythonlib/otopi/context.py", line 120, in
_executeMethod
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND method['method']()
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND File
"/tmp/ovirt-lKUu6m5wHT/otopi-plugins/ovirt-host-deploy/vdsm/bridge.py",
line 751, in _misc
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND parameters =
self._rhel_getInterfaceConfigParameters(name=interface)
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND File
"/tmp/ovirt-lKUu6m5wHT/otopi-plugins/ovirt-host-deploy/vdsm/bridge.py",
line 465, in _rhel_getInterfaceConfigParameters
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND ''.ljust(prefix, '1') +
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND TypeError: an integer is required
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 ERROR
otopi.context context._executeMethod:139 Failed to execute stage 'Misc
configuration': an integer is required
2013-07-24 12:54:55 DEBUG otopi.plugins.otopi.dialog.machine
dialog.__logString:215 DIALOG:SEND 2013-07-24 12:54:55 DEBUG
otopi.transaction transaction.abort:131 aborting 'Yum Transaction'
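The crash comes from the `''.ljust(prefix, '1')` call in bridge.py line 465: `str.ljust` requires an integer width, but here the prefix parsed out of nmcli's `192.168.3.212/24` line is still a string, hence `TypeError: an integer is required`. Below is a sketch of the prefix-to-netmask computation that line appears to be building, with the `int()` cast that would avoid the crash; this is my reconstruction of the intent, not the actual fix shipped in ovirt-host-deploy:

```python
def prefix_to_netmask(prefix):
    """Turn a CIDR prefix length into a dotted-quad netmask."""
    # Cast first: nmcli hands the prefix back as text ('24'), and passing a
    # str width to str.ljust raises "TypeError: an integer is required".
    prefix = int(prefix)
    # Build the 32-bit mask as a string of ones padded with zeros,
    # e.g. prefix 24 -> '1' * 24 + '0' * 8.
    bits = ''.ljust(prefix, '1').ljust(32, '0')
    # Convert each 8-bit group back to a decimal octet.
    return '.'.join(str(int(bits[i:i + 8], 2)) for i in range(0, 32, 8))
```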
oVirt 3.3 release is just around the corner, and we would like to invite
you all to take a taste of the new release [1].
http://www.ovirt.org/OVirt_3.3_TestDay
Tomorrow morning (when all repos and packages are ready - fingers
crossed) you are invited to:
-checkout Test Day page [2]
-choose a distro (el6/fc19)
-install oVirt [3]
-check out the version
-report what you find
Developers and users will be available on the channel / on the list to
help in case there is trouble
Thanks
[1] http://www.ovirt.org/OVirt_3.3_release-management
[2] http://www.ovirt.org/OVirt_3.3_TestDay
[3]http://www.ovirt.org/Download