----- Original Message -----
> From: "Francesco Romani" <fromani(a)redhat.com>
> To: devel(a)ovirt.org
> Cc: "users" <users(a)ovirt.org>
> Sent: Wednesday, September 17, 2014 5:33:01 PM
> Subject: [ovirt-users] OVIRT-3.5-TEST-DAY-3: replace XML-rpc with JSON-rpc
> Everything I tried went OK, and logs look good to me.
>
> I ran into a few hiccups, which I mention for the sake of completeness:
> - VDSM refused to start or run VMs initially: the libvirt config included relics
> from a past environment on the same box, not a JSON-rpc fault. Fixed with a new
> config and (later) a reboot.
> - Trying recovery, Engine took longer than expected to sync up with VDSM.
> I have no hard data, and a feeling is not enough to file a BZ, so I didn't.
> - Still trying recovery, one time (and only once) Engine had stale data from
> VDSM (it reported two VMs as present which actually weren't). Not sure it was
> related to JSON-rpc, and I can't reproduce it, so I have not filed a BZ.
I need to partially amend this statement: running more benchmarks/profiling,
I got this twice in a row:
INFO:root:starting 100 vms
INFO:root:start: serial execution
INFO:root:Starting VM: XS_C000
INFO:root:Starting VM: XS_C001
INFO:root:Starting VM: XS_C002
Traceback (most recent call last):
  File "./observe.py", line 154, in <module>
    data = bench(host, 'XS_C%03i', first, last, api, outfile, mins * 60.)
  File "./observe.py", line 122, in bench
    start(vms)
  File "./observe.py", line 66, in start
    vm.start()
  File "./observe.py", line 54, in start
    self._handle.start()
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/brokers.py", line 16507, in start
    headers={"Correlation-Id":correlation_id}
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/proxy.py", line 118, in request
    persistent_auth=self._persistent_auth)
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/proxy.py", line 140, in __doRequest
    persistent_auth=persistent_auth
  File "/usr/lib/python2.7/site-packages/ovirtsdk/web/connection.py", line 134, in doRequest
    raise RequestError, response
ovirtsdk.infrastructure.errors.RequestError:
status: 400
reason: Bad Request
detail: Network error during communication with the Host.
(this is a runner script using the oVirt SDK for Python; the source is available
on demand and will be published soon[ish] anyway)
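
For reference, a minimal sketch of the kind of serial-start loop such a runner
performs, written against the oVirt SDK v3 for Python (ovirtsdk). The Engine URL,
credentials, VM count and pacing below are placeholders, not the actual observe.py:

    import logging
    import time

    from ovirtsdk.api import API
    from ovirtsdk.infrastructure.errors import RequestError

    logging.basicConfig(level=logging.INFO)

    # Hypothetical Engine endpoint and credentials; adjust for your setup.
    api = API(url='https://engine.example.com/api',
              username='admin@internal',
              password='secret',
              insecure=True)  # test lab only: skip certificate validation

    logging.info('starting 100 vms')
    logging.info('start: serial execution')
    for i in range(100):
        name = 'XS_C%03i' % i
        vm = api.vms.get(name=name)  # assumes the VM already exists
        logging.info('Starting VM: %s', name)
        try:
            vm.start()  # the call that raised RequestError above
        except RequestError as err:
            # err stringifies to the status/reason/detail block shown above
            logging.error('failed to start %s: %s', name, err)
            raise
        time.sleep(1)  # crude pacing between consecutive starts

    api.disconnect()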
In the Engine logs I see something like this:
http://fpaste.org/134263/
Since the above is way too vague to file a meaningful BZ, I'm now continuing the
investigation to see if there is a bug somewhere or if it's a hiccup of my local
environment.
I just want to note that I have also been experiencing vague, intermittent
JSON-rpc issues with my environment. I have filed BZ 1143042, which I
believe to be a symptom of unreliable communication. It seems to me
that we have a definite problem to work out.
--
Adam Litke