Barak, I was getting these 400s and 409s sporadically all last week while
iterating on my docker stuff. I thought maybe it was me messing with the
http_proxy settings or running docker rm. Is it possible I'm breaking things?
I'm still working on it, and have been at it straight for a while now:
On Tue, Apr 3, 2018 at 7:51 AM, Barak Korren <bkorren(a)redhat.com> wrote:
On 3 April 2018 at 14:07, Barak Korren <bkorren(a)redhat.com> wrote:
> Test failed: [ 006_migrations.prepare_migration_attachments_ipv6 ]
>
> Link to suspected patches:
>
> (Patch seems unrelated - do we have sporadic communication issues
> arising in PST?)
>
> https://gerrit.ovirt.org/c/89737/1 - vdsm - automation: check-patch: attempt to install vdsm-gluster
>
> Link to Job:
>
> http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1521/
>
> Link to all logs:
>
> http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1521/artifact/exported-artifacts/basic-suit-4.2-el7/test_logs/basic-suite-4.2/post-006_migrations.py/
>
> Error snippet from log:
>
> <error>
>
> Traceback (most recent call last):
>   File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
>     testMethod()
>   File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
>     self.test(*self.arg)
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
>     test()
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
>     return func(get_test_prefix(), *args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 78, in wrapper
>     prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
>   File "/home/jenkins/workspace/ovirt-4.2_change-queue-tester/ovirt-system-tests/basic-suite-4.2/test-scenarios/006_migrations.py", line 139, in prepare_migration_attachments_ipv6
>     engine, host_service, MIGRATION_NETWORK, ip_configuration)
>   File "/home/jenkins/workspace/ovirt-4.2_change-queue-tester/ovirt-system-tests/basic-suite-4.2/test_utils/network_utils_v4.py", line 71, in modify_ip_config
>     check_connectivity=True)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 36729, in setup_networks
>     return self._internal_action(action, 'setupnetworks', None, headers, query, wait)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 299, in _internal_action
>     return future.wait() if wait else future
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait
>     return self._code(response)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 296, in callback
>     self._check_fault(response)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in _check_fault
>     self._raise_error(response, body)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in _raise_error
>     raise error
> Error: Fault reason is "Operation Failed". Fault detail is "[Network error during communication with the Host.]". HTTP response code is 400.
>
>
>
> </error>
>
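For context, the step that fails above is the SDK's setup_networks call made
with check_connectivity=True from network_utils_v4.modify_ip_config. A minimal
sketch of roughly that call through the oVirt Python SDK follows - the engine
URL, credentials, host name, network name, and IPv6 address are placeholders
for illustration, not the values the suite actually uses. As far as I
understand, with check_connectivity=True the host reverts the change if the
engine cannot confirm connectivity in time, and the API surfaces a fault like
the one above.

# Minimal sketch (placeholder values) of the failing setup_networks request.
import ovirtsdk4 as sdk
import ovirtsdk4.types as types

# Placeholder engine URL and credentials.
connection = sdk.Connection(
    url='https://engine.example.com/ovirt-engine/api',
    username='admin@internal',
    password='password',
    insecure=True,
)

hosts_service = connection.system_service().hosts_service()
host = hosts_service.list(search='name=host-0')[0]  # placeholder host name
host_service = hosts_service.host_service(host.id)

# Re-attach the migration network with a static IPv6 address and ask the
# engine to verify connectivity - this is the request that came back 400.
host_service.setup_networks(
    modified_network_attachments=[
        types.NetworkAttachment(
            network=types.Network(name='Migration_Net'),  # placeholder name
            ip_address_assignments=[
                types.IpAddressAssignment(
                    assignment_method=types.BootProtocol.STATIC,
                    ip=types.Ip(
                        address='fd8f:1391:3a82:200::1',  # placeholder IPv6
                        netmask='64',
                        version=types.IpVersion.V6,
                    ),
                ),
            ],
        ),
    ],
    check_connectivity=True,
)

connection.close()
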
Same failure seems to have happened again - on a different patch -
this time for ovirt-engine:
https://gerrit.ovirt.org/#/c/89748/1
Failed test run:
http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1523/
--
Barak Korren
RHV DevOps team, RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
_______________________________________________
Devel mailing list
Devel(a)ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel
--
GREG SHEREMETA
SENIOR SOFTWARE ENGINEER - TEAM LEAD - RHV UX
Red Hat NA