<div dir="ltr"><div><div><div><div>Dan,<br><br></div>It looks like it was one of the calls triggered when vdsm was down:<br><br><pre>2018-04-03 05:30:16,065-0400 INFO (mailbox-hsm) [storage.MailBox.HsmMailMonitor] HSM_MailMonitor sending mail to SPM - ['/usr/bin/dd', 'of=/rhev/data-center/ddb765d2-2137-437d-95f8-c46dbdbc7711/mastersd/dom_md/inbox', 'iflag=fullblock', 'oflag=direct', 'conv=notrunc', 'bs=4096', 'count=1', 'seek=1'] (mailbox:387)
2018-04-03 05:31:22,441-0400 INFO (MainThread) [vds] (PID: 20548) I am the actual vdsm 4.20.23-28.gitd11ed44.el7.centos lago-basic-suite-4-2-host-0 (3.10.0-693.21.1.el7.x86_64) (vdsmd:149)</pre><br></div>which failed and caused a timeout (a rough sketch of the failing call follows below the quoted thread).<br><br></div>Thanks,<br></div>Piotr<br></div><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Apr 3, 2018 at 1:57 PM, Dan Kenigsberg <span dir="ltr"><<a href="mailto:danken@redhat.com" target="_blank">danken@redhat.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Tue, Apr 3, 2018 at 2:07 PM, Barak Korren <<a href="mailto:bkorren@redhat.com">bkorren@redhat.com</a>> wrote:<br>
> Test failed: [ 006_migrations.prepare_migration_attachments_ipv6 ]<br>
><br>
> Link to suspected patches:<br>
><br>
> (Patch seems unrelated - do we have sporadic communication issues<br>
> arising in PST?)<br>
> <a href="https://gerrit.ovirt.org/c/89737/1" rel="noreferrer" target="_blank">https://gerrit.ovirt.org/c/<wbr>89737/1</a> - vdsm - automation: check-patch:<br>
> attempt to install vdsm-gluster<br>
><br>
> Link to Job:<br>
> <a href="http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1521/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt-4.2_change-queue-tester/<wbr>1521/</a><br>
><br>
> Link to all logs:<br>
> <a href="http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1521/artifact/exported-artifacts/basic-suit-4.2-el7/test_logs/basic-suite-4.2/post-006_migrations.py/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt-4.2_change-queue-tester/<wbr>1521/artifact/exported-<wbr>artifacts/basic-suit-4.2-el7/<wbr>test_logs/basic-suite-4.2/<wbr>post-006_migrations.py/</a><br>
><br>
> Error snippet from log:<br>
><br>
> <error><br>
><br>
> Traceback (most recent call last):<br>
>   File "/usr/lib64/python2.7/unittest/case.py", line 369, in run<br>
>     testMethod()<br>
>   File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest<br>
>     self.test(*self.arg)<br>
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test<br>
>     test()<br>
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper<br>
>     return func(get_test_prefix(), *args, **kwargs)<br>
>   File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 78, in wrapper<br>
>     prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs<br>
>   File "/home/jenkins/workspace/ovirt-4.2_change-queue-tester/ovirt-system-tests/basic-suite-4.2/test-scenarios/006_migrations.py", line 139, in prepare_migration_attachments_ipv6<br>
>     engine, host_service, MIGRATION_NETWORK, ip_configuration)<br>
>   File "/home/jenkins/workspace/ovirt-4.2_change-queue-tester/ovirt-system-tests/basic-suite-4.2/test_utils/network_utils_v4.py", line 71, in modify_ip_config<br>
>     check_connectivity=True)<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 36729, in setup_networks<br>
>     return self._internal_action(action, 'setupnetworks', None, headers, query, wait)<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 299, in _internal_action<br>
>     return future.wait() if wait else future<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait<br>
>     return self._code(response)<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 296, in callback<br>
>     self._check_fault(response)<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in _check_fault<br>
>     self._raise_error(response, body)<br>
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in _raise_error<br>
>     raise error<br>
> Error: Fault reason is "Operation Failed". Fault detail is "[Network error during communication with the Host.]". HTTP response code is 400.<br>
<br>
The error occurred sometime in the interval<br>
<br>
09:32:58 [basic-suit] @ Run test: 006_migrations.py:<br>
09:33:55 [basic-suit] Error occured, aborting<br>
<br>
and indeed<br>
<br>
<a href="http://jenkins.ovirt.org/job/ovirt-4.2_change-queue-tester/1521/artifact/exported-artifacts/basic-suit-4.2-el7/test_logs/basic-suite-4.2/post-006_migrations.py/lago-basic-suite-4-2-engine/_var_log/ovirt-engine/engine.log/*view*/" rel="noreferrer" target="_blank">http://jenkins.ovirt.org/job/<wbr>ovirt-4.2_change-queue-tester/<wbr>1521/artifact/exported-<wbr>artifacts/basic-suit-4.2-el7/<wbr>test_logs/basic-suite-4.2/<wbr>post-006_migrations.py/lago-<wbr>basic-suite-4-2-engine/_var_<wbr>log/ovirt-engine/engine.log/*<wbr>view*/</a><br>
<br>
shows the Engine losing connectivity to the host at<br>
<br>
2018-04-03 05:33:32,307-04 ERROR [org.ovirt.engine.core.vdsbroker.monitoring.HostMonitoring] (EE-ManagedThreadFactory-engineScheduled-Thread-39) [] Unable to RefreshCapabilities: VDSNetworkException: VDSGenericException: VDSNetworkException: Vds timeout occured<br>
<br>
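If anyone wants to retrace this, here is a quick throwaway filter (my own, not part of the suite) that pulls the matching window out of engine.log; note the suite log above is UTC while engine.log is UTC-4, so 09:32:58..09:33:55 corresponds to 05:32:58..05:33:55 local:<br>
<pre>import re
import sys

# Failure window from the suite log, shifted from UTC to engine.log's
# local -04:00 offset (09:32:58..09:33:55 UTC -> 05:32:58..05:33:55).
LO, HI = '2018-04-03 05:32:58', '2018-04-03 05:33:55'
STAMP = re.compile(r'\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}')

with open(sys.argv[1]) as log:  # path to the downloaded engine.log
    for line in log:
        m = STAMP.match(line)
        # Lexicographic comparison is safe for this timestamp format.
        if m and LO <= m.group(0) <= HI:
            sys.stdout.write(line)</pre><br>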
Maybe Piotr can read more into it.<br>
</blockquote></div><br></div>
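<div dir="ltr"><div>P.S. For readers less familiar with the storage mailbox, here is a rough sketch of the call that was in flight. This is my own simplification, not vdsm's actual mailbox.py, though the dd command line is copied verbatim from the HSM_MailMonitor log entry above:<br></div><pre>import subprocess

# Command line taken verbatim from the HSM_MailMonitor log entry above.
DD_CMD = [
    '/usr/bin/dd',
    'of=/rhev/data-center/ddb765d2-2137-437d-95f8-c46dbdbc7711/mastersd/dom_md/inbox',
    'iflag=fullblock',  # read the full 4096-byte block from stdin first
    'oflag=direct',     # bypass the page cache, write straight to storage
    'conv=notrunc', 'bs=4096', 'count=1', 'seek=1',
]

def send_mail(block):
    # Sketch: feed one 4096-byte mail block to dd, which writes it into
    # the SPM inbox on the master storage domain.
    proc = subprocess.Popen(DD_CMD, stdin=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    _, err = proc.communicate(block)
    if proc.returncode != 0:
        # The failure mode in this run: vdsm went down mid-operation,
        # the write failed, and the engine-side verb timed out waiting.
        raise RuntimeError('mail write to SPM inbox failed: %r' % err)</pre><div>With vdsm restarting underneath it, a call like this fails, the mail never reaches the SPM, and the engine eventually reports the Vds timeout seen in Dan's excerpt.<br></div></div>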