<div dir="ltr"><div>Hi,</div><div><br></div><div>thanks for the advice. The upgrade is already scheduled, but I would like to fix this issue before proceeding with a big upgrade (unless an upgrade will fixes the problem).</div><div><br></div><div>The problem is on all hypervisors.</div><div><br></div><div>We have 2 cluster (both connected to the same storage system):</div><div> - the old one with FC</div><div> - the new one with FCoE</div><div><br></div><div><br></div><div>With dmesg -T and looking at /var/log/messages we found several problems like these:</div><div><br></div><div>1) </div><div>[Wed May 3 10:40:11 2017] sd 12:0:0:3: Parameters changed</div><div>[Wed May 3 10:40:11 2017] sd 12:0:1:3: Parameters changed</div><div>[Wed May 3 10:40:11 2017] sd 12:0:1:1: Parameters changed</div><div>[Wed May 3 10:40:12 2017] sd 13:0:0:1: Parameters changed</div><div>[Wed May 3 10:40:12 2017] sd 13:0:0:3: Parameters changed</div><div>[Wed May 3 10:40:12 2017] sd 13:0:1:3: Parameters changed</div><div>[Wed May 3 12:39:32 2017] device-mapper: multipath: Failing path 65:144.</div><div>[Wed May 3 12:39:37 2017] sd 13:0:1:2: alua: port group 01 state A preferred supports tolUsNA</div><div><br></div><div>2)</div><div>[Wed May 3 17:08:17 2017] perf interrupt took too long (2590 > 2500), lowering kernel.perf_event_max_sample_rate to 50000</div><div><br></div><div>3)</div><div>[Wed May 3 19:16:21 2017] bnx2fc: els 0x5: tgt not ready</div><div>[Wed May 3 19:16:21 2017] bnx2fc: Relogin to the tgt</div><div><br></div><div>4)</div><div>sd 13:0:1:0: [sdx] FAILED Result: hostbyte=DID_ERROR driverbyte=DRIVER_OK</div><div>sd 13:0:1:0: [sdx] CDB: Read(16) 88 00 00 00 00 00 00 58 08 00 00 00 04 00 00 00</div><div>blk_update_request: I/O error, dev sdx, sector 5769216</div><div>device-mapper: multipath: Failing path 65:112.</div><div>sd 13:0:1:0: alua: port group 01 state A preferred supports tolUsNA</div><div><br></div><div>5)</div><div>multipathd: 360060160a6213400cce46e40949de411: sdaa - emc_clariion_checker: Read error for WWN 60060160a6213400cce46e40949de411. 
Multipath configuration is the following (recommended by EMC):

# RHEV REVISION 1.1
# RHEV PRIVATE

devices {
    device {
        vendor                "DGC"
        product               ".*"
        product_blacklist     "LUNZ"
        path_grouping_policy  group_by_prio
        path_selector         "round-robin 0"
        path_checker          emc_clariion
        features              "1 queue_if_no_path"
        hardware_handler      "1 alua"
        prio                  alua
        failback              immediate
        rr_weight             uniform
        no_path_retry         60
        rr_min_io             1
    }
}
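One thing worth verifying is that this file is really what the daemon runs with: multipathd works on the merged result of its built-in defaults and /etc/multipath.conf, and the effective configuration can be dumped from the daemon itself. A minimal sketch, assuming the multipath-tools shipped with EL7:

  # dump the effective (merged) configuration from the running daemon
  multipathd -k'show config' | grep -A20 '"DGC"'

  # re-read the configuration after any edit, without restarting multipathd
  multipathd -k'reconfigure'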
Regards,

Stefano

2017-05-07 8:36 GMT+02:00 Yaniv Kaul <ykaul@redhat.com>:
> On Tue, May 2, 2017 at 11:09 PM, Stefano Bovina <bovy89@gmail.com> wrote:
>> Hi, the engine logs show high latency on storage domains: "Storage domain
>> <xxxx> experienced a high latency of 19.2814 seconds from ...... This may
>> cause performance and functional issues."
>>
>> Looking at the host logs, I also found these locking errors:
>>
>> 2017-05-02 20:52:13+0200 33883 [10098]: s1 renewal error -202 delta_length 10 last_success 33853
>> 2017-05-02 20:52:19+0200 33889 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 result 1048576:0 other free
>> 2017-05-02 21:08:51+0200 34880 [10098]: 6a386652 aio timeout 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe4f2000 ioto 10 to_count 24
>> 2017-05-02 21:08:51+0200 34880 [10098]: s1 delta_renew read rv -202 offset 0 /dev/6a386652-629d-4045-835b-21d2f5c104aa/ids
>> 2017-05-02 21:08:51+0200 34880 [10098]: s1 renewal error -202 delta_length 10 last_success 34850
>> 2017-05-02 21:08:53+0200 34883 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe4f2000 result 1048576:0 other free
>> 2017-05-02 21:30:40+0200 36189 [10098]: 6a386652 aio timeout 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 ioto 10 to_count 25
>> 2017-05-02 21:30:40+0200 36189 [10098]: s1 delta_renew read rv -202 offset 0 /dev/6a386652-629d-4045-835b-21d2f5c104aa/ids
>> 2017-05-02 21:30:40+0200 36189 [10098]: s1 renewal error -202 delta_length 10 last_success 36159
>> 2017-05-02 21:30:45+0200 36195 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 result 1048576:0 other free
>>
>> and these vdsm errors too:
>>
>> Thread-22::ERROR::2017-05-02 21:53:48,147::sdc::137::Storage.StorageDomainCache::(_findDomain) looking for unfetched domain f8f21d6c-2425-45c4-aded-4cb9b53ebd96
>> Thread-22::ERROR::2017-05-02 21:53:48,148::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain) looking for domain f8f21d6c-2425-45c4-aded-4cb9b53ebd96
>>
>> The engine instead is showing these errors:
>>
>> 2017-05-02 21:40:38,089 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-96) Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = dcc0275a-b011-4e33-bb95-366ffb0697b3, storagePoolId = 715d1ba2-eabe-48db-9aea-c28c30359808) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
>> 2017-05-02 21:41:08,431 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-53) [6e0d5ebf] Failed in SpmStatusVDS method
>> 2017-05-02 21:41:08,443 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-53) [6e0d5ebf] Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = 7991933e-5f30-48cd-88bf-b0b525613384, storagePoolId = 4bd73239-22d0-4c44-ab8c-17adcd580309) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
>> 2017-05-02 21:41:31,975 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-61) [2a54a1b2] Failed in SpmStatusVDS method
>> 2017-05-02 21:41:31,987 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-61) [2a54a1b2] Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = dcc0275a-b011-4e33-bb95-366ffb0697b3, storagePoolId = 715d1ba2-eabe-48db-9aea-c28c30359808) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
>>
>> I'm using Fibre Channel or FCoE connectivity; storage array technical support has analyzed it (along with the switch and OS configurations), but nothing has been found.
>
> Is this on a specific host, or multiple hosts?
> Is that FC or FCoE?
> Anything in the host's /var/log/messages?
>
>> Any advice?
>>
>> Thanks
>>
>> Installation info:
>>
>> ovirt-release35-006-1.noarch
>
> This is a very old release; regardless of this issue, I suggest upgrading.
> Y.
>
>> libgovirt-0.3.3-1.el7_2.1.x86_64
>> vdsm-4.16.30-0.el7.centos.x86_64
>> vdsm-xmlrpc-4.16.30-0.el7.centos.noarch
>> vdsm-yajsonrpc-4.16.30-0.el7.centos.noarch
>> vdsm-jsonrpc-4.16.30-0.el7.centos.noarch
>> vdsm-python-zombiereaper-4.16.30-0.el7.centos.noarch
>> vdsm-python-4.16.30-0.el7.centos.noarch
>> vdsm-cli-4.16.30-0.el7.centos.noarch
>> qemu-kvm-ev-2.3.0-29.1.el7.x86_64
>> qemu-kvm-common-ev-2.3.0-29.1.el7.x86_64
>> qemu-kvm-tools-ev-2.3.0-29.1.el7.x86_64
>> libvirt-client-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-storage-1.2.17-13.el7_2.3.x86_64
>> libvirt-python-1.2.17-2.el7.x86_64
>> libvirt-daemon-driver-nwfilter-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-nodedev-1.2.17-13.el7_2.3.x86_64
>> libvirt-lock-sanlock-1.2.17-13.el7_2.3.x86_64
>> libvirt-glib-0.1.9-1.el7.x86_64
>> libvirt-daemon-driver-network-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-lxc-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-interface-1.2.17-13.el7_2.3.x86_64
>> libvirt-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-config-network-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-secret-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-config-nwfilter-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-kvm-1.2.17-13.el7_2.3.x86_64
>> libvirt-daemon-driver-qemu-1.2.17-13.el7_2.3.x86_64
>>
>> ------- vdsm.log (high latency)
>>
>> Thread-21::DEBUG::2017-05-02 18:02:23,646::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n331 bytes (331 B) copied, 0.000285529 s, 1.2 MB/s\n'; <rc> = 0
>> Thread-18::DEBUG::2017-05-02 18:02:24,335::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/2c501858-bf8d-49a5-a42b-bca341b47827/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
>> Thread-18::DEBUG::2017-05-02 18:02:24,343::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000489782 s, 8.4 MB/s\n'; <rc> = 0
>> Thread-19::DEBUG::2017-05-02 18:02:24,635::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/c5dc09c4-cd79-42d2-a5af-89953076f79e/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
>> Thread-19::DEBUG::2017-05-02 18:02:24,643::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000558622 s, 7.3 MB/s\n'; <rc> = 0
>> Thread-22::DEBUG::2017-05-02 18:02:24,722::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/f8f21d6c-2425-45c4-aded-4cb9b53ebd96/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
>> Thread-22::DEBUG::2017-05-02 18:02:24,729::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000632576 s, 6.5 MB/s\n'; <rc> = 0
>> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:25,056::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
>> JsonRpcServer::DEBUG::2017-05-02 18:02:25,057::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
>> Thread-9372::DEBUG::2017-05-02 18:02:25,060::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
>> Thread-20::DEBUG::2017-05-02 18:02:25,895::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 19.2814 s, 0.2 kB/s\n'; <rc> = 0
>> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:28,075::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
>> JsonRpcServer::DEBUG::2017-05-02 18:02:28,076::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
>> Thread-9373::DEBUG::2017-05-02 18:02:28,078::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
>> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:31,094::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
>> JsonRpcServer::DEBUG::2017-05-02 18:02:31,095::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
>> Thread-9374::DEBUG::2017-05-02 18:02:31,097::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
>> Thread-21::DEBUG::2017-05-02 18:02:33,652::fileSD::261::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/rhev/data-center/mnt/<myhost.example.com>:_ISO/f105bdc6-efdc-445c-b49e-aa38c91c2569/dom_md/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
>> Thread-21::DEBUG::2017-05-02 18:02:33,660::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n331 bytes (331 B) copied, 0.000246594 s, 1.3 MB/s\n'; <rc> = 0
>> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:34,112::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
>> JsonRpcServer::DEBUG::2017-05-02 18:02:34,114::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
>> Thread-9375::DEBUG::2017-05-02 18:02:34,117::task::595::Storage.TaskManager.Task::(_updateState) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::moving from state init -> state preparing
>> Thread-9375::INFO::2017-05-02 18:02:34,117::logUtils::44::dispatcher::(wrapper) Run and protect: repoStats(options=None)
>> Thread-9375::INFO::2017-05-02 18:02:34,118::logUtils::47::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'2c501858-bf8d-49a5-a42b-bca341b47827': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000489782', 'lastCheck': '9.8', 'valid': True}, 'f8f21d6c-2425-45c4-aded-4cb9b53ebd96': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000632576', 'lastCheck': '9.4', 'valid': True}, '6a386652-629d-4045-835b-21d2f5c104aa': {'code': 0, 'version': 3, 'acquired': True, 'delay': '19.2814', 'lastCheck': '8.2', 'valid': True}, 'f105bdc6-efdc-445c-b49e-aa38c91c2569': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000246594', 'lastCheck': '0.5', 'valid': True}, 'c5dc09c4-cd79-42d2-a5af-89953076f79e': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000558622', 'lastCheck': '9.5', 'valid': True}}
>> Thread-9375::DEBUG::2017-05-02 18:02:34,118::task::1191::Storage.TaskManager.Task::(prepare) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::finished: {'2c501858-bf8d-49a5-a42b-bca341b47827': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000489782', 'lastCheck': '9.8', 'valid': True}, 'f8f21d6c-2425-45c4-aded-4cb9b53ebd96': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000632576', 'lastCheck': '9.4', 'valid': True}, '6a386652-629d-4045-835b-21d2f5c104aa': {'code': 0, 'version': 3, 'acquired': True, 'delay': '19.2814', 'lastCheck': '8.2', 'valid': True}, 'f105bdc6-efdc-445c-b49e-aa38c91c2569': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000246594', 'lastCheck': '0.5', 'valid': True}, 'c5dc09c4-cd79-42d2-a5af-89953076f79e': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000558622', 'lastCheck': '9.5', 'valid': True}}
>> Thread-9375::DEBUG::2017-05-02 18:02:34,119::task::595::Storage.TaskManager.Task::(_updateState) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::moving from state preparing -> state finished
style="font-size:12.8px">Thread-9375::DEBUG::2017-05-02 18:02:34,119::resourceManager:<wbr>:940::Storage.ResourceManager.<wbr>Owner::(releaseAll) Owner.releaseAll requests {} resources {}</div><div style="font-size:12.8px">Thread-9375::DEBUG::2017-05-02 18:02:34,119::resourceManager:<wbr>:977::Storage.ResourceManager.<wbr>Owner::(cancelAll) Owner.cancelAll requests {}</div><div style="font-size:12.8px">Thread-9375::DEBUG::2017-05-02 18:02:34,119::task::993::Stora<wbr>ge.TaskManager.Task::(_decref) Task=`ea63814e-d25d-4436-8bb5-<wbr>cfc67af6498d`::ref 0 aborting False</div><div style="font-size:12.8px">Thread-9375::DEBUG::2017-05-02 18:02:34,122::stompReactor::16<wbr>3::yajsonrpc.StompServer::(sen<wbr>d) Sending response</div><div style="font-size:12.8px">JsonRpc (StompReactor)::DEBUG::2017-05<wbr>-02 18:02:34,128::stompReactor::98<wbr>::Broker.StompAdapter::(handle<wbr>_frame) Handling message <StompFrame command='SEND'></div><div style="font-size:12.8px">JsonRpcServer::DEBUG::2017-05-<wbr>02 18:02:34,129::__init__::506::j<wbr>sonrpc.JsonRpcServer::(serve_r<wbr>equests) Waiting for request</div><div style="font-size:12.8px">Thread-9376::DEBUG::2017-05-02 18:02:34,132::stompReactor::16<wbr>3::yajsonrpc.StompServer::(sen<wbr>d) Sending response</div></div>