On Tue, May 2, 2017 at 11:09 PM, Stefano Bovina <bovy89@gmail.com> wrote:

> Hi, the engine logs show high latency on the storage domains:
>
> "Storage domain <xxxx> experienced a high latency of 19.2814 seconds from ...... This may cause performance and functional issues."
>
> Looking at the host logs, I also found these locking errors:
>
> 2017-05-02 20:52:13+0200 33883 [10098]: s1 renewal error -202 delta_length 10 last_success 33853
> 2017-05-02 20:52:19+0200 33889 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 result 1048576:0 other free
> 2017-05-02 21:08:51+0200 34880 [10098]: 6a386652 aio timeout 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe4f2000 ioto 10 to_count 24
> 2017-05-02 21:08:51+0200 34880 [10098]: s1 delta_renew read rv -202 offset 0 /dev/6a386652-629d-4045-835b-21d2f5c104aa/ids
> 2017-05-02 21:08:51+0200 34880 [10098]: s1 renewal error -202 delta_length 10 last_success 34850
> 2017-05-02 21:08:53+0200 34883 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe4f2000 result 1048576:0 other free
> 2017-05-02 21:30:40+0200 36189 [10098]: 6a386652 aio timeout 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 ioto 10 to_count 25
> 2017-05-02 21:30:40+0200 36189 [10098]: s1 delta_renew read rv -202 offset 0 /dev/6a386652-629d-4045-835b-21d2f5c104aa/ids
> 2017-05-02 21:30:40+0200 36189 [10098]: s1 renewal error -202 delta_length 10 last_success 36159
> 2017-05-02 21:30:45+0200 36195 [10098]: 6a386652 aio collect 0 0x7f1fb80008c0:0x7f1fb80008d0:0x7f1fbe9fb000 result 1048576:0 other free
>
> and these vdsm errors too:
>
> Thread-22::ERROR::2017-05-02 21:53:48,147::sdc::137::Storage.StorageDomainCache::(_findDomain) looking for unfetched domain f8f21d6c-2425-45c4-aded-4cb9b53ebd96
> Thread-22::ERROR::2017-05-02 21:53:48,148::sdc::154::Storage.StorageDomainCache::(_findUnfetchedDomain) looking for domain f8f21d6c-2425-45c4-aded-4cb9b53ebd96
>
> The engine, meanwhile, is showing these errors:
>
> 2017-05-02 21:40:38,089 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-96) Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = dcc0275a-b011-4e33-bb95-366ffb0697b3, storagePoolId = 715d1ba2-eabe-48db-9aea-c28c30359808) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
> 2017-05-02 21:41:08,431 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-53) [6e0d5ebf] Failed in SpmStatusVDS method
> 2017-05-02 21:41:08,443 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-53) [6e0d5ebf] Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = 7991933e-5f30-48cd-88bf-b0b525613384, storagePoolId = 4bd73239-22d0-4c44-ab8c-17adcd580309) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
> 2017-05-02 21:41:31,975 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-61) [2a54a1b2] Failed in SpmStatusVDS method
> 2017-05-02 21:41:31,987 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStatusVDSCommand] (DefaultQuartzScheduler_Worker-61) [2a54a1b2] Command SpmStatusVDSCommand(HostName = <myhost.example.com>, HostId = dcc0275a-b011-4e33-bb95-366ffb0697b3, storagePoolId = 715d1ba2-eabe-48db-9aea-c28c30359808) execution failed. Exception: VDSErrorException: VDSGenericException: VDSErrorException: Failed to SpmStatusVDS, error = (-202, 'Sanlock resource read failure', 'Sanlock exception'), code = 100
>
> I'm using Fibre Channel or FCoE connectivity; the storage array's technical support has analyzed it (along with the switch and OS configurations), but nothing was found.

Is this on one specific host, or on multiple hosts?
Is that FC or FCoE?
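One way to see how long each lockspace actually went without a lease renewal is to diff the two monotonic counters in the sanlock lines (the line's own uptime stamp vs. last_success). A minimal sketch — the regex and the renewal_gaps helper are mine, written against the line format quoted above, not part of sanlock or vdsm:

```python
import re

# sanlock delta-lease renewal errors look like:
#   <date> <time+tz> <uptime_sec> [<pid>]: s1 renewal error -202 delta_length 10 last_success <uptime_sec>
# Both numeric fields are seconds on sanlock's monotonic clock, so their
# difference is how long the lockspace has gone without a successful renewal.
RENEWAL_ERR = re.compile(
    r"^\S+ \S+ (?P<now>\d+) \[\d+\]: (?P<ls>\S+) renewal error "
    r"(?P<err>-?\d+) .* last_success (?P<last>\d+)"
)

def renewal_gaps(lines):
    """Yield (lockspace, errno, seconds_since_last_success) per error line."""
    for line in lines:
        m = RENEWAL_ERR.match(line)
        if m:
            yield m["ls"], int(m["err"]), int(m["now"]) - int(m["last"])

log = [
    "2017-05-02 20:52:13+0200 33883 [10098]: s1 renewal error -202 delta_length 10 last_success 33853",
    "2017-05-02 21:08:51+0200 34880 [10098]: s1 renewal error -202 delta_length 10 last_success 34850",
    "2017-05-02 21:30:40+0200 36189 [10098]: s1 renewal error -202 delta_length 10 last_success 36159",
]
for ls, err, gap in renewal_gaps(log):
    print(f"{ls}: error {err}, {gap}s since last successful renewal")
```

On the lines you posted every gap comes out at 30 seconds — three consecutive 10-second renewal timeouts, which fits the "aio timeout ... ioto 10" entries and points at the storage path, not sanlock itself.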
Anything in the host's /var/log/messages?

> Any advice?
>
> Thanks
>
> Installation info:
>
> ovirt-release35-006-1.noarch

This is a very old release; regardless of this issue, I suggest upgrading.
Y.

> libgovirt-0.3.3-1.el7_2.1.x86_64
> vdsm-4.16.30-0.el7.centos.x86_64
> vdsm-xmlrpc-4.16.30-0.el7.centos.noarch
> vdsm-yajsonrpc-4.16.30-0.el7.centos.noarch
> vdsm-jsonrpc-4.16.30-0.el7.centos.noarch
> vdsm-python-zombiereaper-4.16.30-0.el7.centos.noarch
> vdsm-python-4.16.30-0.el7.centos.noarch
> vdsm-cli-4.16.30-0.el7.centos.noarch
> qemu-kvm-ev-2.3.0-29.1.el7.x86_64
> qemu-kvm-common-ev-2.3.0-29.1.el7.x86_64
> qemu-kvm-tools-ev-2.3.0-29.1.el7.x86_64
> libvirt-client-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-storage-1.2.17-13.el7_2.3.x86_64
> libvirt-python-1.2.17-2.el7.x86_64
> libvirt-daemon-driver-nwfilter-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-nodedev-1.2.17-13.el7_2.3.x86_64
> libvirt-lock-sanlock-1.2.17-13.el7_2.3.x86_64
> libvirt-glib-0.1.9-1.el7.x86_64
> libvirt-daemon-driver-network-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-lxc-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-interface-1.2.17-13.el7_2.3.x86_64
> libvirt-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-config-network-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-secret-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-config-nwfilter-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-kvm-1.2.17-13.el7_2.3.x86_64
> libvirt-daemon-driver-qemu-1.2.17-13.el7_2.3.x86_64
>
> -------
> vdsm.log (high latency)
>
> Thread-21::DEBUG::2017-05-02 18:02:23,646::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n331 bytes (331 B) copied, 0.000285529 s, 1.2 MB/s\n'; <rc> = 0
> Thread-18::DEBUG::2017-05-02 18:02:24,335::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/2c501858-bf8d-49a5-a42b-bca341b47827/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
> Thread-18::DEBUG::2017-05-02 18:02:24,343::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000489782 s, 8.4 MB/s\n'; <rc> = 0
> Thread-19::DEBUG::2017-05-02 18:02:24,635::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/c5dc09c4-cd79-42d2-a5af-89953076f79e/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
> Thread-19::DEBUG::2017-05-02 18:02:24,643::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000558622 s, 7.3 MB/s\n'; <rc> = 0
> Thread-22::DEBUG::2017-05-02 18:02:24,722::blockSD::596::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/dev/f8f21d6c-2425-45c4-aded-4cb9b53ebd96/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
> Thread-22::DEBUG::2017-05-02 18:02:24,729::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 0.000632576 s, 6.5 MB/s\n'; <rc> = 0
> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:25,056::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
> JsonRpcServer::DEBUG::2017-05-02 18:02:25,057::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
> Thread-9372::DEBUG::2017-05-02 18:02:25,060::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
> Thread-20::DEBUG::2017-05-02 18:02:25,895::blockSD::596::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '1+0 records in\n1+0 records out\n4096 bytes (4.1 kB) copied, 19.2814 s, 0.2 kB/s\n'; <rc> = 0
> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:28,075::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
> JsonRpcServer::DEBUG::2017-05-02 18:02:28,076::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
> Thread-9373::DEBUG::2017-05-02 18:02:28,078::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:31,094::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
> JsonRpcServer::DEBUG::2017-05-02 18:02:31,095::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
> Thread-9374::DEBUG::2017-05-02 18:02:31,097::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
> Thread-21::DEBUG::2017-05-02 18:02:33,652::fileSD::261::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/rhev/data-center/mnt/<myhost.example.com>:_ISO/f105bdc6-efdc-445c-b49e-aa38c91c2569/dom_md/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
> Thread-21::DEBUG::2017-05-02 18:02:33,660::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n331 bytes (331 B) copied, 0.000246594 s, 1.3 MB/s\n'; <rc> = 0
> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:34,112::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
> JsonRpcServer::DEBUG::2017-05-02 18:02:34,114::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
> Thread-9375::DEBUG::2017-05-02 18:02:34,117::task::595::Storage.TaskManager.Task::(_updateState) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::moving from state init -> state preparing
> Thread-9375::INFO::2017-05-02 18:02:34,117::logUtils::44::dispatcher::(wrapper) Run and protect: repoStats(options=None)
> Thread-9375::INFO::2017-05-02 18:02:34,118::logUtils::47::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'2c501858-bf8d-49a5-a42b-bca341b47827': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000489782', 'lastCheck': '9.8', 'valid': True}, 'f8f21d6c-2425-45c4-aded-4cb9b53ebd96': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000632576', 'lastCheck': '9.4', 'valid': True}, '6a386652-629d-4045-835b-21d2f5c104aa': {'code': 0, 'version': 3, 'acquired': True, 'delay': '19.2814', 'lastCheck': '8.2', 'valid': True}, 'f105bdc6-efdc-445c-b49e-aa38c91c2569': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000246594', 'lastCheck': '0.5', 'valid': True}, 'c5dc09c4-cd79-42d2-a5af-89953076f79e': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000558622', 'lastCheck': '9.5', 'valid': True}}
> Thread-9375::DEBUG::2017-05-02 18:02:34,118::task::1191::Storage.TaskManager.Task::(prepare) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::finished: {'2c501858-bf8d-49a5-a42b-bca341b47827': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000489782', 'lastCheck': '9.8', 'valid': True}, 'f8f21d6c-2425-45c4-aded-4cb9b53ebd96': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000632576', 'lastCheck': '9.4', 'valid': True}, '6a386652-629d-4045-835b-21d2f5c104aa': {'code': 0, 'version': 3, 'acquired': True, 'delay': '19.2814', 'lastCheck': '8.2', 'valid': True}, 'f105bdc6-efdc-445c-b49e-aa38c91c2569': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000246594', 'lastCheck': '0.5', 'valid': True}, 'c5dc09c4-cd79-42d2-a5af-89953076f79e': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.000558622', 'lastCheck': '9.5', 'valid': True}}
> Thread-9375::DEBUG::2017-05-02 18:02:34,119::task::595::Storage.TaskManager.Task::(_updateState) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::moving from state preparing -> state finished
> Thread-9375::DEBUG::2017-05-02 18:02:34,119::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
> Thread-9375::DEBUG::2017-05-02 18:02:34,119::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
> Thread-9375::DEBUG::2017-05-02 18:02:34,119::task::993::Storage.TaskManager.Task::(_decref) Task=`ea63814e-d25d-4436-8bb5-cfc67af6498d`::ref 0 aborting False
> Thread-9375::DEBUG::2017-05-02 18:02:34,122::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
> JsonRpc (StompReactor)::DEBUG::2017-05-02 18:02:34,128::stompReactor::98::Broker.StompAdapter::(handle_frame) Handling message <StompFrame command='SEND'>
> JsonRpcServer::DEBUG::2017-05-02 18:02:34,129::__init__::506::jsonrpc.JsonRpcServer::(serve_requests) Waiting for request
> Thread-9376::DEBUG::2017-05-02 18:02:34,132::stompReactor::163::yajsonrpc.StompServer::(send) Sending response
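The repoStats return value quoted above already carries a per-domain 'delay' (the seconds vdsm's getReadDelay dd took), so the offending domain can be picked out mechanically. A sketch in Python, using the values from this log; the slow_domains helper and the 5-second threshold are my own illustration, not a vdsm API:

```python
# One entry per storage domain, trimmed to the fields used here;
# 'delay' comes back from vdsm repoStats as a string, in seconds.
repo_stats = {
    "2c501858-bf8d-49a5-a42b-bca341b47827": {"code": 0, "delay": "0.000489782", "valid": True},
    "f8f21d6c-2425-45c4-aded-4cb9b53ebd96": {"code": 0, "delay": "0.000632576", "valid": True},
    "6a386652-629d-4045-835b-21d2f5c104aa": {"code": 0, "delay": "19.2814", "valid": True},
    "f105bdc6-efdc-445c-b49e-aa38c91c2569": {"code": 0, "delay": "0.000246594", "valid": True},
    "c5dc09c4-cd79-42d2-a5af-89953076f79e": {"code": 0, "delay": "0.000558622", "valid": True},
}

def slow_domains(stats, threshold=5.0):
    """Return (domain_uuid, delay_sec) for read checks slower than threshold."""
    return [(sd, float(v["delay"])) for sd, v in stats.items()
            if float(v["delay"]) > threshold]

for sd, delay in slow_domains(repo_stats):
    print(f"domain {sd}: read delay {delay:.1f}s")
```

Only 6a386652-629d-4045-835b-21d2f5c104aa trips it (19.3s) — the same domain whose /ids read is timing out in sanlock — while its four siblings answer in under a millisecond, which again suggests a problem on one LUN/path rather than the whole fabric.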
Users mailing list
Users@ovirt.org
http://lists.ovirt.org/mailman/listinfo/users