
On 23 Jun 2016, at 18:36, Stefano Danzi <s.danzi@hawai.it> wrote:
Hi! I've just upgraded oVirt from 3.6 to 4.0 and I'm not able to start the self-hosted engine.
The first thing I noticed is that the host network loses the default gateway configuration, but that is not the problem here.
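(For reference, this is roughly how I'd check and temporarily work around the missing route; a minimal sketch assuming standard iproute2 tools, that ovirtmgmt is the management bridge, and that 192.168.1.254, the ping address in the agent log below, is the real gateway:)

# show the current default route; empty output means no default gateway is set
ip route show default

# temporarily re-add it on the management bridge (gateway address and bridge name are assumptions)
ip route add default via 192.168.1.254 dev ovirtmgmt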
Logs:
==> /var/log/ovirt-hosted-engine-ha/agent.log <==
MainThread::INFO::2016-06-23 18:28:40,833::hosted_engine::669::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine::(_initialize_storage_images) Reloading vm.conf from the shared storage domain
MainThread::INFO::2016-06-23 18:28:40,833::config::206::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config::(refresh_local_conf_file) Trying to get a fresher copy of vm configuration from the OVF_STORE
MainThread::INFO::2016-06-23 18:28:44,535::ovf_store::102::ovirt_hosted_engine_ha.lib.ovf.ovf_store.OVFStore::(scan) Found OVF_STORE: imgUUID:8d07965c-a5c4-4057-912d-901f80cf246c, volUUID:ce3aa63e-e1c4-498e-bdca-9d2e9f47f0f9
MainThread::INFO::2016-06-23 18:28:44,582::ovf_store::102::ovirt_hosted_engine_ha.lib.ovf.ovf_store.OVFStore::(scan) Found OVF_STORE: imgUUID:bd9aaf0b-8435-4d78-9871-8c7a7f7fa02a, volUUID:3c477b06-063e-4f01-bd05-84c7d467742b
MainThread::INFO::2016-06-23 18:28:44,674::ovf_store::111::ovirt_hosted_engine_ha.lib.ovf.ovf_store.OVFStore::(getEngineVMOVF) Extracting Engine VM OVF from the OVF_STORE
MainThread::INFO::2016-06-23 18:28:44,675::ovf_store::118::ovirt_hosted_engine_ha.lib.ovf.ovf_store.OVFStore::(getEngineVMOVF) OVF_STORE volume path: /rhev/data-center/mnt/ovirt01.hawai.lan:_engine/46f55a31-f35f-465c-b3e2-df45c05e06a7/images/bd9aaf0b-8435-4d78-9871-8c7a7f7fa02a/3c477b06-063e-4f01-bd05-84c7d467742b
MainThread::INFO::2016-06-23 18:28:44,682::config::226::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config::(refresh_local_conf_file) Found an OVF for HE VM, trying to convert
MainThread::INFO::2016-06-23 18:28:44,684::config::231::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine.config::(refresh_local_conf_file) Got vm.conf from OVF_STORE
MainThread::INFO::2016-06-23 18:28:44,684::hosted_engine::517::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine::(_initialize_broker) Initializing ha-broker connection
MainThread::INFO::2016-06-23 18:28:44,685::brokerlink::129::ovirt_hosted_engine_ha.lib.brokerlink.BrokerLink::(start_monitor) Starting monitor ping, options {'addr': '192.168.1.254'}
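(If anyone wants to double-check what the agent pulled out of the OVF_STORE: as far as I understand, the volume referenced above is a tar archive containing the engine VM OVF, so it can be inspected directly; a sketch, assuming the NFS storage domain is still mounted at the path from the log:)

# list the contents of the OVF_STORE volume the agent used (it should contain the engine VM OVF)
tar -tvf /rhev/data-center/mnt/ovirt01.hawai.lan:_engine/46f55a31-f35f-465c-b3e2-df45c05e06a7/images/bd9aaf0b-8435-4d78-9871-8c7a7f7fa02a/3c477b06-063e-4f01-bd05-84c7d467742b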
==> /var/log/ovirt-hosted-engine-ha/broker.log <==
Thread-25::ERROR::2016-06-23 18:28:44,697::listener::182::ovirt_hosted_engine_ha.broker.listener.ConnectionHandler::(handle) Error while serving connection
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/broker/listener.py", line 166, in handle
    data)
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/broker/listener.py", line 299, in _dispatch
    .set_storage_domain(client, sd_type, **options)
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/broker/storage_broker.py", line 66, in set_storage_domain
    self._backends[client].connect()
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/lib/storage_backends.py", line 400, in connect
    volUUID=volume.volume_uuid
  File "/usr/lib/python2.7/site-packages/ovirt_hosted_engine_ha/lib/storage_backends.py", line 245, in _get_volume_path
    volUUID
  File "/usr/lib64/python2.7/xmlrpclib.py", line 1233, in __call__
    return self.__send(self.__name, args)
  File "/usr/lib64/python2.7/xmlrpclib.py", line 1587, in __request
    verbose=self.__verbose
  File "/usr/lib64/python2.7/xmlrpclib.py", line 1273, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/lib64/python2.7/xmlrpclib.py", line 1301, in single_request
    self.send_content(h, request_body)
  File "/usr/lib64/python2.7/xmlrpclib.py", line 1448, in send_content
    connection.endheaders(request_body)
  File "/usr/lib64/python2.7/httplib.py", line 975, in endheaders
    self._send_output(message_body)
  File "/usr/lib64/python2.7/httplib.py", line 835, in _send_output
    self.send(msg)
  File "/usr/lib64/python2.7/httplib.py", line 797, in send
    self.connect()
  File "/usr/lib/python2.7/site-packages/vdsm/m2cutils.py", line 203, in connect
    sock = socket.create_connection((self.host, self.port), self.timeout)
  File "/usr/lib64/python2.7/socket.py", line 571, in create_connection
    raise err
error: [Errno 101] Network is unreachable
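(The last frames show the broker asking vdsm, over XML-RPC via vdsm's m2cutils client, for the volume path and failing with ENETUNREACH inside socket.create_connection. A quick way to check reachability of the vdsm port from the host itself would be something like this; a sketch only, 54321 is the default vdsm port and using the host's own name here is an assumption about which address the broker connects to:)

# which address does the host name resolve to?
getent hosts "$(hostname -f)"

# can a TCP connection be opened to the vdsm port at that address?
nc -zv "$(hostname -f)" 54321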
==> /var/log/ovirt-hosted-engine-ha/agent.log <==
MainThread::INFO::2016-06-23 18:28:44,697::hosted_engine::602::ovirt_hosted_engine_ha.agent.hosted_engine.HostedEngine::(_initialize_broker) Failed set the storage domain: 'Failed to set storage domain VdsmBackend, options {'hosted-engine.lockspace': '7B22696D6167655F75756964223A202265663131373139322D623564662D346534362D383939622D626666333362663862663135222C202270617468223A206E756C6C2C2022766F6C756D655F75756964223A202230613363393433652D633032392D343134372D623364342D396366353364663161356262227D', 'sp_uuid': '00000000-0000-0000-0000-000000000000', 'dom_type': 'nfs3', 'hosted-engine.metadata': '7B22696D6167655F75756964223A202233353066396361302D363933302D346336342D626438632D613262656631656336383934222C202270617468223A206E756C6C2C2022766F6C756D655F75756964223A202235393335373437392D656664612D343563312D386363392D656131653739633231646561227D', 'sd_uuid': '46f55a31-f35f-465c-b3e2-df45c05e06a7'}: Request failed: <class 'socket.error'>'. Waiting '5's before the next attempt
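(The long hex strings in the hosted-engine.lockspace / hosted-engine.metadata options above are just hex-encoded JSON describing the lockspace/metadata images. They can be decoded for readability like this; a small sketch using xxd, and the output shown is my own decoding of the lockspace value:)

# decode the hex-encoded lockspace value back into JSON
echo '7B22696D6167655F75756964223A202265663131373139322D623564662D346534362D383939622D626666333362663862663135222C202270617468223A206E756C6C2C2022766F6C756D655F75756964223A202230613363393433652D633032392D343134372D623364342D396366353364663161356262227D' | xxd -r -p
# {"image_uuid": "ef117192-b5df-4e46-899b-bff33bf8bf15", "path": null, "volume_uuid": "0a3c943e-c029-4147-b3d4-9cf53df1a5bb"}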
==> /var/log/vdsm/vdsm.log <==
periodic/3::WARNING::2016-06-23 18:34:41,731::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/2::WARNING::2016-06-23 18:34:43,657::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.BlockjobMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/0::WARNING::2016-06-23 18:34:43,732::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/3::WARNING::2016-06-23 18:34:45,732::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/2::WARNING::2016-06-23 18:34:47,723::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/1::WARNING::2016-06-23 18:34:49,724::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/0::WARNING::2016-06-23 18:34:51,725::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/3::WARNING::2016-06-23 18:34:53,726::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/2::WARNING::2016-06-23 18:34:55,727::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
==> /var/log/vdsm/mom.log <==
2016-06-23 18:34:56,376 - mom.vdsmInterface - ERROR - Cannot connect to VDSM! [Errno 101] Network is unreachable
==> /var/log/vdsm/vdsm.log <==
periodic/1::WARNING::2016-06-23 18:34:57,727::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/2::WARNING::2016-06-23 18:34:58,652::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.BlockjobMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
periodic/2::WARNING::2016-06-23 18:34:59,728::periodic::269::virt.periodic.VmDispatcher::(__call__) could not run <class 'vdsm.virt.periodic.DriveWatermarkMonitor'> on ['b66ae2c5-de0f-4361-953b-f10226da7eb8']
Reactor thread::INFO::2016-06-23 18:35:00,048::protocoldetector::72::ProtocolDetector.AcceptorImpl::(handle_accept) Accepting connection from ::ffff:192.168.1.50:60604
Reactor thread::DEBUG::2016-06-23 18:35:00,057::protocoldetector::85::ProtocolDetector.Detector::(__init__) Using required_size=11
Reactor thread::INFO::2016-06-23 18:35:00,058::protocoldetector::121::ProtocolDetector.Detector::(handle_read) Detected protocol stomp from ::ffff:192.168.1.50:60604
Reactor thread::INFO::2016-06-23 18:35:00,058::stompreactor::101::Broker.StompAdapter::(_cmd_connect) Processing CONNECT request
Reactor thread::DEBUG::2016-06-23 18:35:00,059::stompreactor::483::protocoldetector.StompDetector::(handle_socket) Stomp detected from ('::ffff:192.168.1.50', 60604)
JsonRpc (StompReactor)::INFO::2016-06-23 18:35:00,059::stompreactor::128::Broker.StompAdapter::(_cmd_subscribe) Subscribe command received
jsonrpc.Executor/2::DEBUG::2016-06-23 18:35:01,688::__init__::522::jsonrpc.JsonRpcServer::(_serveRequest) Calling 'Host.getHardwareInfo' in bridge with {}
Can you please post a bit more of vdsm.log, before and after, including also the time around 18:28 as per the error above in hosted-engine-ha?
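For instance, something along these lines would grab the window I mean (just a sketch; adjust the date, times and path as needed):

# pull everything vdsm logged between 18:25 and 18:35 on June 23rd
grep '2016-06-23 18:2[5-9]\|2016-06-23 18:3[0-5]' /var/log/vdsm/vdsm.log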
[root@ovirt01 ~]# netstat -lnp
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN 985/rpcbind
tcp 0 0 0.0.0.0:882 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:54322 0.0.0.0:* LISTEN 925/python
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN 1600/sshd
tcp 0 0 0.0.0.0:49152 0.0.0.0:* LISTEN 1978/glusterfsd
tcp 0 0 0.0.0.0:2049 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:38465 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:49153 0.0.0.0:* LISTEN 1963/glusterfsd
tcp 0 0 0.0.0.0:32802 0.0.0.0:* LISTEN 2193/rpc.statd
tcp 0 0 0.0.0.0:38466 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:16514 0.0.0.0:* LISTEN 1633/libvirtd
tcp 0 0 0.0.0.0:38468 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:38469 0.0.0.0:* LISTEN 1970/glusterfs
tcp 0 0 0.0.0.0:24007 0.0.0.0:* LISTEN 1652/glusterd
tcp6 0 0 :::54321 :::* LISTEN 1921/python
tcp6 0 0 :::22 :::* LISTEN 1600/sshd
tcp6 0 0 :::16514 :::* LISTEN 1633/libvirtd
udp 0 0 0.0.0.0:40503 0.0.0.0:* 937/avahi-daemon: r
udp 0 0 0.0.0.0:40873 0.0.0.0:* 2193/rpc.statd
udp 0 0 0.0.0.0:111 0.0.0.0:* 985/rpcbind
udp 0 0 127.0.0.1:323 0.0.0.0:* 952/chronyd
udp 0 0 0.0.0.0:694 0.0.0.0:* 985/rpcbind
udp 0 0 127.0.0.1:703 0.0.0.0:* 2193/rpc.statd
udp 0 0 0.0.0.0:879 0.0.0.0:* 1970/glusterfs
udp 0 0 0.0.0.0:5353 0.0.0.0:* 937/avahi-daemon: r
udp6 0 0 ::1:323 :::* 952/chronyd
Active UNIX domain sockets (only servers)
Proto RefCnt Flags Type State I-Node PID/Program name Path
unix 2 [ ACC ] STREAM LISTENING 19455 1633/libvirtd /var/run/libvirt/libvirt-sock
unix 2 [ ACC ] STREAM LISTENING 22529 1633/libvirtd /var/run/libvirt/libvirt-sock-ro
unix 2 [ ACC ] STREAM LISTENING 16902 1/systemd /var/run/rpcbind.sock
unix 2 [ ACC ] STREAM LISTENING 24077 1978/glusterfsd /var/run/gluster/changelog-36d1d6bf77f0ad218a0db9ada4237737.sock
unix 2 [ ACC ] STREAM LISTENING 20494 992/gssproxy /run/gssproxy.sock
unix 2 [ ACC ] STREAM LISTENING 14641 682/multipathd @/org/kernel/linux/storage/multipathd
unix 2 [ ACC ] STREAM LISTENING 16914 1/systemd /var/run/avahi-daemon/socket
unix 2 [ ACC ] STREAM LISTENING 19484 993/sanlock /var/run/sanlock/sanlock.sock
unix 2 [ ACC ] STREAM LISTENING 24095 1963/glusterfsd /var/run/gluster/changelog-c54f38f29c5c1e84d2766564ea967b8d.sock
unix 2 [ ACC ] STREAM LISTENING 16929 1/systemd /var/run/dbus/system_bus_socket
unix 2 [ ACC ] STREAM LISTENING 12093 1/systemd /run/systemd/private
unix 2 [ ACC ] STREAM LISTENING 16924 1/systemd @ISCSID_UIP_ABSTRACT_NAMESPACE
unix 2 [ ACC ] STREAM LISTENING 18255 925/python /var/run/vdsm/ovirt-imageio-daemon.sock
unix 2 [ ACC ] STREAM LISTENING 23905 1963/glusterfsd /var/run/gluster/b3162364606c81381a4c768484c6f091.socket
unix 2 [ ACC ] SEQPACKET LISTENING 13409 1/systemd /run/udev/control
unix 2 [ ACC ] STREAM LISTENING 90477 1970/glusterfs /var/run/gluster/b6606e2cc1634ac3c57dae961f6a6cc3.socket
unix 2 [ ACC ] STREAM LISTENING 13428 1/systemd /run/lvm/lvmetad.socket
unix 2 [ ACC ] STREAM LISTENING 18302 920/python /var/run/ovirt-hosted-engine-ha/broker.socket
unix 2 [ ACC ] STREAM LISTENING 23947 1978/glusterfsd /var/run/gluster/65f923a8c0294836aec3ffce760511cb.socket
unix 2 [ ACC ] STREAM LISTENING 20112 1652/glusterd /var/run/glusterd.socket
unix 2 [ ACC ] STREAM LISTENING 18321 939/python /var/run/vdsm/svdsm.sock
unix 2 [ ACC ] STREAM LISTENING 20493 992/gssproxy /var/lib/gssproxy/default.sock
unix 2 [ ACC ] STREAM LISTENING 13470 1/systemd /run/lvm/lvmpolld.socket
unix 2 [ ACC ] STREAM LISTENING 16909 1/systemd @ISCSIADM_ABSTRACT_NAMESPACE
unix 2 [ ACC ] STREAM LISTENING 18160 990/wdmd /var/run/wdmd/wdmd.sock
unix 2 [ ACC ] STREAM LISTENING 2291 1/systemd /run/systemd/journal/stdout
unix 2 [ ACC ] STREAM LISTENING 90622 1923/python /var/run/vdsm/mom-vdsm.sock