I tried restarting the services, but vdsmd and supervdsmd will not start.
[root@node0 vdsm]# service supervdsmd status
Redirecting to /bin/systemctl status supervdsmd.service
supervdsmd.service - "Auxiliary vdsm service for running helper functions as root"
   Loaded: loaded (/usr/lib/systemd/system/supervdsmd.service; static)
   Active: failed (Result: exit-code) since p 2014-10-03 16:44:25 CEST; 790ms ago
  Process: 28162 ExecStart=/usr/share/vdsm/daemonAdapter /usr/share/vdsm/supervdsmServer --sockfile /var/run/vdsm/svdsm.sock (code=exited, status=1/FAILURE)
 Main PID: 28162 (code=exited, status=1/FAILURE)

okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: from parted_utils import getDevicePartedInfo as _getDevicePartedInfo
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: File "/usr/share/vdsm/parted_utils.py", line 21, in <module>
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: import parted
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: File "/usr/lib64/python2.7/site-packages/parted/__init__.py", line 60, in <module>
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: from partition import Partition
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: File "/usr/lib64/python2.7/site-packages/parted/partition.py", line 260, in <module>
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: partitionFlag[__flag] = _ped.partition_flag_get_name(__flag)
okt 03 16:44:25 node0.itsmart.cloud daemonAdapter[28162]: ValueError: Invalid flag provided.
okt 03 16:44:25 node0.itsmart.cloud systemd[1]: supervdsmd.service: main process exited, code=exited, status=1/FAILURE
okt 03 16:44:25 node0.itsmart.cloud systemd[1]: Unit supervdsmd.service entered failed state.
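
The traceback points at pyparted's import-time loop over partition flags: if the Python bindings were built against a newer libparted than the one installed, asking libparted to name a flag it does not know raises exactly this ValueError. A minimal sketch of the failure mode, using a hypothetical stub in place of the real _ped C extension (the flag numbers and names below are illustrative, not the real libparted values):

```python
# Stub standing in for the _ped C extension; the real one wraps libparted.
# KNOWN_FLAGS models what the *installed* libparted understands.
KNOWN_FLAGS = {1: "boot", 2: "root"}

def partition_flag_get_name(flag):
    # Mirrors the behaviour seen in the log: unknown flags are rejected.
    if flag not in KNOWN_FLAGS:
        raise ValueError("Invalid flag provided.")
    return KNOWN_FLAGS[flag]

# pyparted builds a flag-name table at import time; if it was built
# expecting a flag (3 here) that the installed libparted lacks, a bare
# loop dies with the ValueError seen in the log.
partitionFlag = {}
for flag in (1, 2, 3):
    try:
        partitionFlag[flag] = partition_flag_get_name(flag)
    except ValueError:
        pass  # tolerating unknown flags avoids the import-time crash

print(sorted(partitionFlag.values()))  # ['boot', 'root']
```

If a version mismatch is the cause, reinstalling pyparted and parted from the same repository so their builds match should clear the import error.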
What is the problem with my partition?
Thanks
Demeter Tibor
----- Original message -----
Hi,
I did a yum update, but the problem is the same.
I don't know why, but my supervdsm.log has not changed since the update.
I also have 3 VLANs and 3 bonded interfaces; could that be causing the same problems?
Thanks
Tibor
----- Original message -----
> On 02/10/2014 21:40, Demeter Tibor wrote:
> > Hi,
> >
> > I've been trying to install a hosted engine on CentOS 7 with 3.5 RC3, but it
> > failed.
> >
> > [root@node0 network-scripts]# hosted-engine --deploy
> > [ INFO ] Stage: Initializing
> > Continuing will configure this host for serving as hypervisor and create a VM where you have to install oVirt Engine afterwards.
> > Are you sure you want to continue? (Yes, No)[Yes]:
> > [ INFO ] Generating a temporary VNC password.
> > [ INFO ] Stage: Environment setup
> > Configuration files: []
> > Log file: /var/log/ovirt-hosted-engine-setup/ovirt-hosted-engine-setup-20141002213312-ebysmz.log
> > Version: otopi-1.3.0_master (otopi-1.3.0-0.0.master.20140911.git7c7d631.el7.centos)
> > [ INFO ] Hardware supports virtualization
> > [ INFO ] Bridge ovirtmgmt already created
> > [ INFO ] Stage: Environment packages setup
> > [ INFO ] Stage: Programs detection
> > [ INFO ] Stage: Environment setup
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ INFO ] Waiting for VDSM hardware info
> > [ ERROR ] Failed to execute stage 'Environment setup': [Errno 111] Connection refused
> > [ INFO ] Stage: Clean up
> > [ INFO ] Generating answer file '/etc/ovirt-hosted-engine/answers.conf'
> > [ INFO ] Answer file '/etc/ovirt-hosted-engine/answers.conf' has been updated
> > [ INFO ] Stage: Pre-termination
> > [ INFO ] Stage: Termination
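
For context, [Errno 111] is ECONNREFUSED: the setup script reached the host, but nothing was accepting connections on vdsm's port (the log does not show the port; 54321 is assumed below only as a port with no listener). A minimal reproduction:

```python
import errno
import socket

# Connecting to a local port with no listener fails with ECONNREFUSED
# (errno 111), the same error hosted-engine --deploy reported. Here it
# only matters that nothing is listening on the chosen port.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
result = s.connect_ex(("127.0.0.1", 54321))  # connect_ex returns the errno
s.close()
print(result == errno.ECONNREFUSED)
```

So the deploy failure is a downstream symptom of vdsm itself not running, not a separate problem.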
> >
> >
> > The vdsm daemon log:
> >
> > MainThread::INFO::2014-10-02 21:02:04,001::vdsm::131::vds::(run) (PID: 4376) I am the actual vdsm 4.16.5-0.el7 node0.itsmart.cloud (3.10.0-123.6.3.el7.x86_64)
> > MainThread::DEBUG::2014-10-02 21:02:04,002::resourceManager::421::Storage.ResourceManager::(registerNamespace) Registering namespace 'Storage'
> > MainThread::DEBUG::2014-10-02 21:02:04,002::threadPool::35::Storage.ThreadPool::(__init__) Enter - numThreads: 10, waitTimeout: 3, maxTasks: 500
> > MainThread::DEBUG::2014-10-02 21:02:04,005::fileUtils::142::Storage.fileUtils::(createdir) Creating directory: /rhev/data-center/mnt
> > MainThread::DEBUG::2014-10-02 21:02:04,036::supervdsm::77::SuperVdsmProxy::(_connect) Trying to connect to Super Vdsm
> > MainThread::ERROR::2014-10-02 21:02:06,039::utils::1158::root::(panic) Panic: Connect to supervdsm service failed: [Errno 2] No such file or directory
> > Traceback (most recent call last):
> >   File "/usr/share/vdsm/supervdsm.py", line 79, in _connect
> >     utils.retry(self._manager.connect, Exception, timeout=60, tries=3)
> >   File "/usr/lib64/python2.7/site-packages/vdsm/utils.py", line 1086, in retry
> >     return func()
> >   File "/usr/lib64/python2.7/multiprocessing/managers.py", line 500, in connect
> >     conn = Client(self._address, authkey=self._authkey)
> >   File "/usr/lib64/python2.7/multiprocessing/connection.py", line 173, in Client
> >     c = SocketClient(address)
> >   File "/usr/lib64/python2.7/multiprocessing/connection.py", line 301, in SocketClient
> >     s.connect(address)
> >   File "/usr/lib64/python2.7/socket.py", line 224, in meth
> >     return getattr(self._sock,name)(*args)
> > error: [Errno 2] No such file or directory
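
This panic is also a downstream symptom: supervdsmd crashed at startup, so it never created /var/run/vdsm/svdsm.sock, and vdsm's connect attempt fails with ENOENT. A minimal sketch, assuming only that the socket file is absent:

```python
import errno
import socket

# Connecting to a Unix socket path that does not exist fails with
# [Errno 2] (ENOENT) - the same error vdsm logs when supervdsmd never
# started and thus never created its socket file.
s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
try:
    s.connect("/var/run/vdsm/svdsm.sock")  # path from the log above
except OSError as e:
    print(e.errno == errno.ENOENT)
finally:
    s.close()
```

So fixing the supervdsmd startup failure (the pyparted import error) should make this error go away as well.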
> >
> >
> >
> >
> > What can I do?
>
> Please try again using RC4, released yesterday.
> Let us know if you still have this issue.
>
>
> >
> >
> > Thanks
> >
> > Tibor
> >
> >
> >
> > _______________________________________________
> > Users mailing list
> > Users(a)ovirt.org
> > http://lists.ovirt.org/mailman/listinfo/users
> >
>
>
> --
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com
>