[Users] gluster storage

зоррыч zorro at megatrone.ru
Wed Jul 18 14:28:49 UTC 2012


I'm trying to mount a manually created Gluster volume as a storage domain, but I'm getting the following error.

vdsm.log:

Thread-1530::DEBUG::2012-07-18
09:59:41,604::BindingXMLRPC::160::vds::(wrapper) [10.1.20.2]

Thread-1530::DEBUG::2012-07-18
09:59:41,605::task::568::TaskManager.Task::(_updateState)
Task=`90d34839-2ea2-4be5-b9b5-b3f4c330dc16`::moving from state init -> state
preparing

Thread-1530::INFO::2012-07-18
09:59:41,605::logUtils::37::dispatcher::(wrapper) Run and protect:
validateStorageServerConnection(domType=6,
spUUID='00000000-0000-0000-0000-000000000000', conList=[{'port': '',
'connection': '127.0.0.1:/sd3', 'iqn': '', 'portal': '', 'user': '',
'vfs_type': 'glusterfs', 'password': '******', 'id':
'00000000-0000-0000-0000-000000000000'}], options=None)

Thread-1530::INFO::2012-07-18
09:59:41,605::logUtils::39::dispatcher::(wrapper) Run and protect:
validateStorageServerConnection, Return response: {'statuslist': [{'status':
0, 'id': '00000000-0000-0000-0000-000000000000'}]}

Thread-1530::DEBUG::2012-07-18
09:59:41,605::task::1151::TaskManager.Task::(prepare)
Task=`90d34839-2ea2-4be5-b9b5-b3f4c330dc16`::finished: {'statuslist':
[{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}

Thread-1530::DEBUG::2012-07-18
09:59:41,605::task::568::TaskManager.Task::(_updateState)
Task=`90d34839-2ea2-4be5-b9b5-b3f4c330dc16`::moving from state preparing ->
state finished

Thread-1530::DEBUG::2012-07-18
09:59:41,606::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1530::DEBUG::2012-07-18
09:59:41,606::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1530::DEBUG::2012-07-18
09:59:41,606::task::957::TaskManager.Task::(_decref)
Task=`90d34839-2ea2-4be5-b9b5-b3f4c330dc16`::ref 0 aborting False

Thread-1531::DEBUG::2012-07-18
09:59:41,634::BindingXMLRPC::160::vds::(wrapper) [10.1.20.2]

Thread-1531::DEBUG::2012-07-18
09:59:41,634::task::568::TaskManager.Task::(_updateState)
Task=`19e8db57-2942-495a-a58c-398b79646a65`::moving from state init -> state
preparing

Thread-1531::INFO::2012-07-18
09:59:41,635::logUtils::37::dispatcher::(wrapper) Run and protect:
connectStorageServer(domType=6,
spUUID='00000000-0000-0000-0000-000000000000', conList=[{'port': '',
'connection': '127.0.0.1:/sd3', 'iqn': '', 'portal': '', 'user': '',
'vfs_type': 'glusterfs', 'password': '******', 'id':
'dd989346-1bd1-4772-8cc3-a8913680af01'}], options=None)

Thread-1531::DEBUG::2012-07-18
09:59:41,637::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/bin/mount -t glusterfs 127.0.0.1:/sd3 /rhev/data-center/mnt/127.0.0.1:_sd3'
(cwd None)

Thread-1533::DEBUG::2012-07-18
09:59:43,910::task::568::TaskManager.Task::(_updateState)
Task=`d350c85f-51f9-4ee2-be80-1dfbc072e789`::moving from state init -> state
preparing

Thread-1533::INFO::2012-07-18
09:59:43,911::logUtils::37::dispatcher::(wrapper) Run and protect:
repoStats(options=None)

Thread-1533::INFO::2012-07-18
09:59:43,911::logUtils::39::dispatcher::(wrapper) Run and protect:
repoStats, Return response: {}

Thread-1533::DEBUG::2012-07-18
09:59:43,911::task::1151::TaskManager.Task::(prepare)
Task=`d350c85f-51f9-4ee2-be80-1dfbc072e789`::finished: {}

Thread-1533::DEBUG::2012-07-18
09:59:43,911::task::568::TaskManager.Task::(_updateState)
Task=`d350c85f-51f9-4ee2-be80-1dfbc072e789`::moving from state preparing ->
state finished

Thread-1533::DEBUG::2012-07-18
09:59:43,911::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1533::DEBUG::2012-07-18
09:59:43,911::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1533::DEBUG::2012-07-18
09:59:43,912::task::957::TaskManager.Task::(_decref)
Task=`d350c85f-51f9-4ee2-be80-1dfbc072e789`::ref 0 aborting False

Thread-1531::DEBUG::2012-07-18
09:59:45,719::lvm::477::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1531::DEBUG::2012-07-18
09:59:45,720::lvm::479::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1531::DEBUG::2012-07-18
09:59:45,720::lvm::488::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1531::DEBUG::2012-07-18
09:59:45,720::lvm::490::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1531::DEBUG::2012-07-18
09:59:45,721::lvm::508::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1531::DEBUG::2012-07-18
09:59:45,721::lvm::510::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1531::INFO::2012-07-18
09:59:45,721::logUtils::39::dispatcher::(wrapper) Run and protect:
connectStorageServer, Return response: {'statuslist': [{'status': 0, 'id':
'dd989346-1bd1-4772-8cc3-a8913680af01'}]}

Thread-1531::DEBUG::2012-07-18
09:59:45,721::task::1151::TaskManager.Task::(prepare)
Task=`19e8db57-2942-495a-a58c-398b79646a65`::finished: {'statuslist':
[{'status': 0, 'id': 'dd989346-1bd1-4772-8cc3-a8913680af01'}]}

Thread-1531::DEBUG::2012-07-18
09:59:45,722::task::568::TaskManager.Task::(_updateState)
Task=`19e8db57-2942-495a-a58c-398b79646a65`::moving from state preparing ->
state finished

Thread-1531::DEBUG::2012-07-18
09:59:45,722::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1531::DEBUG::2012-07-18
09:59:45,722::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1531::DEBUG::2012-07-18
09:59:45,722::task::957::TaskManager.Task::(_decref)
Task=`19e8db57-2942-495a-a58c-398b79646a65`::ref 0 aborting False

Thread-1535::DEBUG::2012-07-18
09:59:45,802::BindingXMLRPC::160::vds::(wrapper) [10.1.20.2]

Thread-1535::DEBUG::2012-07-18
09:59:45,803::task::568::TaskManager.Task::(_updateState)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::moving from state init -> state
preparing

Thread-1535::INFO::2012-07-18
09:59:45,803::logUtils::37::dispatcher::(wrapper) Run and protect:
createStorageDomain(storageType=6,
sdUUID='2118b60e-d905-49c8-8379-3de2ef7e9711', domainName='xcv',
typeSpecificArg='127.0.0.1:/sd3', domClass=1, domVersion='0', options=None)

Thread-1535::DEBUG::2012-07-18
09:59:45,803::misc::1054::SamplingMethod::(__call__) Trying to enter
sampling method (storage.sdc.refreshStorage)

Thread-1535::DEBUG::2012-07-18
09:59:45,803::misc::1056::SamplingMethod::(__call__) Got in to sampling
method

Thread-1535::DEBUG::2012-07-18
09:59:45,804::misc::1054::SamplingMethod::(__call__) Trying to enter
sampling method (storage.iscsi.rescan)

Thread-1535::DEBUG::2012-07-18
09:59:45,804::misc::1056::SamplingMethod::(__call__) Got in to sampling
method

Thread-1535::DEBUG::2012-07-18
09:59:45,804::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/sbin/iscsiadm -m session -R' (cwd None)

Thread-1535::DEBUG::2012-07-18
09:59:45,819::__init__::1164::Storage.Misc.excCmd::(_log) FAILED: <err> =
'iscsiadm: No session found.\n'; <rc> = 21

Thread-1535::DEBUG::2012-07-18
09:59:45,819::misc::1064::SamplingMethod::(__call__) Returning last result

Thread-1535::DEBUG::2012-07-18
09:59:45,984::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/sbin/multipath' (cwd None)

Thread-1535::DEBUG::2012-07-18
09:59:46,021::__init__::1164::Storage.Misc.excCmd::(_log) SUCCESS: <err> =
''; <rc> = 0

Thread-1535::DEBUG::2012-07-18
09:59:46,021::lvm::477::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,021::lvm::479::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,022::lvm::488::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,022::lvm::490::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,022::lvm::508::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,022::lvm::510::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,023::misc::1064::SamplingMethod::(__call__) Returning last result

Thread-1535::DEBUG::2012-07-18
09:59:46,023::lvm::368::OperationMutex::(_reloadvgs) Operation 'lvm reload
operation' got the operation mutex

Thread-1535::DEBUG::2012-07-18
09:59:46,024::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/sbin/lvm vgs --config " devices { preferred_names = [\\"^/dev/mapper/\\"]
ignore_suspended_devices=1 write_cache_state=0 disable_after_error_count=3
filter = [ \\"a%35000c500048c2263%\\", \\"r%.*%\\" ] }  global {
locking_type=1  prioritise_write_locks=1  wait_for_locks=1 }  backup {
retain_min = 50  retain_days = 0 } " --noheadings --units b --nosuffix
--separator | -o
uuid,name,attr,size,free,extent_size,extent_count,free_count,tags,vg_mda_size,vg_mda_free 2118b60e-d905-49c8-8379-3de2ef7e9711' (cwd None)

Thread-1535::DEBUG::2012-07-18
09:59:46,171::__init__::1164::Storage.Misc.excCmd::(_log) FAILED: <err> = '
Volume group "2118b60e-d905-49c8-8379-3de2ef7e9711" not found\n'; <rc> = 5

Thread-1535::WARNING::2012-07-18
09:59:46,172::lvm::373::Storage.LVM::(_reloadvgs) lvm vgs failed: 5 [] ['
Volume group "2118b60e-d905-49c8-8379-3de2ef7e9711" not found']

Thread-1535::DEBUG::2012-07-18
09:59:46,172::lvm::397::OperationMutex::(_reloadvgs) Operation 'lvm reload
operation' released the operation mutex

Thread-1535::INFO::2012-07-18
09:59:46,177::nfsSD::64::Storage.StorageDomain::(create)
sdUUID=2118b60e-d905-49c8-8379-3de2ef7e9711 domainName=xcv
remotePath=127.0.0.1:/sd3 domClass=1

Thread-1535::ERROR::2012-07-18
09:59:46,180::task::833::TaskManager.Task::(_setError)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::Unexpected error

Traceback (most recent call last):
  File "/usr/share/vdsm/storage/task.py", line 840, in _run
    return fn(*args, **kargs)
  File "/usr/share/vdsm/logUtils.py", line 38, in wrapper
    res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 2143, in createStorageDomain
    typeSpecificArg, storageType, domVersion)
  File "/usr/share/vdsm/storage/nfsSD.py", line 75, in create
    cls._preCreateValidation(sdUUID, mntPoint, remotePath, version)
  File "/usr/share/vdsm/storage/nfsSD.py", line 46, in _preCreateValidation
    fileSD.validateDirAccess(domPath)
  File "/usr/share/vdsm/storage/fileSD.py", line 51, in validateDirAccess
    getProcPool().fileUtils.validateAccess(dirPath)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 270, in callCrabRPCFunction
    *args, **kwargs)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 186, in callCrabRPCFunction
    res, err = pickle.loads(rawResponse)
  File "/usr/lib64/python2.6/pickle.py", line 1374, in loads
    return Unpickler(file).load()
  File "/usr/lib64/python2.6/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib64/python2.6/pickle.py", line 1090, in load_global
    klass = self.find_class(module, name)
  File "/usr/lib64/python2.6/pickle.py", line 1124, in find_class
    __import__(module)
ImportError: No module named storage_exception

Thread-1535::DEBUG::2012-07-18
09:59:46,181::task::852::TaskManager.Task::(_run)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::Task._run:
6d187cad-6851-48c0-9fc8-f19698b385ca (6,
'2118b60e-d905-49c8-8379-3de2ef7e9711', 'xcv', '127.0.0.1:/sd3', 1, '0') {}
failed - stopping task

Thread-1535::DEBUG::2012-07-18
09:59:46,181::task::1177::TaskManager.Task::(stop)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::stopping in state preparing
(force False)

Thread-1535::DEBUG::2012-07-18
09:59:46,181::task::957::TaskManager.Task::(_decref)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::ref 1 aborting True

Thread-1535::INFO::2012-07-18
09:59:46,181::task::1134::TaskManager.Task::(prepare)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::aborting: Task is aborted: u'No
module named storage_exception' - code 100

Thread-1535::DEBUG::2012-07-18
09:59:46,182::task::1139::TaskManager.Task::(prepare)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::Prepare: aborted: No module
named storage_exception

Thread-1535::DEBUG::2012-07-18
09:59:46,182::task::957::TaskManager.Task::(_decref)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::ref 0 aborting True

Thread-1535::DEBUG::2012-07-18
09:59:46,182::task::892::TaskManager.Task::(_doAbort)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::Task._doAbort: force False

Thread-1535::DEBUG::2012-07-18
09:59:46,182::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1535::DEBUG::2012-07-18
09:59:46,183::task::568::TaskManager.Task::(_updateState)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::moving from state preparing ->
state aborting

Thread-1535::DEBUG::2012-07-18
09:59:46,183::task::523::TaskManager.Task::(__state_aborting)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::_aborting: recover policy none

Thread-1535::DEBUG::2012-07-18
09:59:46,183::task::568::TaskManager.Task::(_updateState)
Task=`6d187cad-6851-48c0-9fc8-f19698b385ca`::moving from state aborting ->
state failed

Thread-1535::DEBUG::2012-07-18
09:59:46,183::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1535::DEBUG::2012-07-18
09:59:46,183::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1535::ERROR::2012-07-18
09:59:46,184::dispatcher::69::Storage.Dispatcher.Protect::(run) No module
named storage_exception

Traceback (most recent call last):
  File "/usr/share/vdsm/storage/dispatcher.py", line 61, in run
    result = ctask.prepare(self.func, *args, **kwargs)
  File "/usr/share/vdsm/storage/task.py", line 1142, in prepare
    raise self.error
ImportError: No module named storage_exception

Thread-1539::DEBUG::2012-07-18
09:59:46,376::BindingXMLRPC::160::vds::(wrapper) [10.1.20.2]

Thread-1539::DEBUG::2012-07-18
09:59:46,377::task::568::TaskManager.Task::(_updateState)
Task=`37c8be8a-d5a9-4fca-b502-754795857aea`::moving from state init -> state
preparing

Thread-1539::INFO::2012-07-18
09:59:46,377::logUtils::37::dispatcher::(wrapper) Run and protect:
disconnectStorageServer(domType=6,
spUUID='00000000-0000-0000-0000-000000000000', conList=[{'port': '',
'connection': '127.0.0.1:/sd3', 'iqn': '', 'portal': '', 'user': '',
'vfs_type': 'glusterfs', 'password': '******', 'id':
'00000000-0000-0000-0000-000000000000'}], options=None)

Thread-1539::DEBUG::2012-07-18
09:59:46,377::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/bin/umount -f -l /rhev/data-center/mnt/127.0.0.1:_sd3' (cwd None)

Thread-1539::DEBUG::2012-07-18
09:59:46,398::misc::1054::SamplingMethod::(__call__) Trying to enter
sampling method (storage.sdc.refreshStorage)

Thread-1539::DEBUG::2012-07-18
09:59:46,399::misc::1056::SamplingMethod::(__call__) Got in to sampling
method

Thread-1539::DEBUG::2012-07-18
09:59:46,399::misc::1054::SamplingMethod::(__call__) Trying to enter
sampling method (storage.iscsi.rescan)

Thread-1539::DEBUG::2012-07-18
09:59:46,399::misc::1056::SamplingMethod::(__call__) Got in to sampling
method

Thread-1539::DEBUG::2012-07-18
09:59:46,399::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/sbin/iscsiadm -m session -R' (cwd None)

Thread-1539::DEBUG::2012-07-18
09:59:46,414::__init__::1164::Storage.Misc.excCmd::(_log) FAILED: <err> =
'iscsiadm: No session found.\n'; <rc> = 21

Thread-1539::DEBUG::2012-07-18
09:59:46,415::misc::1064::SamplingMethod::(__call__) Returning last result

Thread-1539::DEBUG::2012-07-18
09:59:46,579::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n
/sbin/multipath' (cwd None)

Thread-1539::DEBUG::2012-07-18
09:59:46,616::__init__::1164::Storage.Misc.excCmd::(_log) SUCCESS: <err> =
''; <rc> = 0

Thread-1539::DEBUG::2012-07-18
09:59:46,616::lvm::477::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,616::lvm::479::OperationMutex::(_invalidateAllPvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,617::lvm::488::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,617::lvm::490::OperationMutex::(_invalidateAllVgs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,617::lvm::508::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' got the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,617::lvm::510::OperationMutex::(_invalidateAllLvs) Operation 'lvm
invalidate operation' released the operation mutex

Thread-1539::DEBUG::2012-07-18
09:59:46,618::misc::1064::SamplingMethod::(__call__) Returning last result

Thread-1539::INFO::2012-07-18
09:59:46,618::logUtils::39::dispatcher::(wrapper) Run and protect:
disconnectStorageServer, Return response: {'statuslist': [{'status': 0,
'id': '00000000-0000-0000-0000-000000000000'}]}

Thread-1539::DEBUG::2012-07-18
09:59:46,618::task::1151::TaskManager.Task::(prepare)
Task=`37c8be8a-d5a9-4fca-b502-754795857aea`::finished: {'statuslist':
[{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}

Thread-1539::DEBUG::2012-07-18
09:59:46,618::task::568::TaskManager.Task::(_updateState)
Task=`37c8be8a-d5a9-4fca-b502-754795857aea`::moving from state preparing ->
state finished

Thread-1539::DEBUG::2012-07-18
09:59:46,618::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1539::DEBUG::2012-07-18
09:59:46,619::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1539::DEBUG::2012-07-18
09:59:46,619::task::957::TaskManager.Task::(_decref)
Task=`37c8be8a-d5a9-4fca-b502-754795857aea`::ref 0 aborting False

Thread-1543::DEBUG::2012-07-18
09:59:54,010::task::568::TaskManager.Task::(_updateState)
Task=`a1230d49-dae8-4a2d-ae39-543aa1e530bb`::moving from state init -> state
preparing

Thread-1543::INFO::2012-07-18
09:59:54,011::logUtils::37::dispatcher::(wrapper) Run and protect:
repoStats(options=None)

Thread-1543::INFO::2012-07-18
09:59:54,011::logUtils::39::dispatcher::(wrapper) Run and protect:
repoStats, Return response: {}

Thread-1543::DEBUG::2012-07-18
09:59:54,011::task::1151::TaskManager.Task::(prepare)
Task=`a1230d49-dae8-4a2d-ae39-543aa1e530bb`::finished: {}

Thread-1543::DEBUG::2012-07-18
09:59:54,011::task::568::TaskManager.Task::(_updateState)
Task=`a1230d49-dae8-4a2d-ae39-543aa1e530bb`::moving from state preparing ->
state finished

Thread-1543::DEBUG::2012-07-18
09:59:54,012::resourceManager::809::ResourceManager.Owner::(releaseAll)
Owner.releaseAll requests {} resources {}

Thread-1543::DEBUG::2012-07-18
09:59:54,012::resourceManager::844::ResourceManager.Owner::(cancelAll)
Owner.cancelAll requests {}

Thread-1543::DEBUG::2012-07-18
09:59:54,012::task::957::TaskManager.Task::(_decref)
Task=`a1230d49-dae8-4a2d-ae39-543aa1e530bb`::ref 0 aborting False
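
The failing step is worth spelling out: the glusterfs mount itself succeeds (connectStorageServer returns status 0), and createStorageDomain then dies in fileSD.validateDirAccess, which vdsm delegates to a separate remote file handler process. The helper evidently sent back a pickled exception, and unpickling it fails because no module named storage_exception can be imported under that bare name on the receiving side, which points at a sys.path/packaging mismatch between vdsm and its helper rather than at gluster itself. A minimal sketch to test the import by hand, assuming vdsm's storage modules live under /usr/share/vdsm/storage as the traceback paths suggest:

import sys

# Assumption: storage_exception ships in /usr/share/vdsm/storage,
# matching the file paths in the traceback above.
sys.path.insert(0, "/usr/share/vdsm/storage")

try:
    import storage_exception
    print("storage_exception imports fine")
except ImportError as e:
    print("import failed: %s" % e)  # same error as in vdsm.log

(Python 2.6 here, matching the /usr/lib64/python2.6 paths in the traceback.) If the import works in this standalone test but not inside vdsm's helper, the child process's sys.path is the thing to chase.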

Package versions:

[root@noc-2-m77 tmp]# rpm -qa | grep vdsm
vdsm-xmlrpc-4.10.0-0.184.7.el6.noarch
vdsm-python-4.10.0-0.184.7.el6.x86_64
vdsm-4.10.0-0.184.7.el6.x86_64
vdsm-cli-4.10.0-0.184.7.el6.noarch
vdsm-gluster-4.10.0-0.184.7.el6.noarch
[root@noc-2-m77 tmp]# rpm -qa | grep lvm
lvm2-2.02.95-10.el6.x86_64
lvm2-libs-2.02.95-10.el6.x86_64
[root@noc-2-m77 tmp]# rpm -qa | grep gluster
glusterfs-fuse-3.3.0-1.el6.x86_64
glusterfs-3.3.0-1.el6.x86_64
glusterfs-server-3.3.0-1.el6.x86_64
glusterfs-rdma-3.3.0-1.el6.x86_64
vdsm-gluster-4.10.0-0.184.7.el6.noarch
glusterfs-geo-replication-3.3.0-1.el6.x86_64
[root@noc-2-m77 tmp]#

[root@noc-2-m77 tmp]# gluster volume info all

Volume Name: sd3
Type: Stripe
Volume ID: bb966ba1-9b35-436a-bbc7-0503be5a1114
Status: Started
Number of Bricks: 1 x 2 = 2
Transport-type: tcp
Bricks:
Brick1: 10.1.20.10:/mht
Brick2: 10.2.20.2:/mht
[root@noc-2-m77 tmp]#

[root@noc-2-m77 tmp]# /usr/bin/sudo -n /bin/mount -t glusterfs 127.0.0.1:/sd3 /tmp/test2
[root@noc-2-m77 tmp]# cd /tmp/test2
[root@noc-2-m77 test2]# ls
[root@noc-2-m77 test2]# mkdir 12
[root@noc-2-m77 test2]# ls
12
[root@noc-2-m77 test2]# rm 21
rm: cannot remove `21': No such file or directory
[root@noc-2-m77 test2]# rm 12
rm: cannot remove `12': Is a directory
[root@noc-2-m77 test2]# ls
12
[root@noc-2-m77 test2]# rm -r 12
rm: remove directory `12'? y
[root@noc-2-m77 test2]# ls
[root@noc-2-m77 test2]#
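
So plain FUSE access to the volume works: mount, mkdir, rm and ls all behave. For comparison, the vdsm check that never completes above (fileUtils.validateAccess, reached via validateDirAccess) boils down to a POSIX access test on the domain directory; a rough in-process equivalent, with the exact flags being my assumption:

import os

# Manual mount point from the session above; vdsm would test its own
# mount under /rhev/data-center/mnt, via the helper process where the
# ImportError happens before any result comes back.
path = "/tmp/test2"
print("rwx access to %s: %s" % (path, os.access(path, os.R_OK | os.W_OK | os.X_OK)))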

[root@noc-2-m77 test2]# df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/mapper/vg_noc2m77-lv_root
                       50G  3.9G   44G   9% /
tmpfs                 7.9G     0  7.9G   0% /dev/shm
/dev/sda1             497M  243M  230M  52% /boot
/dev/mapper/vg_noc2m77-lv_home
                      491G  7.4G  459G   2% /mht
127.0.0.1:/sd3        979G   15G  915G   2% /tmp/test2
[root@noc-2-m77 test2]#
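
One small detail from the log: vdsm names the mount directory by replacing "/" in the connection string with "_" under /rhev/data-center/mnt, which is why 127.0.0.1:/sd3 appears as /rhev/data-center/mnt/127.0.0.1:_sd3 in the mount and umount commands. A one-liner showing the mapping, inferred from those log lines rather than from vdsm's source:

remote = "127.0.0.1:/sd3"
# reproduces the path seen in the /bin/mount and /bin/umount calls above
print("/rhev/data-center/mnt/" + remote.replace("/", "_"))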
