[Users] allinone setup can't add storage

Hi all,

Can someone help me with this error:

AIO: Adding Local Datacenter and cluster... [ ERROR ] Error: could not create ovirtsdk API object

Trace from the log file:

2012-11-07 13:34:44::DEBUG::all_in_one_100::220::root:: Initiating the API object
2012-11-07 13:34:44::ERROR::all_in_one_100::231::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 228, in initAPI
    ca_file=basedefs.FILE_CA_CRT_SRC,
TypeError: __init__() got an unexpected keyword argument 'ca_file'
2012-11-07 13:34:44::DEBUG::setup_sequences::62::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/setup_sequences.py", line 60, in run
    function()
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 232, in initAPI
    raise Exception(ERROR_CREATE_API_OBJECT)
Exception: Error: could not create ovirtsdk API object

Versions installed:

ovirt-engine-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-backend-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-cli-3.1.0.6-1.fc17.noarch
ovirt-engine-config-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-dbscripts-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-genericapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-notification-service-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-restapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-sdk-3.1.0.4-1.fc17.noarch
ovirt-engine-setup-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-setup-plugin-allinone-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-tools-common-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-userportal-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-webadmin-portal-3.1.0-3.20121106.git6891171.fc17.noarch

You are using the ovirt-sdk from the stable repo and the engine from the nightly repo; those don't work together.

On 11/07/2012 01:40 PM, Cristian Falcas wrote:
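The TypeError in the trace is the generic failure a Python caller sees when it passes a keyword argument that the installed library's constructor does not accept — here, a newer setup script passing ca_file to an older SDK. A minimal sketch of that failure mode, using a hypothetical stand-in class (OldAPI, accepts_kwarg are illustrative names, not part of ovirtsdk):

```python
import inspect

class OldAPI:
    """Stand-in for an SDK class whose constructor predates ca_file."""
    def __init__(self, url, username, password):
        self.url = url

def accepts_kwarg(cls, name):
    """True if cls.__init__ accepts `name`, directly or via **kwargs."""
    params = inspect.signature(cls.__init__).parameters
    return name in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values())

# The setup plugin effectively does this call, and it fails at the
# version mismatch rather than at any real connection attempt:
try:
    OldAPI(url="https://localhost", username="admin",
           password="secret", ca_file="/etc/pki/ca.pem")
    raised = False
except TypeError:
    raised = True

print(raised)                             # True: same TypeError as in the log
print(accepts_kwarg(OldAPI, "ca_file"))   # False: this build lacks ca_file
```

Checking the constructor signature up front, as accepts_kwarg does, is one way a script could detect the mismatch and report "incompatible SDK version" instead of an opaque traceback.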
_______________________________________________ Users mailing list Users@ovirt.org http://lists.ovirt.org/mailman/listinfo/users

Bummer.

And the sdk from nightly is not working because it's too new, from what I saw on the mailing list?

I have errors even when I'm trying to add the storage manually, from the engine. Should I give up on nightly and use the beta ones, or is there another solution for this?

On Wed, Nov 7, 2012 at 3:47 PM, Moran Goldboim <mgoldboi@redhat.com> wrote:

On 11/07/2012 02:59 PM, Cristian Falcas wrote:
Bummer.
And the sdk from nightly is not working because it's too new, from what I saw on the mailing list?
I have errors even when I'm trying to add the storage manually, from the engine. Should I give up on nightly and use the beta ones, or is there another solution for this?
It shouldn't fail from the admin portal. Which vdsm version? Logs?

This is what I got:

Error in GUI: Cannot add Storage. Internal error, Storage Connection doesn't exist.

Permissions:

ls -la /media/ceva2/Ovirt/Storage/
total 8
drwxrwxr-x. 2 vdsm kvm 4096 Nov 6 09:04 .
drwxr-xr-x. 5 vdsm kvm 4096 Nov 2 10:55 ..

df | grep /media/ceva2
/dev/mapper/1ATA_WDC_WD2500BB-00GUA0_WD-WCAL73625324p2 144237688 124962468 11948376 92% /media/ceva2

This is with the latest nightly on ovirt and vdsm. This is the error from vdsm:

Thread-366::DEBUG::2012-11-08 08:31:41,244::BindingXMLRPC::161::vds::(wrapper) [79.112.94.67]
Thread-366::DEBUG::2012-11-08 08:31:41,244::task::568::TaskManager.Task::(_updateState) Task=`2b324a0a-96ef-4ecd-8ad3-b8588ed93fd5`::moving from state init -> state preparing
Thread-366::INFO::2012-11-08 08:31:41,245::logUtils::37::dispatcher::(wrapper) Run and protect: validateStorageServerConnection(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-366::INFO::2012-11-08 08:31:41,245::logUtils::39::dispatcher::(wrapper) Run and protect: validateStorageServerConnection, Return response: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-366::DEBUG::2012-11-08 08:31:41,245::task::1151::TaskManager.Task::(prepare) Task=`2b324a0a-96ef-4ecd-8ad3-b8588ed93fd5`::finished: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-366::DEBUG::2012-11-08 08:31:41,245::task::568::TaskManager.Task::(_updateState) Task=`2b324a0a-96ef-4ecd-8ad3-b8588ed93fd5`::moving from state preparing -> state finished
Thread-366::DEBUG::2012-11-08 08:31:41,245::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-366::DEBUG::2012-11-08 08:31:41,245::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-366::DEBUG::2012-11-08 08:31:41,246::task::957::TaskManager.Task::(_decref) Task=`2b324a0a-96ef-4ecd-8ad3-b8588ed93fd5`::ref 0 aborting False
Thread-367::DEBUG::2012-11-08 08:31:41,304::BindingXMLRPC::161::vds::(wrapper) [79.112.94.67]
Thread-367::DEBUG::2012-11-08 08:31:41,305::task::568::TaskManager.Task::(_updateState) Task=`2844a5a4-7148-4d44-858c-fc75abab1a5f`::moving from state init -> state preparing
Thread-367::INFO::2012-11-08 08:31:41,305::logUtils::37::dispatcher::(wrapper) Run and protect: connectStorageServer(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-367::ERROR::2012-11-08 08:31:41,430::hsm::2057::Storage.HSM::(connectStorageServer) Could not connect to storageServer
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/hsm.py", line 2054, in connectStorageServer
    conObj.connect()
  File "/usr/share/vdsm/storage/storageServer.py", line 462, in connect
    if not self.checkTarget():
  File "/usr/share/vdsm/storage/storageServer.py", line 449, in checkTarget
    fileSD.validateDirAccess(self._path))
  File "/usr/share/vdsm/storage/fileSD.py", line 51, in validateDirAccess
    getProcPool().fileUtils.validateAccess(dirPath)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 277, in callCrabRPCFunction
    *args, **kwargs)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 180, in callCrabRPCFunction
    rawLength = self._recvAll(LENGTH_STRUCT_LENGTH, timeout)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 149, in _recvAll
    timeLeft):
  File "/usr/lib64/python2.7/contextlib.py", line 84, in helper
    return GeneratorContextManager(func(*args, **kwds))
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 136, in _poll
    raise Timeout()
Timeout
Thread-367::INFO::2012-11-08 08:31:41,432::logUtils::39::dispatcher::(wrapper) Run and protect: connectStorageServer, Return response: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-367::DEBUG::2012-11-08 08:31:41,433::task::1151::TaskManager.Task::(prepare) Task=`2844a5a4-7148-4d44-858c-fc75abab1a5f`::finished: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-367::DEBUG::2012-11-08 08:31:41,433::task::568::TaskManager.Task::(_updateState) Task=`2844a5a4-7148-4d44-858c-fc75abab1a5f`::moving from state preparing -> state finished
Thread-367::DEBUG::2012-11-08 08:31:41,434::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-367::DEBUG::2012-11-08 08:31:41,434::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-367::DEBUG::2012-11-08 08:31:41,435::task::957::TaskManager.Task::(_decref) Task=`2844a5a4-7148-4d44-858c-fc75abab1a5f`::ref 0 aborting False
Thread-370::DEBUG::2012-11-08 08:31:41,784::BindingXMLRPC::161::vds::(wrapper) [79.112.94.67]
Thread-370::DEBUG::2012-11-08 08:31:41,784::task::568::TaskManager.Task::(_updateState) Task=`aa8526b6-0772-4909-bad3-b2cd7ad589cc`::moving from state init -> state preparing
Thread-370::INFO::2012-11-08 08:31:41,784::logUtils::37::dispatcher::(wrapper) Run and protect: disconnectStorageServer(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-370::DEBUG::2012-11-08 08:31:41,785::misc::1026::SamplingMethod::(__call__) Trying to enter sampling method (storage.sdc.refreshStorage)
Thread-370::DEBUG::2012-11-08 08:31:41,785::misc::1028::SamplingMethod::(__call__) Got in to sampling method
Thread-370::DEBUG::2012-11-08 08:31:41,785::misc::1026::SamplingMethod::(__call__) Trying to enter sampling method (storage.iscsi.rescan)
Thread-370::DEBUG::2012-11-08 08:31:41,785::misc::1028::SamplingMethod::(__call__) Got in to sampling method
Thread-370::DEBUG::2012-11-08 08:31:41,785::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/sudo -n /sbin/iscsiadm -m session -R' (cwd None)
Thread-370::DEBUG::2012-11-08 08:31:41,810::misc::84::Storage.Misc.excCmd::(<lambda>) FAILED: <err> = 'iscsiadm: No session found.\n'; <rc> = 21
Thread-370::DEBUG::2012-11-08 08:31:41,811::misc::1036::SamplingMethod::(__call__) Returning last result
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:41,814::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host0/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:41,817::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host1/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:41,820::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host2/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:41,822::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host3/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:42,827::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host4/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:42,835::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/dd of=/sys/class/scsi_host/host5/scan' (cwd None)
MainProcess|Thread-370::DEBUG::2012-11-08 08:31:42,842::iscsi::388::Storage.ISCSI::(forceIScsiScan) Performing SCSI scan, this will take up to 30 seconds
Thread-370::DEBUG::2012-11-08 08:31:44,846::misc::84::Storage.Misc.excCmd::(<lambda>) '/bin/sudo -n /sbin/multipath' (cwd None)
Thread-370::DEBUG::2012-11-08 08:31:44,902::misc::84::Storage.Misc.excCmd::(<lambda>) SUCCESS: <err> = ''; <rc> = 0
Thread-370::DEBUG::2012-11-08 08:31:44,903::lvm::477::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,904::lvm::479::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,905::lvm::488::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' got the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,905::lvm::490::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' released the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,906::lvm::508::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,906::lvm::510::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-370::DEBUG::2012-11-08 08:31:44,907::misc::1036::SamplingMethod::(__call__) Returning last result
Thread-370::INFO::2012-11-08 08:31:44,907::logUtils::39::dispatcher::(wrapper) Run and protect: disconnectStorageServer, Return response: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-370::DEBUG::2012-11-08 08:31:44,908::task::1151::TaskManager.Task::(prepare) Task=`aa8526b6-0772-4909-bad3-b2cd7ad589cc`::finished: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-370::DEBUG::2012-11-08 08:31:44,908::task::568::TaskManager.Task::(_updateState) Task=`aa8526b6-0772-4909-bad3-b2cd7ad589cc`::moving from state preparing -> state finished
Thread-370::DEBUG::2012-11-08 08:31:44,909::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-370::DEBUG::2012-11-08 08:31:44,909::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-370::DEBUG::2012-11-08 08:31:44,910::task::957::TaskManager.Task::(_decref) Task=`aa8526b6-0772-4909-bad3-b2cd7ad589cc`::ref 0 aborting False

On Wed, Nov 7, 2012 at 10:00 PM, Itamar Heim <iheim@redhat.com> wrote:

----- Original Message -----
From: "Moran Goldboim" <mgoldboi@redhat.com> To: "Cristian Falcas" <cristi.falcas@gmail.com> Cc: users@ovirt.org Sent: Wednesday, November 7, 2012 8:47:08 AM Subject: Re: [Users] allinone setup can't add storage
You are using the ovirt-sdk from the stable repo and the engine from the nightly repo; those don't work together.
Funnily enough, you will find it is actually in Fedora's stable updates repository, and you can tell from the dist tag that that is where this user (and many others) picked it up from. Refer to:

https://bugzilla.redhat.com/show_bug.cgi?id=851893
https://admin.fedoraproject.org/updates/FEDORA-2012-14464/ovirt-engine-sdk-3...

This has been causing a variety of issues for users since it was pushed in early October. Here are some others from my logs:

Oct 16 06:13:04 <gestahlt> AIO: Error: could not create ovirtsdk API object
Oct 23 19:58:03 <Rudde> okey here is my Error "Exception: Error: could not create ovirtsdk API object
Oct 27 08:13:32 <yaro014> after yum update i'm getting error "ovirt Error: could not create ovirtsdk API object" while engine-setup, could anyone advice ? cannot find anything in google

It breaks installation of oVirt 3.1 AIO on F17 and also, I believe, some of the tools for "normal" installs. In future it would be better if we did not push incompatible versions to stable Fedora releases; it directly contravenes Fedora's updates policy.

Thanks,

Steve
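The version lists in this thread do make the two build flavors easy to tell apart: the nightly snapshots embed a date and git hash in the release field, while the Fedora stable build does not. A small illustrative helper — the classification rule is an informal observation from the package names quoted above, not an official naming contract:

```python
def classify_build(nvr):
    """Classify an RPM name-version-release string as a nightly git
    snapshot or a plain release build, based on the naming seen above."""
    return "nightly snapshot" if ".git" in nvr else "release build"

# Engine packages in the report are nightlies; the SDK is the Fedora build:
print(classify_build("ovirt-engine-3.1.0-3.20121106.git6891171.fc17.noarch"))
# -> nightly snapshot
print(classify_build("ovirt-engine-sdk-3.1.0.4-1.fc17.noarch"))
# -> release build
```

Running something like this over `rpm -qa` output is a quick way to spot a machine that, as here, mixes nightly engine packages with the stable SDK.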

----- Original Message -----
From: "Steve Gordon" <sgordon@redhat.com> To: "Moran Goldboim" <mgoldboi@redhat.com> Cc: users@ovirt.org, "Cristian Falcas" <cristi.falcas@gmail.com> Sent: Friday, November 9, 2012 11:11:23 AM Subject: Re: [Users] allinone setup can't add storage
https://bugzilla.redhat.com/show_bug.cgi?id=851893

Correction, this is the BZ: https://bugzilla.redhat.com/show_bug.cgi?id=869457

I installed only the 3.1 versions of all packages. I dropped the firewall and SELinux. I tried to add a new storage domain with the GUI and via the CLI:

[oVirt shell (connected)]# create storagedomain --host-name local_host --storage-type localfs --storage-path /media/ceva2/Ovirt/Storage --name test --type "data"
error:
status: 400
reason: Bad Request
detail: Cannot add Storage. Internal error, Storage Connection doesn't exist.

And I have the same exception:

Thread-3280::DEBUG::2012-11-10 17:14:27,989::BindingXMLRPC::161::vds::(wrapper) [79.112.99.231]
Thread-3280::DEBUG::2012-11-10 17:14:27,990::task::568::TaskManager.Task::(_updateState) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::moving from state init -> state preparing
Thread-3280::INFO::2012-11-10 17:14:27,991::logUtils::37::dispatcher::(wrapper) Run and protect: validateStorageServerConnection(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage/', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-3280::INFO::2012-11-10 17:14:27,991::logUtils::39::dispatcher::(wrapper) Run and protect: validateStorageServerConnection, Return response: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::1151::TaskManager.Task::(prepare) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::finished: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::568::TaskManager.Task::(_updateState) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::moving from state preparing -> state finished
Thread-3280::DEBUG::2012-11-10 17:14:27,991::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::957::TaskManager.Task::(_decref) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::ref 0 aborting False
Thread-3281::DEBUG::2012-11-10 17:14:28,049::BindingXMLRPC::161::vds::(wrapper) [79.112.99.231]
Thread-3281::DEBUG::2012-11-10 17:14:28,049::task::568::TaskManager.Task::(_updateState) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::moving from state init -> state preparing
Thread-3281::INFO::2012-11-10 17:14:28,049::logUtils::37::dispatcher::(wrapper) Run and protect: connectStorageServer(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage/', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-3281::ERROR::2012-11-10 17:14:28,164::hsm::2057::Storage.HSM::(connectStorageServer) Could not connect to storageServer
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/hsm.py", line 2054, in connectStorageServer
    conObj.connect()
  File "/usr/share/vdsm/storage/storageServer.py", line 462, in connect
    if not self.checkTarget():
  File "/usr/share/vdsm/storage/storageServer.py", line 449, in checkTarget
    fileSD.validateDirAccess(self._path))
  File "/usr/share/vdsm/storage/fileSD.py", line 51, in validateDirAccess
    getProcPool().fileUtils.validateAccess(dirPath)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 277, in callCrabRPCFunction
    *args, **kwargs)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 180, in callCrabRPCFunction
    rawLength = self._recvAll(LENGTH_STRUCT_LENGTH, timeout)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 149, in _recvAll
    timeLeft):
  File "/usr/lib64/python2.7/contextlib.py", line 84, in helper
    return GeneratorContextManager(func(*args, **kwds))
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 136, in _poll
    raise Timeout()
Timeout
Thread-3281::INFO::2012-11-10 17:14:28,165::logUtils::39::dispatcher::(wrapper) Run and protect: connectStorageServer, Return response: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3281::DEBUG::2012-11-10 17:14:28,165::task::1151::TaskManager.Task::(prepare) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::finished: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3281::DEBUG::2012-11-10 17:14:28,165::task::568::TaskManager.Task::(_updateState) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::moving from state preparing -> state finished
Thread-3281::DEBUG::2012-11-10 17:14:28,165::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3281::DEBUG::2012-11-10 17:14:28,166::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3281::DEBUG::2012-11-10 17:14:28,166::task::957::TaskManager.Task::(_decref) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::ref 0 aborting False

Packages installed:

[root@localhost vdsm]# rpm -qa | egrep vdsm\|ovirt-engine | sort
ovirt-engine-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-backend-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-cli-3.1.0.6-1.fc17.noarch
ovirt-engine-config-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-dbscripts-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-genericapi-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-notification-service-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-restapi-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-sdk-3.1.0.4-1.fc17.noarch
ovirt-engine-setup-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-setup-plugin-allinone-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-tools-common-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-userportal-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-webadmin-portal-3.1.0-3.20121109.git2dc9b51.fc17.noarch
vdsm-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-bootstrap-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-cli-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-debuginfo-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-debug-plugin-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-gluster-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-directlun-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-faqemu-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-fileinject-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-floppy-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-hostusb-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-hugepages-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-isolatedprivatevlan-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-numa-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-pincpu-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-promisc-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-qemucmdline-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-qos-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-scratchpad-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-smartcard-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-smbios-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-sriov-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-vhostmd-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-vmdisk-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-python-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-reg-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-rest-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-tests-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-xmlrpc-4.10.1-0.119.git60d7e63.fc17.noarch

On Fri, Nov 9, 2012 at 6:16 PM, Steve Gordon <sgordon@redhat.com> wrote:
----- Original Message -----
From: "Steve Gordon" <sgordon@redhat.com> To: "Moran Goldboim" <mgoldboi@redhat.com> Cc: users@ovirt.org, "Cristian Falcas" <cristi.falcas@gmail.com> Sent: Friday, November 9, 2012 11:11:23 AM Subject: Re: [Users] allinone setup can't add storage
----- Original Message -----
From: "Moran Goldboim" <mgoldboi@redhat.com> To: "Cristian Falcas" <cristi.falcas@gmail.com> Cc: users@ovirt.org Sent: Wednesday, November 7, 2012 8:47:08 AM Subject: Re: [Users] allinone setup can't add storage
You are using the ovirt-sdk from the stable repo and the engine from the nightly repo; those don't work together.
Funnily enough, you will find it is actually in Fedora's stable updates repository, and you can tell from the dist tag that this is where this user (and many others) picked it up from. Refer to:
https://bugzilla.redhat.com/show_bug.cgi?id=851893

Correction, this is the BZ: https://bugzilla.redhat.com/show_bug.cgi?id=869457
https://admin.fedoraproject.org/updates/FEDORA-2012-14464/ovirt-engine-sdk-3...
This has been causing a variety of issues for users since it was pushed in early October. Here are some others from my logs:
Oct 16 06:13:04 <gestahlt> AIO: Error: could not create ovirtsdk API object
Oct 23 19:58:03 <Rudde> okey here is my Error "Exception: Error: could not create ovirtsdk API object
Oct 27 08:13:32 <yaro014> after yum update i'm getting error "ovirt Error: could not create ovirtsdk API object" while engine-setup, could anyone advice ? cannot find anything in google
It breaks installation of oVirt 3.1 AIO on F17, and I believe some of the tools for "normal" installs as well. In the future it would be better if we did not push incompatible versions to stable Fedora releases; it directly contravenes Fedora's updates policy.
Thanks,
Steve
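To spot this situation mechanically, here is a rough heuristic sketch (my own, not an oVirt tool): nightly oVirt builds carry a `.gitNNNN` snapshot in the release tag, while the stable Fedora build of the SDK does not, so the odd one out in `rpm -qa` output can be flagged. The regex and function name are mine.

```python
import re

# Nightly builds embed a snapshot tag like "3.20121106.git6891171";
# stable Fedora builds (e.g. ovirt-engine-sdk-3.1.0.4-1.fc17) do not.
SNAPSHOT_RE = re.compile(r"\d+\.git[0-9a-f]+")

def packages_from_other_repo(packages):
    """Return packages lacking a git snapshot tag when the rest of the
    install has one -- a hint that stable and nightly repos are mixed."""
    nightly = [p for p in packages if SNAPSHOT_RE.search(p)]
    if not nightly:
        return []
    return [p for p in packages if not SNAPSHOT_RE.search(p)]
```

Feeding the package list from this thread through it would single out the SDK package as the one that came from a different repo.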
On 11/07/2012 01:40 PM, Cristian Falcas wrote:
Hi all,
Can someone help me with this error:
AIO: Adding Local Datacenter and cluster... [ ERROR ] Error: could not create ovirtsdk API object
trace from the log file
2012-11-07 13:34:44::DEBUG::all_in_one_100::220::root:: Initiating the API object
2012-11-07 13:34:44::ERROR::all_in_one_100::231::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 228, in initAPI
    ca_file=basedefs.FILE_CA_CRT_SRC,
TypeError: __init__() got an unexpected keyword argument 'ca_file'
2012-11-07 13:34:44::DEBUG::setup_sequences::62::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/setup_sequences.py", line 60, in run
    function()
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 232, in initAPI
    raise Exception(ERROR_CREATE_API_OBJECT)
Exception: Error: could not create ovirtsdk API object
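For what it's worth, the TypeError above is just Python rejecting a keyword that the installed SDK's constructor no longer declares. A minimal illustration with stand-in classes (these are not the real ovirtsdk classes; the names and signatures are invented for the sketch) shows how a setup script could probe the constructor instead of crashing:

```python
import inspect

class OldAPI:
    """Stand-in for an SDK build whose constructor accepts ca_file."""
    def __init__(self, url, username, password, ca_file=None):
        self.url = url

class NewAPI:
    """Stand-in for an SDK build that dropped the ca_file keyword."""
    def __init__(self, url, username, password, insecure=False):
        self.url = url

def ctor_accepts(cls, kwarg):
    # Probe __init__ so version-specific keywords can be passed
    # conditionally instead of raising TypeError at setup time.
    return kwarg in inspect.signature(cls.__init__).parameters
```

With mismatched repo versions, the plugin effectively calls the `NewAPI` shape with the `OldAPI` keyword, which is exactly the traceback shown.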
Versions installed:
ovirt-engine-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-backend-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-cli-3.1.0.6-1.fc17.noarch
ovirt-engine-config-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-dbscripts-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-genericapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-notification-service-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-restapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-sdk-3.1.0.4-1.fc17.noarch
ovirt-engine-setup-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-setup-plugin-allinone-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-tools-common-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-userportal-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-webadmin-portal-3.1.0-3.20121106.git6891171.fc17.noarch
_______________________________________________ Users mailing list Users@ovirt.org http://lists.ovirt.org/mailman/listinfo/users
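Regarding the Timeout traceback in the vdsm log above: `remoteFileHandler.py` reads from a pipe to the remote file handler process under an overall deadline, and raises Timeout when the deadline expires before the data arrives. A simplified sketch of that pattern (my reconstruction, not vdsm's actual code; the names `recv_all` and `Timeout` mirror the trace but the body is illustrative):

```python
import os
import select
import time

class Timeout(Exception):
    """Raised when the overall read deadline expires."""

def recv_all(fd, length, timeout):
    # Read exactly `length` bytes from fd under a single overall
    # deadline: poll until data is ready, then read, repeating until
    # the buffer is full or time runs out.
    deadline = time.monotonic() + timeout
    poller = select.poll()
    poller.register(fd, select.POLLIN)
    data = b""
    while len(data) < length:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            raise Timeout()
        if not poller.poll(int(remaining * 1000)):  # poll() takes ms
            raise Timeout()
        chunk = os.read(fd, length - len(data))
        if not chunk:  # peer closed the pipe before sending enough
            raise Timeout()
        data += chunk
    return data
```

So the Timeout in the log means the helper process never answered the directory-access check in time, not that the check itself returned a failure.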

----- Original Message -----
I install only the 3.1 version on all packages. I dropped the firewall and selinux.
I tried to add a new storage with the gui and via cli: [oVirt shell (connected)]# create storagedomain --host-name local_host --storage-type localfs --storage-path /media/ceva2/Ovirt/Storage --name test --type "data"
error: status: 400 reason: Bad Request detail: Cannot add Storage. Internal error, Storage Connection doesn't exist.
Can you attach the full vdsm.log?
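In case it helps while gathering the log: the failing call in the trace is validateDirAccess, so one common culprit for localfs domains is ownership and permissions on the storage path (vdsm normally needs it owned by the vdsm user and kvm group; uid/gid 36 is the usual default on Fedora, but that is an assumption about this install). A rough pre-flight check, with the ids parameterized for that reason:

```python
import os
import stat

VDSM_UID = 36  # default vdsm user id on Fedora builds (assumption)
KVM_GID = 36   # default kvm group id on Fedora builds (assumption)

def localfs_path_ok(path, uid=VDSM_UID, gid=KVM_GID):
    """Rough pre-flight check for a localfs storage domain path:
    it must exist, be a directory, and be owned uid:gid so the
    vdsm process can read and write it."""
    try:
        st = os.stat(path)
    except OSError:
        return False
    return stat.S_ISDIR(st.st_mode) and st.st_uid == uid and st.st_gid == gid
```

Running this against /media/ceva2/Ovirt/Storage on the host would quickly rule the simple cases in or out before digging into the log.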

Attached is the vdsm log, after trying once with the CLI and once from the GUI.

On Sun, Nov 11, 2012 at 1:03 AM, Ayal Baron <abaron@redhat.com> wrote:
----- Original Message -----
I install only the 3.1 version on all packages. I dropped the firewall and selinux.
I tried to add a new storage with the gui and via cli: [oVirt shell (connected)]# create storagedomain --host-name local_host --storage-type localfs --storage-path /media/ceva2/Ovirt/Storage --name test --type "data"
error: status: 400 reason: Bad Request detail: Cannot add Storage. Internal error, Storage Connection doesn't exist.
Can you attach the full vdsm.log ?

Hi,

This works now with the latest nightly builds.

On Sun, Nov 11, 2012 at 6:34 AM, Cristian Falcas <cristi.falcas@gmail.com> wrote:
attached is the vdsm log after trying one time with cli and one time from gui
On Sun, Nov 11, 2012 at 1:03 AM, Ayal Baron <abaron@redhat.com> wrote:
----- Original Message -----
I install only the 3.1 version on all packages. I dropped the firewall and selinux.
I tried to add a new storage with the gui and via cli: [oVirt shell (connected)]# create storagedomain --host-name local_host --storage-type localfs --storage-path /media/ceva2/Ovirt/Storage --name test --type "data"
error: status: 400 reason: Bad Request detail: Cannot add Storage. Internal error, Storage Connection doesn't exist.
Can you attach the full vdsm.log ?
And I have the same exception:
Thread-3280::DEBUG::2012-11-10 17:14:27,989::BindingXMLRPC::161::vds::(wrapper) [79.112.99.231]
Thread-3280::DEBUG::2012-11-10 17:14:27,990::task::568::TaskManager.Task::(_updateState) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::moving from state init -> state preparing
Thread-3280::INFO::2012-11-10 17:14:27,991::logUtils::37::dispatcher::(wrapper) Run and protect: validateStorageServerConnection(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage/', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-3280::INFO::2012-11-10 17:14:27,991::logUtils::39::dispatcher::(wrapper) Run and protect: validateStorageServerConnection, Return response: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::1151::TaskManager.Task::(prepare) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::finished: {'statuslist': [{'status': 0, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::568::TaskManager.Task::(_updateState) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::moving from state preparing -> state finished
Thread-3280::DEBUG::2012-11-10 17:14:27,991::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3280::DEBUG::2012-11-10 17:14:27,991::task::957::TaskManager.Task::(_decref) Task=`f9dba1a1-365d-4e00-8a54-3dbc70604460`::ref 0 aborting False
Thread-3281::DEBUG::2012-11-10 17:14:28,049::BindingXMLRPC::161::vds::(wrapper) [79.112.99.231]
Thread-3281::DEBUG::2012-11-10 17:14:28,049::task::568::TaskManager.Task::(_updateState) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::moving from state init -> state preparing
Thread-3281::INFO::2012-11-10 17:14:28,049::logUtils::37::dispatcher::(wrapper) Run and protect: connectStorageServer(domType=4, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': '/media/ceva2/Ovirt/Storage/', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '00000000-0000-0000-0000-000000000000', 'port': ''}], options=None)
Thread-3281::ERROR::2012-11-10 17:14:28,164::hsm::2057::Storage.HSM::(connectStorageServer) Could not connect to storageServer
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/hsm.py", line 2054, in connectStorageServer
    conObj.connect()
  File "/usr/share/vdsm/storage/storageServer.py", line 462, in connect
    if not self.checkTarget():
  File "/usr/share/vdsm/storage/storageServer.py", line 449, in checkTarget
    fileSD.validateDirAccess(self._path))
  File "/usr/share/vdsm/storage/fileSD.py", line 51, in validateDirAccess
    getProcPool().fileUtils.validateAccess(dirPath)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 277, in callCrabRPCFunction
    *args, **kwargs)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 180, in callCrabRPCFunction
    rawLength = self._recvAll(LENGTH_STRUCT_LENGTH, timeout)
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 149, in _recvAll
    timeLeft):
  File "/usr/lib64/python2.7/contextlib.py", line 84, in helper
    return GeneratorContextManager(func(*args, **kwds))
  File "/usr/share/vdsm/storage/remoteFileHandler.py", line 136, in _poll
    raise Timeout()
Timeout
Thread-3281::INFO::2012-11-10 17:14:28,165::logUtils::39::dispatcher::(wrapper) Run and protect: connectStorageServer, Return response: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3281::DEBUG::2012-11-10 17:14:28,165::task::1151::TaskManager.Task::(prepare) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::finished: {'statuslist': [{'status': 100, 'id': '00000000-0000-0000-0000-000000000000'}]}
Thread-3281::DEBUG::2012-11-10 17:14:28,165::task::568::TaskManager.Task::(_updateState) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::moving from state preparing -> state finished
Thread-3281::DEBUG::2012-11-10 17:14:28,165::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-3281::DEBUG::2012-11-10 17:14:28,166::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-3281::DEBUG::2012-11-10 17:14:28,166::task::957::TaskManager.Task::(_decref) Task=`430ededc-db77-43cb-a21e-c89cd70772f6`::ref 0 aborting False
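The `Timeout` in the traceback above comes from vdsm's remote file handler: filesystem checks are executed in a separate helper process so that a hung or unresponsive mount cannot block vdsm itself, and the caller gives up if no reply arrives within a deadline. A rough sketch of that pattern (the names `validate_dir_access` and `_check_access` are illustrative, not vdsm's actual API):

```python
import multiprocessing
import os


def _check_access(path, conn):
    # Helper process: if the mount hangs, only this process blocks,
    # never the caller.
    conn.send(os.access(path, os.R_OK | os.W_OK | os.X_OK))


def validate_dir_access(path, timeout=5.0):
    """Run a directory-access check in a child process; raise if no
    answer arrives within `timeout` seconds (sketch of the pattern
    vdsm's remoteFileHandler uses, not its real interface)."""
    parent, child = multiprocessing.Pipe()
    proc = multiprocessing.Process(target=_check_access, args=(path, child))
    proc.start()
    try:
        if not parent.poll(timeout):  # analogous to _poll raising Timeout
            raise TimeoutError("no answer from file handler for %r" % path)
        return parent.recv()
    finally:
        proc.terminate()
        proc.join()
```

With a responsive filesystem the check returns a boolean immediately; in the log above the helper never answered, so vdsm raised `Timeout` and reported status 100 for the connection.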
Packages installed:
[root@localhost vdsm]# rpm -qa | egrep vdsm\|ovirt-engine | sort
ovirt-engine-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-backend-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-cli-3.1.0.6-1.fc17.noarch
ovirt-engine-config-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-dbscripts-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-genericapi-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-notification-service-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-restapi-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-sdk-3.1.0.4-1.fc17.noarch
ovirt-engine-setup-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-setup-plugin-allinone-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-tools-common-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-userportal-3.1.0-3.20121109.git2dc9b51.fc17.noarch
ovirt-engine-webadmin-portal-3.1.0-3.20121109.git2dc9b51.fc17.noarch
vdsm-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-bootstrap-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-cli-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-debuginfo-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-debug-plugin-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-gluster-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-directlun-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-faqemu-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-fileinject-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-floppy-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-hostusb-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-hugepages-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-isolatedprivatevlan-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-numa-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-pincpu-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-promisc-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-qemucmdline-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-qos-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-scratchpad-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-smartcard-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-smbios-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-sriov-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-vhostmd-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-hook-vmdisk-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-python-4.10.1-0.119.git60d7e63.fc17.x86_64
vdsm-reg-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-rest-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-tests-4.10.1-0.119.git60d7e63.fc17.noarch
vdsm-xmlrpc-4.10.1-0.119.git60d7e63.fc17.noarch
On Fri, Nov 9, 2012 at 6:16 PM, Steve Gordon < sgordon@redhat.com > wrote:
----- Original Message -----
From: "Steve Gordon" <sgordon@redhat.com>
To: "Moran Goldboim" <mgoldboi@redhat.com>
Cc: users@ovirt.org, "Cristian Falcas" <cristi.falcas@gmail.com>
Sent: Friday, November 9, 2012 11:11:23 AM
Subject: Re: [Users] allinone setup can't add storage
----- Original Message -----
From: "Moran Goldboim" <mgoldboi@redhat.com>
To: "Cristian Falcas" <cristi.falcas@gmail.com>
Cc: users@ovirt.org
Sent: Wednesday, November 7, 2012 8:47:08 AM
Subject: Re: [Users] allinone setup can't add storage
You are using the ovirt-sdk from the stable repo and the engine from the nightly repo; those don't work together.
Funnily enough, you will find it is actually in Fedora's stable updates repository, and you can tell from the dist tag that that is where this user (and many others) picked it up from. Refer to:
https://bugzilla.redhat.com/show_bug.cgi?id=851893
Correction, this is the BZ: https://bugzilla.redhat.com/show_bug.cgi?id=869457
https://admin.fedoraproject.org/updates/FEDORA-2012-14464/ovirt-engine-sdk-3...
This has been causing a variety of issues for users since it was pushed in early October. Here are some others from my logs:
Oct 16 06:13:04 <gestahlt> AIO: Error: could not create ovirtsdk API object
Oct 23 19:58:03 <Rudde> okey here is my Error "Exception: Error: could not create ovirtsdk API object
Oct 27 08:13:32 <yaro014> after yum update i'm getting error "ovirt Error: could not create ovirtsdk API object" while engine-setup, could anyone advice ? cannot find anything in google
It breaks installation of oVirt 3.1 AIO on F17, and I believe also some of the tools for "normal" installs. In future it would be better if we did not push incompatible versions to stable Fedora releases; it directly contravenes Fedora's updates policy.
Thanks,
Steve
On 11/07/2012 01:40 PM, Cristian Falcas wrote:
Hi all,
Can someone help me with this error:
AIO: Adding Local Datacenter and cluster... [ ERROR ] Error: could not create ovirtsdk API object
trace from the log file
2012-11-07 13:34:44::DEBUG::all_in_one_100::220::root:: Initiating the API object
2012-11-07 13:34:44::ERROR::all_in_one_100::231::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 228, in initAPI
    ca_file=basedefs.FILE_CA_CRT_SRC,
TypeError: __init__() got an unexpected keyword argument 'ca_file'
2012-11-07 13:34:44::DEBUG::setup_sequences::62::root:: Traceback (most recent call last):
  File "/usr/share/ovirt-engine/scripts/setup_sequences.py", line 60, in run
    function()
  File "/usr/share/ovirt-engine/scripts/plugins/all_in_one_100.py", line 232, in initAPI
    raise Exception(ERROR_CREATE_API_OBJECT)
Exception: Error: could not create ovirtsdk API object
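The `TypeError` above is the version mismatch in action: the setup plugin passes a `ca_file` keyword that the older SDK's `API.__init__()` simply does not define. For illustration, a hypothetical guard (this is not the actual setup code; `init_api` and the stand-in classes are invented here) could probe the installed constructor before passing the argument:

```python
import inspect


def init_api(api_cls, url, username, password, ca_file=None):
    """Build an SDK API object, passing ca_file only when the installed
    SDK's constructor actually accepts it (hypothetical helper; the
    older ovirt-engine-sdk rejects the ca_file keyword)."""
    kwargs = {'url': url, 'username': username, 'password': password}
    accepted = inspect.signature(api_cls.__init__).parameters
    if ca_file is not None and 'ca_file' in accepted:
        kwargs['ca_file'] = ca_file
    return api_cls(**kwargs)


class OldAPI:
    """Stand-in for the stable-repo SDK: no ca_file parameter."""
    def __init__(self, url, username, password):
        self.url, self.ca_file = url, None


class NewAPI:
    """Stand-in for the nightly SDK: accepts ca_file."""
    def __init__(self, url, username, password, ca_file=None):
        self.url, self.ca_file = url, ca_file
```

With a guard like this the old SDK would be constructed without TLS verification instead of crashing; the real fix, of course, is installing matching SDK and engine builds.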
Versions installed:
ovirt-engine-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-backend-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-cli-3.1.0.6-1.fc17.noarch
ovirt-engine-config-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-dbscripts-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-genericapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-notification-service-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-restapi-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-sdk-3.1.0.4-1.fc17.noarch
ovirt-engine-setup-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-setup-plugin-allinone-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-tools-common-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-userportal-3.1.0-3.20121106.git6891171.fc17.noarch
ovirt-engine-webadmin-portal-3.1.0-3.20121106.git6891171.fc17.noarch
_______________________________________________ Users mailing list Users@ovirt.org http://lists.ovirt.org/mailman/listinfo/users

----- Original Message -----
Hi,
This works now with the latest nightly builds.
Thanks for the update!
On Sun, Nov 11, 2012 at 6:34 AM, Cristian Falcas < cristi.falcas@gmail.com > wrote:
attached is the vdsm log after trying one time with cli and one time from gui
On Sun, Nov 11, 2012 at 1:03 AM, Ayal Baron < abaron@redhat.com > wrote:
----- Original Message -----
I installed only the 3.1 version of all packages. I disabled the firewall and SELinux.
I tried to add a new storage domain from the GUI and via the CLI:
[oVirt shell (connected)]# create storagedomain --host-name local_host --storage-type localfs --storage-path /media/ceva2/Ovirt/Storage --name test --type "data"
error:
status: 400
reason: Bad Request
detail: Cannot add Storage. Internal error, Storage Connection doesn't exist.
Can you attach the full vdsm.log ?
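Beyond the log, one common cause of local-path storage failures worth ruling out is ownership and permissions: vdsm runs as vdsm:kvm (uid/gid 36 on Fedora), and the local storage directory must be owned by that user and be at least rwxr-xr-x. A hypothetical pre-flight check (helper name and exact checks are illustrative, not part of oVirt):

```python
import os
import stat


def check_local_storage_path(path, uid=36, gid=36):
    """Return a list of problems that would stop vdsm (vdsm:kvm,
    uid/gid 36 on Fedora) from using `path` as a local storage domain.
    Empty list means the basic ownership/permission checks pass."""
    st = os.stat(path)
    problems = []
    if st.st_uid != uid or st.st_gid != gid:
        problems.append("owner is %d:%d, expected %d:%d (vdsm:kvm)"
                        % (st.st_uid, st.st_gid, uid, gid))
    mode = stat.S_IMODE(st.st_mode)
    if mode & 0o755 != 0o755:
        problems.append("mode %o lacks the rwxr-xr-x bits" % mode)
    return problems
```

Running this against /media/ceva2/Ovirt/Storage before retrying the `create storagedomain` command would show immediately whether a `chown 36:36` / `chmod 755` is needed.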
participants (5)
-
Ayal Baron
-
Cristian Falcas
-
Itamar Heim
-
Moran Goldboim
-
Steve Gordon