Getting Memory allocation Error in Framework Manager

Started by nbailey, 22 Nov 2011 01:47:08 PM


nbailey

Running Cognos 8.4.1...
I am having a memory problem when trying to update a Cognos query subject in Framework Manager.
My warehouse tables reside in a MySQL database on the same server as the Cognos application.
I have a large MySQL table that I have added a new column to, and I am trying to update the table definition in Cognos Framework Manager in order to pull the new version of the MySQL table into Cognos. Smaller tables work fine, but on this larger table I get the error message shown below. The table size is 1.2 GB and I have 16 GB of memory for my MySQL server. I am running the community version 5.1 of MySQL. I have looked into MySQL and I don't think this is the problem... has anyone encountered this problem, or have any suggestion as to what may be the cause?

RQP-DEF-0177 An error occurred while performing operation 'sqlPrepareWithOptions' status='-9'.
UDA-SQL-0107 A general exception has occurred during the operation "prepare".
[MySQL][ODBC 3.51 Driver][mysqld-5.1.49-community-log]MySQL client ran out of memory


Lynn

You can try adding a design filter that brings back a small set of data.
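In case it helps anyone reading later: a design filter is just an embedded filter on the query subject whose usage is set to apply only in design mode, so it restricts rows while you test in Framework Manager but not at run time. A minimal sketch of such a filter expression, with purely hypothetical namespace, table and column names:

  [Database Layer].[fact_table].[row_id] < 100

With something like that in place, testing the query subject should only pull a handful of rows from MySQL.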


Sent from my iPhone using Tapatalk

blom0344

For testing / updating purposes FM by default fetches a small set of rows. When you test, go to the SQL tab --> Options and you can change the number of rows. Unfortunately Cognos also allows you to deselect it entirely, causing the entire table to be read. Perhaps this is why you get an out-of-memory error. Check the BMT.exe process; its memory use will then explode.

nbailey

Thanks for the 2 replies to this post... I have tried both suggestions.
The number of rows for the test is limited to 25. I have added a design filter limiting the number of rows, and also set the filter to a value that would return no rows... in all cases I am still getting the same error.

Do you know if there is a limit in FM as to the number of columns a table can have? This particular table has 101 columns.

blom0344

My models have tables with close to 500 columns, so something else must be amiss. What happens when you import the table through the metadata wizard into a new namespace (after temporarily renaming the existing query subject)? Are you updating imported tables as a batch or one by one?

nbailey

Very interesting... if I rename the existing query subject and import a new one, I am able to get the table into Cognos with the 2 new fields I have added. But when I test the new query subject in FM, I still get the same error message.

Just one table imported, not multiple tables.

blom0344

I am guessing by now, but are the added fields some kind of binary ones? I doubt that this is a Cognos issue; it looks like it has more to do with MySQL / ODBC.

nbailey

The 2 new columns are varchar(12) and varchar(30)... Does anyone have any experience with MySQL ODBC drivers? Is there a size or column limitation setting?
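For reference, the change to the underlying table was along these lines (the table and column names below are made up; only the datatypes are the real ones):

  ALTER TABLE fact_table
      ADD COLUMN new_code  VARCHAR(12),
      ADD COLUMN new_descr VARCHAR(30);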

cognostechie

I used MySQL with FM a few years ago and at that time I had some problems too when I used the latest ODBC driver for MySQL. It would test fine but not run the query subjects. After a lot of brain wracking, I found that an older version of the ODBC driver ran better. I think it was 3.5. There was some documentation about why that version should be used, but I don't have it anymore.

nbailey

Thanks for that reply.
I am currently running version 3.51.27 of the MySQL ODBC driver...
Do you know if yours was earlier than this version?

cognostechie

Not sure, but I vaguely remember that it was 3.5 and then I tried 3.1. Try it.

nbailey

In case anyone is interested... I found the solution to this.
Looking at this page for the mysql_use_result() function, http://dev.mysql.com/doc/refman/5.1/en/mysql-use-result.html,
I was able to set the option that enables mysql_use_result in the ODBC Data Source Administrator utility, restart the service, and retest. This setting forces the result set to be kept on the server instead of being cached by the ODBC driver.
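For anyone else who runs into this: in the Connector/ODBC 3.51 DSN dialog the relevant setting is the one about not caching results of forward-only cursors, which is what makes the driver fetch rows with mysql_use_result() instead of buffering the whole result set in the client. If you connect with a connection string rather than a DSN, my understanding (please verify against the Connector/ODBC documentation for your driver version) is that the equivalent looks roughly like this, where server, database and credentials are placeholders and OPTION is the driver's bit-flag field:

  DRIVER={MySQL ODBC 3.51 Driver};SERVER=localhost;DATABASE=warehouse;UID=cognos;PWD=xxxx;OPTION=1048576

The 1048576 value is the bit I believe corresponds to the "don't cache forward-only results" flag; if you already use other OPTION bits, combine them by adding the values together.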


;)