RAM Utilization while building a cube

Started by ComplexQry, 08 Apr 2011 09:39:54 AM

ComplexQry

Is there a configuration setting that controls how much RAM Transformer can use while building a cube? I have 4 GB of RAM, but the most I have seen Transformer use is around 160 MB. I would think that after records have been pulled from the database, RAM would really help with the sorting and such. Thoughts?

Arsenal

Assuming that it's a Windows-based cube, the answer, to the best of my knowledge, is that you can specify it indirectly.
It might help to understand how RAM is actually used while a cube is being built. While the amount of available memory is critical, it is not the only variable that drives cube build times. CPU and disk configuration also play very important roles, as does the meat and potatoes of the cube: the actual query.

Coming back to RAM, Transformer's use of available memory peaks while it is generating categories (so a huge number of categories means you had better have a large amount of physical RAM), drops off slightly during the metadata update stage but remains quite high, and then drops significantly when the actual data is being written to the cube (at which point disk configuration plays a much bigger role than RAM). So it is while records are being pulled that RAM matters most, not "after records have been pulled." If the working memory available to Transformer is lower than what it needs, the page file starts being used, and that can be a big performance hit.

Since the category creation stage typically requires the most memory, you can tell Transformer to read in more categories/transactions per commit, i.e. "cache" more transactions before writing to disk, to take advantage of a large amount of physical memory. This is set on Transformer's File > Preferences > General tab under "Maximum number of transactions per commit." You can play around with that number; try setting it quite high.
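If you'd rather set it outside the GUI, my understanding is that the same preference lives in Transformer's preferences file (trnsfrmr.ini on the Windows installs I've seen). The setting name below is from memory, so verify it against your version's documentation before relying on it:

[Transformer]
; Assumed name for "Maximum number of transactions per commit" -- verify.
MaxTransactionNum=500000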

Also check that your system is not restricting the RAM available to Transformer. If all physical RAM is available with no restrictions, and you have 4 GB but only around 160 MB is being used, that might suggest RAM really is not the bottleneck for you, and you may want to look at the query itself if you're not happy with the build time of the cube.

What do the other gurus have to say?

Arsenal

Also, with Unix you have more control over RAM use, and the Unix version of Transformer has more settings, but I'm not aware of all of them.

jive

In the cube-building process, some steps are bound more by disk than by memory or CPU. In the sort phase, depending on your settings, Transformer keeps a certain number of rows in memory and sorts them there.
If you build a cube and trace the memory usage, disk I/O, and CPU of the process, you will see the fluctuations as Transformer moves from step to step.
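Something along these lines works for the tracing. It's a rough sketch, not anything from the Cognos tooling: it assumes Python 3 with the third-party psutil package installed, and that the Transformer executable is named cogtr.exe (adjust PROCESS_NAME for your version and platform):

# Trace RAM, disk I/O, and CPU of a running Transformer cube build.
import psutil

PROCESS_NAME = "cogtr.exe"  # assumed executable name -- check yours

proc = next((p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == PROCESS_NAME), None)
if proc is None:
    raise SystemExit(f"no process named {PROCESS_NAME} found")

# Sample roughly once per second until the build finishes.
try:
    while proc.is_running():
        mem_mb = proc.memory_info().rss / (1024 * 1024)
        io = proc.io_counters()  # available on Windows and Linux
        cpu = proc.cpu_percent(interval=1.0)
        print(f"RAM {mem_mb:8.1f} MB | "
              f"read {io.read_bytes // 2**20} MB | "
              f"write {io.write_bytes // 2**20} MB | "
              f"CPU {cpu:5.1f}%")
except psutil.NoSuchProcess:
    pass  # process exited between samples

Watching that output during a build should show the pattern Arsenal described: memory climbing during category generation, then disk writes dominating once data is written to the cube.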

Check the Cognos documentation for Transformer on:
-- setting memory
-- parallel queries
-- CPU allocation
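From what I remember, those end up in the Unix preferences file (trnsfrmr.rc, if memory serves) as entries along these lines. The names below are from my memory of the documentation, so treat them as assumptions and verify them before relying on them:

[Transformer]
; Both names are from memory of the docs -- verify against your version.
MaxTransactionNum=500000   ; transactions cached per commit
WriteCacheSize=32768       ; cube write cache, in KB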

I haven't worked in a Unix environment for a long time, so I'm sorry I can't tell you the exact variable names.

Regards, Jacques