99 Million Cell Cube with 100k e.list slices

Started by Charles.Lorenz, 26 Apr 2007 07:18:08 AM

Charles.Lorenz

Can anyone comment on cube sizes in Analyst versus Contributor, and the pitfalls of large cubes?

I'd like to put a list together for this area to help newer Planning developers understand why cube sizes need to stay small, although the e.list slice also needs to be taken into account when thinking through size.

I have a pretty good list in my head, but want to hear from others on the topic.

thanks
Charlie

Wouter

#1
Hello,

I am currently experimenting with cut-down models and dynamic access tables. Basically, where I used to think that size does matter (...), I'm currently changing my opinion.

In general I use
* cut-down models for the e.list items
* dynamic access tables to divide large D-Lists (I put IDs in the labels of the d-list items, then use a filter to cut the d-list down into several smaller parts for access tables, which are imported from Analyst cube exports (..); see the sketch below)
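
Purely to illustrate that second bullet (this is a plain-Python sketch over a CSV export, not anything Cognos-specific; the "REGION01|Item name" label format, the delimiter and the file layout are all assumptions on my part):

    import csv
    from collections import defaultdict

    def split_export_by_label_id(export_path, delimiter="|"):
        """Group rows of a cube export by the ID embedded in the d-list
        item label, e.g. "REGION01|Item name" -> group "REGION01"."""
        groups = defaultdict(list)
        with open(export_path, newline="") as f:
            for row in csv.reader(f):
                label = row[0]
                group_id = label.split(delimiter, 1)[0]  # ID prefix in the label
                groups[group_id].append(row)
        return groups

    def write_access_tables(groups, out_prefix="access_table_"):
        """Write one small access-table import file per group."""
        for group_id, rows in groups.items():
            with open(out_prefix + group_id + ".csv", "w", newline="") as f:
                csv.writer(f).writerows(rows)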

E.g. including the e.list, I had a cube containing ~45.6 million cells (65 (e.list) * 1170 * 15 * 2 * 2 * 10). By using cut-down models and the access tables, I get down to ~10,800 cells per piece: cut-down models cut my e.list (/65) and access tables cut my largest D-List (also /65). That means each piece takes 10,800 cells * 4 bytes / 1024 = ~42 KB in memory, so the memory usage is negligible. Of course it takes a lot of extra work, plus you need to be able to cut some of your larger d-lists based on a simple rule.
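
Just to make that arithmetic explicit (the 4 bytes per cell is the figure used above, not something I have measured):

    dims = [65, 1170, 15, 2, 2, 10]           # e.list size and the five d-list sizes
    full_cube = 1
    for d in dims:
        full_cube *= d                         # 45,630,000 cells, i.e. ~45.6 million

    per_slice = full_cube // 65 // 65          # e.list cut /65, largest d-list cut /65
    print(per_slice)                           # 10800 cells per piece

    bytes_per_cell = 4
    print(per_slice * bytes_per_cell / 1024)   # 42.1875, i.e. ~42 KB per piece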

The official Cognos rule is that you should not exceed 2 GB of memory (500 million cells), but I doubt whether that will work at all (still talking about Contributor). I think a general rule that can be applied is that the number of cells in the computer's memory should be the same as the number of cells the human memory can process at once.
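
(For reference, assuming the same 4 bytes per cell as above: 500,000,000 cells * 4 bytes = 2,000,000,000 bytes, which is right at the 2 GB mark, so the two figures in that rule are presumably the same limit stated two ways.)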

In Analyst the main thing is that if you want to open or process a large cube (5 million+ cells) in a single take, with some breakbacks etc., it can take some time (...). The solution here is indeed to use smaller cubes or to update one slice at a time. It will depend on what requires the least amount of work: making one big cube and setting up a sliced update/calculation process, or making a number of similar smaller ones. When partial updates are possible there is not really a limit, as long as you do not open the cube entirely or want to publish it entirely. When it is used by humans, I guess a million cells is still manageable, and anything bigger than 5 million cells becomes a problem (regardless of the amount of memory available).

People with other ideas, please share.

BR,

Wouter