COGNOiSe.com - The IBM Cognos Community

Legacy Business Intelligence => COGNOS PowerPlay => Topic started by: ykud on 21 Jul 2005 08:33:00 AM

Title: Cube & Fact table size
Post by: ykud on 21 Jul 2005 08:33:00 AM
PP, being a MOLAP tool with file-based storage, suffers as fact data size increases.
Therefore:
1. What was your maximum cube / fact table size?
2. What were your query response times (e.g. on complex Advanced Subsets)?
3. Did cube groups help you with this issue?

Mine, for a start:
1. 200 MB / 10 GB
2. 30 sec
3. Nope, that was only a week's worth of data
Title: Re: Cube & Fact table size
Post by: forsasi on 24 Jul 2005 11:10:37 PM
It doesn't matter how big your cube is; it depends on the number of dimensions and measures. If the number of dimensions increases, the query response time will increase. Please check the optimization tips in the Cognos PDFs...
Title: Re: Cube & Fact table size
Post by: ykud on 25 Jul 2005 01:31:48 AM
Surely that's so. But there is certainly a correlation between fact table size and the number of dimensions :))).
And clients usually ask: "We have a 0.5 PB data warehouse; what is your cube maximum?"

I can post a dimension chart as well. Would that be of interest?
Title: Re: Cube & Fact table size
Post by: Darek on 04 Aug 2005 10:59:20 AM
I've been successfully running cubes in excess of 2.5 GB, even before Cognos officially supported such monsters. There are many factors that will determine how big your cube is (a back-of-envelope sizing sketch follows at the end of this post):

1. # of dimensions
2. # of measures and their storage type
3. # of partitions (auto or manual)

Having the cube time-partitioned helps performance as well, but there are limitations on what a time-partitioned cube can contain (only one real time dimension, for instance; though, you know, there are fake ones too).

Response times are usually within the 3-30 second range, depending on query complexity, if running on an Intel platform. Any RISC hardware would be a waste of time, regardless of how good the industry thinks it is.
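
To put those three factors together, here is a rough back-of-envelope sizing sketch in Python. The cardinalities, cell density, and bytes-per-value figures below are assumptions for illustration only, not PowerPlay's actual storage model:

from math import prod

# Hypothetical dimension cardinalities: day, product, store, region
dim_cardinalities = [1096, 5000, 400, 20]
n_measures = 3          # factor 2: number of measures (assumed)
bytes_per_value = 8     # assumed storage type: 8-byte values
density = 0.002         # assumed fraction of possible cells actually populated

possible_cells = prod(dim_cardinalities)
populated_cells = possible_cells * density
size_bytes = populated_cells * n_measures * bytes_per_value
print(f"~{size_bytes / 2**30:.2f} GB before index/partition overhead")  # ~1.96 GB

The point of the exercise: dimension count and cardinality drive the number of possible cells multiplicatively, which is why factor 1 dominates.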
Title: Re: Cube & Fact table size
Post by: Darek on 28 Sep 2005 01:21:19 PM
Additional comments:

1. There is a direct correlation between a cube's size and its run-time performance: a 5 MB cube will perform much faster than a 5 GB one.

2. There is no physical limit on how big a cube can be. In my experience, I have used cubes as large as 10 GB with decent performance.


[To other moderators: If you think the next paragraph is of self-marketing nature, please remove]
I'd like to invite everyone around Chicagoland who is interested in more details to a presentation on this very subject during Performance 2005 on Oct 14th.
Title: Re: Cube & Fact table size
Post by: ykud on 29 Sep 2005 01:54:01 AM
Darek, do you, by any chance, know anything about PP's internal processing algorithm?

I still hope to find time someday to run a rather dull but very informative test: loading the same fact table with different sort orders. The results should differ if any order-sensitive optimization techniques are in use :)
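
A sketch of that test, for the record. Everything here is hypothetical scaffolding: the file names are made up, and the cube-building step happens outside the script in Transformer; the only point is that both files contain identical rows in different orders:

import csv
import random

random.seed(42)
rows = [(random.randint(1, 365), random.randint(1, 500), round(random.random() * 100, 2))
        for _ in range(100_000)]

def dump(path, data):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["day", "product", "sales"])
        writer.writerows(data)

dump("facts_random.csv", rows)          # arrival order
dump("facts_sorted.csv", sorted(rows))  # sorted by day, then product

Then point the same Transformer model at each file, build both cubes, and compare the .mdc sizes and build times. Any difference would indicate an order-sensitive optimization at work.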

Just a remark: rather fast MOLAP engines are being devised in academia (e.g. QC, DWARF), but that doesn't seem to have reached Cognos. Pity.
Title: Re: Cube & Fact table size
Post by: Darek on 29 Sep 2005 10:16:43 AM
Unfortunately, I don't. But that DWARF thing looks promising.
Title: Re: Cube & Fact table size
Post by: ykud on 30 Sep 2005 01:32:31 AM
There's still a lot of work to do on them, though.
No incremental updates, no multi-measure support.

See Quotient Trees & Quotient Cubes -- there's a brilliant idea inside (a toy sketch of it follows).
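
The core idea of the Quotient Cube (Lakshmanan, Pei & Han, VLDB 2002) is that cube cells covering exactly the same set of base rows must share the same aggregate value, so they can be coalesced into a single stored class. A toy Python sketch with made-up data:

from itertools import combinations

# Tiny fact table: (store, product, sales)
rows = [
    ("S1", "P1", 6),
    ("S1", "P2", 12),
    ("S2", "P1", 9),
]
n_dims = 2

total_cells = 0
classes = {}  # frozenset of covered row ids -> list of (cell, SUM(sales))
for k in range(n_dims + 1):
    for keep in combinations(range(n_dims), k):
        # Enumerate every cell at this group-by level ("*" = aggregated dim).
        cells = {}
        for i, row in enumerate(rows):
            cell = tuple(row[d] if d in keep else "*" for d in range(n_dims))
            cells.setdefault(cell, set()).add(i)
        total_cells += len(cells)
        for cell, cover in cells.items():
            agg = sum(rows[i][2] for i in cover)
            classes.setdefault(frozenset(cover), []).append((cell, agg))

print(f"full cube: {total_cells} cells; quotient cube: {len(classes)} classes")
for cover, members in classes.items():
    print(sorted(cover), "->", members)

On this toy table the 8 cube cells collapse into 6 classes -- e.g. ("S2", "*") and ("S2", "P1") cover only the third row, so they share the aggregate 9 and are stored once. On real, sparse data the savings are far larger.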