DMR Vs Data Volume Vs YOUR own experience

Started by mederik, 03 Nov 2014 04:09:45 PM


mederik

Hi everyone,

I would like to hear your opinions about implementing DMR in a real project.
I'm about to implement a solution and I'm weighing whether or not to use DMR. The problem is that I hear different stories.

Some say there are no performance issues with fact tables of up to 15 million rows, provided there is a good index strategy. But others say one should not consider this solution for fact tables of more than a couple of thousand rows.
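For illustration, I imagine the "good index strategy" they mention looks something like the sketch below (a hypothetical fact_sales table and column names, not our actual model):

-- Purely illustrative: an index on every dimension foreign key of the
-- fact table, so the joins generated for the DMR package can seek
-- instead of scan.
CREATE INDEX ix_fact_sales_time ON fact_sales (time_key);
CREATE INDEX ix_fact_sales_org  ON fact_sales (org_key);

-- A covering index for the most common rollup (revenue by time and
-- organization), so frequent aggregations never touch the base table.
CREATE INDEX ix_fact_sales_time_org
    ON fact_sales (time_key, org_key)
    INCLUDE (revenue, quantity);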

So where is the truth? I guess that among you there are people who have implemented DMR and can give me a sound, honest account of their experience.

So please tell me: in what context have you worked with DMR, and was it reliable and fast? What are its drawbacks, and what are its real limits in terms of data volume, etc.?

I would be most grateful if you would share your real experience with this.

Mederik

cognos810

Hello Mederik,
In my opinion, what you have heard is a good mix of both true and false statements(it is definitely more than a couple of thousands though  ;D). I will not delve into that, but rather discuss a good implementation I did for a school district dashboard. An interactive dashboard that does require really good performance times.

Getting the best results with DMR depends heavily on how your mart/warehouse is structured; the volume of data comes much later in the picture. The product documentation recommends that you work with a star schema already defined in the database layer. For my implementation I used DQM (Dynamic Query Mode), as my relational source was SQL Server, which is supported by DQM. The entire report development was then done using MDX/dimensional-style reporting.
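To make the star-schema recommendation concrete, here is a minimal sketch of the kind of structure I mean (hypothetical names, not my actual district schema): one fact table joined to conformed dimension tables on surrogate keys.

-- Minimal star schema sketch (hypothetical names).
CREATE TABLE dim_time (
    time_key  INT PRIMARY KEY,
    the_date  DATE,
    the_month INT,
    the_year  INT
);

CREATE TABLE dim_org (
    org_key    INT PRIMARY KEY,
    org_name   VARCHAR(100),
    org_parent INT             -- parent unit, drives the org hierarchy
);

CREATE TABLE fact_sales (
    time_key INT REFERENCES dim_time (time_key),
    org_key  INT REFERENCES dim_org (org_key),
    revenue  DECIMAL(18,2),
    quantity INT
);

In Framework Manager you then model the DMR hierarchies and levels directly on top of tables like these.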

DQM greatly improves the performance of dimensionally modeled relational (DMR) packages. Rather than the complex SQL statements formerly generated for DMR packages, the DQM engine generates true MDX queries, which it then decomposes into smaller, better-performing SQL queries. Along with the other improvements DQM offers, the net effect is that DMR queries should run significantly faster in most implementations. This applies to Cognos 10.1.1 or later.
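As a purely conceptual illustration (this is not the actual SQL Cognos generates), the difference is roughly the following, using the hypothetical star schema above:

-- CQM-style: one large generated statement that does everything at once.
SELECT t.the_year, o.org_name, SUM(f.revenue)
FROM fact_sales f
JOIN dim_time t ON t.time_key = f.time_key
JOIN dim_org  o ON o.org_key  = f.org_key
GROUP BY t.the_year, o.org_name;

-- DQM-style: the MDX is decomposed; dimension members are fetched
-- (and cached) separately ...
SELECT time_key, the_year FROM dim_time;
SELECT org_key, org_name FROM dim_org;

-- ... and the fact rollup becomes a narrow aggregate over keys only,
-- stitched to the cached members inside the engine.
SELECT time_key, org_key, SUM(revenue)
FROM fact_sales
GROUP BY time_key, org_key;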

When it comes to data volume, this IBM KB article may give you some insight:
http://www-01.ibm.com/support/docview.wss?uid=swg27036155
It states that with DMR, performance starts to suffer when the volume of data approaches 20-25 million rows.

I would suggest that you do a POC implementation on your end to see whether a star-schema structure in the database, plus DQM, plus DMR gives you acceptable performance, and then make your decision.

-Cognos810

mederik

Hi Cognos810.

Thanks for your very interesting input. A few questions, if you please:

In our data model we have two major conformed dimensions: TIME and ORGANIZATION (company structure).

1 - Most of our fact tables actually store facts pre-aggregated on the TIME dimension, the way an OLAP cube would. Is this particularity a blocking point for implementing DMR and DQM?
2 - Is it possible to mix relational dimensions and DMR dimensions associated with a measure axis in one package and still use DQM?
3 - Is it possible to implement DMR and DQM with facts that have different levels of granularity on the TIME and ORGANIZATION axes within the same fact table? (See the sketch below for what I mean.)
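To illustrate question 3, here is a hypothetical sketch of what I mean by mixed granularity in one fact table:

-- Hypothetical sketch: facts at different grains in the same table,
-- flagged by level indicator columns.
CREATE TABLE fact_mixed_grain (
    time_key   INT,            -- e.g. 20141103 (a day) or 201411 (a month)
    time_level VARCHAR(10),    -- 'DAY' or 'MONTH'
    org_key    INT,
    org_level  VARCHAR(12),    -- 'DEPARTMENT' or 'COMPANY'
    revenue    DECIMAL(18,2)
);

INSERT INTO fact_mixed_grain VALUES
    (20141103, 'DAY',   501, 'DEPARTMENT',  1250.00),  -- daily, department-level row
    (201411,   'MONTH',   1, 'COMPANY',    98200.00);  -- monthly, company-level (pre-aggregated) row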

cognostechie

I did a thorough test on this two months ago. I created a DMR package and a cube pointing to the same tables, with the same amount of data, the same DB, and the same server. The DMR package was created using DQM, and I have to say that DQM wins over CQM, no doubt. However, it did not perform as well as the cube. My cube never takes more than 2 seconds, no matter how much data you put in the report, no matter how many times you run the report or at what time of day, and no matter how many users are connected to it. DMR, however, takes quite a bit of time the first time you run a report; from the second run onwards it is quick. But if you don't run the report for about 30 minutes and then run it again, it takes as long as it did on the first run. It looks like the cache times out.

barrysaab

I'd like to add to cognostechie's great insight: DMR is basically still a relational setup of queries that is forced to behave multidimensionally, and that is where the performance equation comes in. Thanks
Boy! Cognos getting on to me!!!