Cognos dynamic cube performance

Started by Raghuvir, 10 Jan 2016 08:26:19 AM


Raghuvir

Hi All,

We have built a report in Cognos 10.2.2 using the GO Data Warehouse (analysis) package - a report of quantity by order number. This report runs in around 17 seconds.

We then built a dynamic cube and recreated the same report using the cube as the source. We also ran Dynamic Query Analyzer and applied its recommendations, but the report still takes the same time to execute.

Please guide us if we are going wrong somewhere.


Thanks in advance.

Regards

MFGF

Quote from: Raghuvir on 10 Jan 2016 08:26:19 AM
Hi All,

We have built a report in Cognos 10.2.2 using the GO Data Warehouse (analysis) package - a report of quantity by order number. This report runs in around 17 seconds.

We then built a dynamic cube and recreated the same report using the cube as the source. We also ran Dynamic Query Analyzer and applied its recommendations, but the report still takes the same time to execute.

Please guide us if we are going wrong somewhere.


Thanks in advance.

Regards

You didn't mention whether you enabled workload logging before running the report and then running Aggregate Advisor. I suspect you might not have?

MF.
Meep!

Raghuvir

Quote from: MFGF on 11 Jan 2016 03:10:13 AM
You didn't mention whether you enabled workload logging before running the report and then running Aggregate Advisor. I suspect you might not have?

MF.


Hi MFGF,

Yes, we have enabled workload logging. While running the Dynamic Cubes Aggregate Advisor we selected "Cube structure and Query Workload Information".


Could it be something to do with the number of records in the database? I mean, can we only really test the performance of dynamic cubes when we have a large number of records in the database?


Please advise.

Regards

MFGF

Quote from: Raghuvir on 11 Jan 2016 04:53:38 AM
Hi MFGF,

Yes, we have enabled workload logging. While running the Dynamic Cubes Aggregate Advisor we selected "Cube structure and Query Workload Information".

Could it be something to do with the number of records in the database? I mean, can we only really test the performance of dynamic cubes when we have a large number of records in the database?

Please advise.

Regards

A Dynamic Cube ought to deliver super-quick performance for both small and large data volumes.

Did you implement the in-database aggregates suggested by Aggregate Advisor? Once you created them in the database, did you then model them appropriately in Cube Designer?
How many in-memory aggregates were suggested? Did you implement them?

MF.
Meep!

Raghuvir

Quote from: MFGF on 11 Jan 2016 11:18:12 AM
A Dynamic Cube ought to deliver super-quick performance for both small and large data volumes.

Did you implement the in-database aggregates suggested by Aggregate Advisor? Once you created them in the database, did you then model them appropriately in Cube Designer?
How many in-memory aggregates were suggested? Did you implement them?

MF.

Hi MFGF,

The number of in-memory aggregates suggested is 15. We applied the in-memory recommendations by going to the File tab and selecting "Apply Selected In-Memory Recommendations...".


The count of in-database recommendations is 2. There is an option to save the in-database recommendations, which we did, but we haven't applied them yet as we are not clear on the process for doing so.

When we went through the Dynamic Cubes Redbook, it says we need to give the saved file for the in-database aggregates to the DBA and modeler. We are not sure about this process, so please guide us through it if possible.

Regards

schrotty

Hi Raghuvir,

Quote
The number of in-memory aggregates suggested is 15. We applied the in-memory recommendations by going to the File tab and selecting "Apply Selected In-Memory Recommendations...".

After you have done this, you have to restart the dynamic cube.
Then check whether the in-memory aggregates were loaded successfully (navigate to the Query Service in Cognos Administration and check the loaded/defined in-memory aggregates in the cube metrics).

To model in-database aggregates, you need to create a database table and load it with data using the generated SQL statement. Then you need to modify your model in Cube Designer:
http://www-01.ibm.com/support/knowledgecenter/SSEP7J_10.2.0/com.ibm.swg.ba.cognos.ug_cog_rlp.10.2.0.doc/c_cog_rlp_modelinganaggregatetable.html
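
As a very rough illustration only - the real DDL and load SQL come from the file you saved in Aggregate Advisor, and the table and column names below (AGG_QTY_BY_ORDER_METHOD, GOSALESDW.SLS_SALES_FACT and so on) are just assumptions based on the GO sample, not what your recommendation file will actually contain - the hand-off to the DBA usually boils down to something like:

-- Create the aggregate table; its structure comes from the saved recommendation
CREATE TABLE GOSALESDW.AGG_QTY_BY_ORDER_METHOD (
    ORDER_METHOD_KEY  INTEGER NOT NULL,
    ORDER_DAY_KEY     INTEGER NOT NULL,
    QUANTITY          BIGINT
);

-- Load it using the generated SQL saved with the in-database recommendation
INSERT INTO GOSALESDW.AGG_QTY_BY_ORDER_METHOD (ORDER_METHOD_KEY, ORDER_DAY_KEY, QUANTITY)
SELECT ORDER_METHOD_KEY, ORDER_DAY_KEY, SUM(QUANTITY)
FROM GOSALESDW.SLS_SALES_FACT
GROUP BY ORDER_METHOD_KEY, ORDER_DAY_KEY;

Once the table exists and is loaded, the modeler maps it on the Aggregates tab in Cube Designer (see the link above), then republishes the cube and restarts it so matching queries can be routed to the aggregate.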


Schrotty

MFGF

Quote from: Raghuvir on 12 Jan 2016 12:49:47 AM
Hi MFGF,

The number of in-memory aggregates suggested is 15. We applied the in-memory recommendations by going to the File tab and selecting "Apply Selected In-Memory Recommendations...".

The count of in-database recommendations is 2. There is an option to save the in-database recommendations, which we did, but we haven't applied them yet as we are not clear on the process for doing so.

When we went through the Dynamic Cubes Redbook, it says we need to give the saved file for the in-database aggregates to the DBA and modeler. We are not sure about this process, so please guide us through it if possible.

Regards

Hi,

Originally you told us:

Quote from: Raghuvir on 10 Jan 2016 08:26:19 AM
We also ran Dynamic Query Analyzer and applied its recommendations.

What you're describing in this last post is somewhat different. You haven't applied any of the in-database recommendations. You need to create two aggregate tables in your database that correspond with the saved queries from Aggregate Advisor. You then need to add these tables to your cube model from within the Aggregates tab, and make sure all the columns map to the appropriate dimensions and levels. You then need to republish your cube and restart it.

As schrotty says, you also need to restart your cube after applying in-memory aggregates.

Have you taken any training in Dynamic Cubes? The course covers all these concepts, and you get the chance to practice implementing them. If you haven't taken the course I'd recommend it - it will save you a lot of time and headaches :)

MF.
Meep!