How to tune performance for Cognos 8.1 BI

Started by kittipoom, 30 Jun 2009 03:09:10 AM


kittipoom

Hi Guys,

My users have complained about slow performance when they use my cube in Analysis Studio.

My cube is around 1.44 GB, and the Cognos server has 2 CPUs and 8 GB of RAM. (The cube was created with Cognos Transformer V8.)

How can I tune a Cognos 8 BI environment for the best performance?

Thanks.

Kittipoom

UseCog

#1
First of all, the cube is heavy. The best approach is to create smaller cubes if possible.

If this is not possible, you can try to partition the cube based on 'time' or any other dimension.
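
To illustrate why partitioning helps (a concept sketch in Python only, not Transformer syntax; in Transformer itself you would use the cube's auto-partitioning settings or define partition levels in the model), a query scoped to one year only has to read that year's slice:

    # Concept sketch: why time-based partitioning helps.
    # A query for a single year touches only that year's
    # slice instead of scanning the whole data set.

    facts = [
        {"year": 2007, "product": "A", "sales": 100},
        {"year": 2008, "product": "A", "sales": 120},
        {"year": 2008, "product": "B", "sales": 90},
        {"year": 2009, "product": "B", "sales": 80},
    ]

    # Partition the fact rows by year (the "time" dimension).
    partitions = {}
    for row in facts:
        partitions.setdefault(row["year"], []).append(row)

    # A query scoped to 2008 reads only the 2008 partition.
    total_2008 = sum(r["sales"] for r in partitions[2008])
    print(total_2008)  # 210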

Do you have only one dispatcher server with 2 CPUs and 8 GB RAM? Do you have any complex reports on this cube, and can you tell me how fast those reports run? And where is your physical cube actually located (is it on the dispatcher server)?

kittipoom

Hi UseCog,
    Thank you for your reply.
1) Do you have only one dispatcher server with 2 CPUs and 8 GB RAM?
Yes, I have only one.
2) Do you have any complex reports on this cube, and can you tell me how fast those reports run?
   My users wait about 2-3 minutes for a report from this cube to open.
   
 

kittipoom

Hi UseCog,
   3) Where is your physical cube actually located (is it on the dispatcher server)?
       It is located on the dispatcher, and I installed the Cognos 8 BI components on the same server.
Thanks
Kittipoom


UseCog

It looks like the cube is big for a single-server Cognos installation. You can try to add partitions, or try to make the cube much smaller than its current size. The following are some recommendations from Cognos:

Recommendation - Keep Model and Cube Sizes Within Practical Limits

There are practical limitations on the file size of production models and cubes, based on typical memory capacities and run-time performance requirements in the production environment. For example, we recommend that you limit the size of your models to 2 million categories each. Pay particular attention to bloating due to excessive use of labels, descriptions, short names, category codes, and custom views. Metadata associated with your structural data source dimensions, levels, and categories can contribute significantly to overall storage requirements.

In virtual memory terms, models held in allocated memory are limited by the available address space. Also, performance is severely impacted if a model is larger than the available physical memory.
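
To see how tight the setup in this thread is against that last point (a back-of-the-envelope sketch; the 4 GB figure for the OS plus the Cognos services is an assumption for illustration, not a measurement):

    # Rough memory sanity check for the setup in this thread:
    # a 1.44 GB cube on an 8 GB single-dispatcher server that
    # also hosts all the Cognos 8 BI services.

    ram_gb = 8.0
    cube_gb = 1.44
    # Assumption for illustration only: memory taken by the OS
    # plus the Cognos 8 services running on the same box.
    os_and_services_gb = 4.0

    headroom_gb = ram_gb - os_and_services_gb - cube_gb
    print(f"Headroom for queries: {headroom_gb:.2f} GB")  # 2.56 GB

With several concurrent Analysis Studio users, that headroom shrinks quickly, which is consistent with the 2-3 minute report open times reported above.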