
No Data access tables

Started by osmodeus, 16 Nov 2009 02:10:11 PM


osmodeus

We have some very large models but have used "no data" access tables in Contributor to greatly reduce their overall size. My question is: how does the use of no-data access tables directly affect memory on the job servers when GTPing the models and when reconciles run against them? I have found the documentation below, directly from Cognos, but believe it relates to access tables hiding data rather than specifically to no-data access tables; correct me if I am wrong.

We are having intermittent problems with memory on our job servers approaching 2GB, which then kills all subsequent reconciles until the memory is released, and we are trying to get our arms around the source of the problem. Thanks.

3.2 Access Tables

Access table definitions are so significant to performance and scalability that they are worth highlighting separately. When you look at the size of the application model, access tables are often large and take the most memory to resolve.

The amount of memory needed to resolve an access table is determined by the product of the number of items in each dimension multiplied by four bytes - the greater the number of dimensions you include in an access table, the greater the memory required.

If you use more access tables with fewer dimensions in each, the memory requirements are reduced considerably. Note that 2 GB is the maximum memory that can be addressed by the operating system for a single process (the calculation engine cannot take advantage of the 3GB support provided in later versions of Windows).
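To make the arithmetic concrete, here is a short Python sketch of that estimate (the dimension sizes are made up purely for illustration):

    # Memory to resolve one access table, per the estimate above:
    # the product of the item counts across its dimensions, times 4 bytes.
    def access_table_bytes(dimension_sizes):
        total = 4  # 4 bytes per cell
        for size in dimension_sizes:
            total *= size
        return total

    # One access table spanning three (hypothetical) dimensions:
    combined = access_table_bytes([1000, 500, 200])

    # The same dimensions split across two smaller access tables:
    split = access_table_bytes([1000, 500]) + access_table_bytes([200])

    print("combined: %.1f MiB" % (combined / 2**20))  # ~381.5 MiB
    print("split:    %.2f MiB" % (split / 2**20))     # ~1.91 MiB

This is why splitting one wide access table into several narrower ones cuts the memory footprint so dramatically.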

Gunes

Hello,

My first question would be: do you have cut-down models enabled? This would distinguish whether the cutting down of your model is happening on the job server(s) or on client machines, and would determine whether you are seeing a Cut Down job prior to a Reconcile.

Access tables are quite demanding on the engine & they have a direct influence on the model definition (again, depending on whether you have cut-downs enabled or not) - please see my earlier post regarding cut-down models if you are interested in what happens behind the covers.

The documentation you mentioned applies to all types of access tables. It is a calculation of the memory cost of resolving an access table, irrespective of type.

There is also a Model Analyzer tool (I am not sure if you are aware of it) which would be fantastic in this scenario: it shows how big your model definition is & gives you a different avenue for tackling the intermittent issue you seem to be having.

Also, you mentioned that you are reaching the 2GB limit. Could you confirm how many processors or cores you have & the amount of RAM on each of your job server(s)?
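(If it helps, a quick way to pull those numbers on each job server is a few lines of Python, assuming the third-party psutil package is installed:)

    # Report core counts and total RAM on this job server.
    # Requires the third-party psutil package (pip install psutil).
    import psutil

    print("physical cores:", psutil.cpu_count(logical=False))
    print("logical cores: ", psutil.cpu_count(logical=True))
    print("total RAM (GB): %.1f" % (psutil.virtual_memory().total / 2**30))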

Many Thanks & Good Luck
-Gunes

craig_karr

Where can I find this Model Analyzer tool you are referring to? I guess you don't mean the epmodelsizereader?

StuartS

[link to the Model Analyzer tool]

craig_karr

Thanks a lot! Seems like a very useful tool. I will download it right away and play around with it.

Gunes

Yep - the tool I am referring to is above. Thanks for the link Stuart.  :)

aa2288

I am not sure how the tool would indicate issues in the model design.