COGNOiSe.com - The IBM Cognos Community

Planning & Consolidation => COGNOS Planning => Topic started by: satay2hot on 24 Apr 2009 01:41:13 AM

Title: Workplace full
Post by: satay2hot on 24 Apr 2009 01:41:13 AM
Has anyone encountered this error message in Analyst?

What does it mean when pushing data from Analyst to Contributor?

Thanks for your attention

Cheers.
Title: Re: Workplace full
Post by: StuartS on 24 Apr 2009 03:08:44 AM
The default workspace (memory allocation) that Analyst uses is 64000 (at least it was up to 8.1).

The task you are trying to do means that Analyst needs more than this.

There are two things you can do.

1.  Reduce the amount of data you are pushing from Analyst to Contributor.

2.  Increase the workspace size.  If it is already at 512kb then you need to do number 1.

You may find you are trying to push too much data from Analyst to Contributor.  If this is the case then try to export the D-Cube as a txt file and do a copy load prepare in Contributor.

Regards

Stuart
Title: Re: Workplace full
Post by: mrobby on 24 Apr 2009 11:04:24 AM
Also, you should take a look at the @SliceUpdate macro.

This effectively allows you to update a slice of a model or cube and then loops through until all slices have been updated.  I have had to use this before when attempting to update cubes or models that are very large.
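The slice-by-slice idea above can be sketched in generic terms. This is not Cognos macro code (see the Analyst macro reference for @SliceUpdate's actual syntax); it is just a hypothetical Python illustration of pushing a large data set in fixed-size batches so no single transfer exceeds the available workspace:

```python
def update_in_slices(rows, batch_size, push_fn):
    """Push `rows` to a target in batches of `batch_size`, looping
    until every slice has been sent. `push_fn` stands in for whatever
    actually moves one slice of data (here, an Analyst-to-Contributor
    transfer); it is a placeholder, not a Cognos API."""
    slices_sent = 0
    for start in range(0, len(rows), batch_size):
        push_fn(rows[start:start + batch_size])  # one slice per call
        slices_sent += 1
    return slices_sent
```

Each call handles only one slice, so the peak memory needed at any moment is bounded by the batch size rather than the full data volume, which is the same reason @SliceUpdate helps with very large cubes.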
Title: Re: Workplace full
Post by: nedcpa on 24 Apr 2009 01:36:04 PM
You need to consider how often you have to transfer this data from Analyst to Contributor. If this is a one-time load, then load only a few nodes at a time to accomplish your task. You can always use the slice update macro, but it would be unnecessary if this is a one-off situation. If you have to transfer this data on a regular basis, then your simple and safe bet is to use the CLAP (copy>load>prep) feature, and set up a macro to run the entire process. It is important to note that A>C links run on a single thread, and they have not been designed to handle large data volumes.
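For the copy>load>prep route, the flat file for the Contributor import has to be assembled first. As a hedged illustration only (the function name, tab-delimited layout, and file names are assumptions, not anything Cognos-specific), here is a minimal Python sketch that concatenates several per-node export files into one load file:

```python
import csv

def build_load_file(slice_paths, out_path):
    """Concatenate several tab-delimited export files (e.g. one
    D-Cube txt export per node) into a single flat file suitable
    for a copy>load>prep import. Layout is illustrative only."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out, delimiter="\t")
        for path in slice_paths:
            with open(path, newline="") as src:
                for row in csv.reader(src, delimiter="\t"):
                    if row:  # skip blank lines between exports
                        writer.writerow(row)
    return out_path
```

Driving this from a scheduled script (or the macro nedcpa mentions) keeps the regular transfer out of the single-threaded A>C link path.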
Title: Re: Workplace full
Post by: mrobby on 24 Apr 2009 03:55:36 PM
Just a side note, Nedcpa,

What do your queries typically look like if you are going the import data method?  Since your load is a single column, do you union a bunch of queries together to create the load file?  Just looking for a best practice on how you would write your query to prepare the flat file for load.