
Large Dimensions handling. Sales planning.

Started by ykud, 27 Jun 2006 11:38:59 AM


ykud

Greetings, everyone.
I'm stuck trying to build a sales planning system with Cognos EP. The problem is described here.
I would greatly appreciate any advice or experience sharing.
Thanks in advance.

Smajil

After many emails between Cognos support and me, here is the conclusion.

There are three options to resolve the problem:

1) Remodel to reduce the size of your application;
2) Allow the multi-view, but of fewer e.list items at once (i.e. restructure the e.list so that there are fewer children under each single parent);
3) Disable the multi e.list item view.

At first I was very disappointed with this suggestion, but later I realized that the application could be split in two, between input and reporting.
I did create one application with one simple cube that was able to display about six hundred children under one node.
Hope this helps,
Smajil

andrewbho

Cognos Support doesn't give good modeling advice, and these suggestions don't answer the modeling challenge. ... remodel? Multi-node e.list items have nothing to do with this modeling challenge.

The challenge is that it's a big a** model. A 10k d-list will work depending on how many subtotals are involved as well as the format; it's all about the RAM usage of the application. In the old days, what I would do was break the products out into multiple cubes, and use further cubes to subtotal and group through d-links. With newer versions, I would create multiple applications and either consolidate them into one large application or publish them to a reporting application.
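To make the "all about RAM" point concrete, here is a rough back-of-envelope sketch in Python. The dimension sizes and bytes-per-cell figure are made-up assumptions for illustration, not measured Cognos EP numbers:

```python
# Back-of-envelope estimate of cube size, to see why a 10k d-list
# strains RAM. All figures below are illustrative assumptions.

def cube_cells(dimension_sizes):
    """Total cell count of a fully crossed cube."""
    cells = 1
    for size in dimension_sizes:
        cells *= size
    return cells

# Hypothetical planning cube: 10k products x 12 months x 20 measures x 50 e.list items.
cells = cube_cells([10_000, 12, 20, 50])
bytes_per_cell = 8  # assume one double per cell, ignoring sparsity handling
print(f"{cells:,} cells ~= {cells * bytes_per_cell / 2**30:.1f} GiB if fully dense")
```

Even this modest cube lands near a gigabyte if held densely, which is why subtotal count and format matter so much.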

A 10k e.list is possible; just make sure not to have more than 5 levels, and keep your children-to-parent rollup below 30, or you will run out of memory. Your reconciles are going to suck, and you are going to have a sparse model, which means you'd better go and get some massive storage. You are also going to have to extend several of your ASP timeouts as well, to ensure that the query that brings back and displays 10k e.list items can complete.
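A minimal sketch of how you might sanity-check those two rollup constraints (no more than 5 levels, fewer than 30 children per parent) before building the e.list. The hierarchy is modelled as a child-to-parent dict, and all names are hypothetical:

```python
# Check the rollup limits mentioned above: depth <= 5 levels and
# fewer than 30 children under any single parent.

from collections import Counter

def check_rollup(parents, max_depth=5, max_children=30):
    """parents: dict mapping each item to its parent (roots map to None)."""
    problems = []

    # Fan-out check: count children per parent.
    fan_out = Counter(p for p in parents.values() if p is not None)
    for parent, n in fan_out.items():
        if n >= max_children:
            problems.append(f"{parent} has {n} children (limit {max_children})")

    # Depth check: walk each item up to its root.
    for item in parents:
        depth, node = 1, item
        while parents.get(node) is not None:
            node = parents[node]
            depth += 1
        if depth > max_depth:
            problems.append(f"{item} sits at level {depth} (limit {max_depth})")
    return problems

# Tiny illustrative hierarchy.
elist = {"Total": None, "EMEA": "Total", "Prod A": "EMEA", "Prod B": "EMEA"}
print(check_rollup(elist) or "rollup looks OK")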

A few years ago, I built an application with >20k e.list items. I wouldn't want to go down this road with the tool. Spend your time re-engineering the process. No one should be planning 10k products; it's not Planning best practice.

ykud

Well -- that's not our planning process, it's the client's. And it's their sales planning system, so they definitely need to plan all products.
Whether Cognos EP is applicable to sales planning at all is a question for discussion, but we had to do it anyway.

The problem was solved by creating a number of applications with smaller product lists (each division sells only about 2k products, so the number of applications equals the number of divisions), but I think this variant is really bad because of the support problems it creates, so I was considering possible remodeling approaches.
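The split itself is mechanical; a minimal Python sketch of the grouping step, where product_division is an assumed mapping that would in practice come from the source system:

```python
# Sketch of the per-division split: one application per division,
# each carrying only that division's ~2k products.

from collections import defaultdict

def split_by_division(product_division):
    """Group a flat product list into per-division sublists."""
    apps = defaultdict(list)
    for product, division in product_division.items():
        apps[division].append(product)
    return dict(apps)

# Illustrative data only.
product_division = {"P001": "Div1", "P002": "Div1", "P003": "Div2"}
for division, products in split_by_division(product_division).items():
    print(f"{division}: {len(products)} products -> one Contributor application")
```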

And it could easily be solved if it were possible to have allocation cubes with an e.list. Having a d-list of 1-2500 (a floating list) and entering products as a d-list formatted element would allow a single application instead of 15. But subtotals are still needed, and they could be generated by allocation links. Those links would be fast with small allocation cubes (2.5k elements, for example), but that's not applicable, because an e.list is needed in such cubes, and using one overall (10k-element) allocation cube makes those links slow.

Sandip

Hi Void,
I kinda agree with you. We too just finished a full rollout in a similar scenario, where the number of divisions equals the number of applications.
However, the client is very satisfied, because we made him understand that this was the only way to get the best performance. Moreover, with the batch files that update all the models running at night (when no one uses the system), it all works perfectly.

ykud

Quote from: Sandip on 19 Nov 2006 02:53:24 AM
Moreover, with the batch files that update all the models running at night (when no one uses the system), it all works perfectly.
Hi Sandip. I just wanted to point out the main problem I see in this modeling scenario: applying changes to the model. What about a new division? How is it added? We need another Contributor application, and linking it to the consolidation model is work that has to be done every time. It should be just a matter of adding more e.list items, shouldn't it?
As for batch updates, I'll describe the way we do them; maybe it will be useful for you.
1. We've got a number of skeleton libraries, differing only in their product list; all d-lists are, of course, stored in the common_dlists library. After any d-list change, a mass synchronize is run over all the applications.
2. We need to transfer Contributor data to Analyst, so we have an exp (export) library for every application (see this for a detailed explanation of the exp-skl-imp concept). In this exp library we store Contributor > Analyst links with matched descriptions on all dimensions, packed into update sequences for all the cubes. And (here's the idea) we run them with a SliceUpdate macro. So if we need to change the allocations somehow (to get only one month's data), a macro with an &Input variable is run and the user selects the desired month. This saves having to edit the whole pack of identical links (when adding a new allocation cube for months, for example); see the sketch after this list.
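The real mechanism is an Analyst macro, whose syntax isn't shown here; the following is only a Python analogue of the month-slicing idea, with the cube modelled as a dict keyed by (month, product) and the input prompt playing the role of the &Input variable:

```python
# Analogue of the SliceUpdate idea: transfer only the selected month's
# cells instead of editing every link. Data and names are illustrative.

def slice_update(source, target, month):
    """Copy only the cells of `source` that fall in `month` into `target`."""
    for (m, product), value in source.items():
        if m == month:
            target[(m, product)] = value
    return target

contributor_cube = {("Jan", "P001"): 100, ("Feb", "P001"): 120}
analyst_cube = {}

month = input("Month to transfer: ")  # stands in for the &Input variable
slice_update(contributor_cube, analyst_cube, month)
print(analyst_cube)
```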

ykud

Quote from: Void on 16 Nov 2006 03:02:17 AM
And it could easily be solved if it were possible to have allocation cubes with an e.list. Having a d-list of 1-2500 (a floating list) and entering products as a d-list formatted element would allow a single application instead of 15. But subtotals are still needed, and they could be generated by allocation links. Those links would be fast with small allocation cubes (2.5k elements, for example), but that's not applicable, because an e.list is needed in such cubes, and using one overall (10k-element) allocation cube makes those links slow.

Just wanted to add that it's possible to create substitutes for allocation cubes in Contributor, using d-list formatted elements and lookup links. More details on these "Contributor allocation cubes" here.
So next time, under such conditions, I'll build a single application with a 1-2500 dummy product list.
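A minimal sketch of that dummy-list idea, assuming a fixed slot list where each slot stores which real product it currently represents (the d-list formatted element) plus its value, and a grouping table plays the role of the lookup link when rebuilding subtotals:

```python
# "Dummy product list" sketch: 2500 fixed slots, each holding an
# assigned product and a planned value; subtotals are derived by
# looking each product up in a grouping table. All names are illustrative.

SLOTS = 2500

# slot -> (assigned product, planned value); unused slots stay empty.
slot_cube = {1: ("P001", 100.0), 2: ("P002", 50.0)}

# Plays the role of the lookup link: product -> subtotal group.
product_group = {"P001": "Beverages", "P002": "Beverages"}

def subtotals(slot_cube, product_group):
    totals = {}
    for slot, (product, value) in slot_cube.items():
        group = product_group.get(product, "Unassigned")
        totals[group] = totals.get(group, 0.0) + value
    return totals

print(subtotals(slot_cube, product_group))  # {'Beverages': 150.0}
```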