
Links from Cognos packages - how large can you go?

Started by blackadder, 14 Mar 2009 09:40:02 AM


blackadder

I'm interested in people's experiences of using FM packages as Admin link sources.

Has anyone had experiences (good or bad) with large data volumes?




Gunes

Hi Mike,

I can comment from a technical perspective....

There were a fair few teething problems in 8.2 and 8.3 with this functionality and its use of the data movement service behind the covers, so I would recommend that you are on 8.3 SP1.

From a functionality perspective, others might be better placed to give you indications, but from what I've seen at various clients on 8.3 SP1/8.4, the functionality has been hardened to accept large volumes of data.

Hope that helps,

-Gunes

jan.herout

So far, the biggest data set I have loaded into Contributor using an Admin link was approx. 200k records, which translates roughly to a 500 MB temporary data file on the application server.

So far I have not encountered any blocking issues with this functionality when loading data into a Contributor application. However, it seems to me that Analyst has some issues with it, even with small data sets (<500 records), e.g. for a dlist update. Sometimes it just fails and later it works, with virtually no pattern to it.

Gunes

What version are you using, Jan?

Again, the teething problems I mention above apply even more heavily to Analyst.

jan.herout

8.3 with SP1

Prior to installing SP1 it crashed more often than it does now. Sometimes if I run the same dlink / dlist update twice, the first attempt fails and the second runs correctly.

jan.herout

One more comment on this - it seems to me that running a dlink from a package into Analyst works this way (can someone clarify that or point out where I may be mistaken?):

- BI instance of the cluster queries the data
- data movement service transfers data to the Planning instance
- Planning instance creates a plain text file which is saved to the same location that stores FILESYS.INI (based on the path to filesys.ini set for the cluster)
- Analyst creates temporary file map (? I guess)
- dlink is run

Based on the behaviour I was experiencing, it seems to me that the problems are mainly on the Analyst side, since the text file with the data was transferred to the client even in cases where the dlink failed.


As for administration links, my guess is that the only difference is that the data are loaded into import tables in the application data store using the appropriate bulk loader for the target database. Otherwise the principle seems to be the same (which is probably the reason you must target the DEV area of the application in this case).
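The sequence of hand-offs described above can be sketched in code. This is purely illustrative: every function name here is hypothetical (none of this is a real Cognos API), and it only models the order of the steps, not what the services actually do.

```python
import os
import tempfile

def run_package_dlink(package_query, filesys_ini_dir):
    """Illustrative model of the dlink-from-package flow described above.
    All names are hypothetical; this is not a real Cognos API."""
    # 1. The BI instance of the cluster runs the package query
    rows = package_query()

    # 2. The data movement service transfers the result to the Planning
    #    instance (modelled here as a simple in-process hand-off)
    transferred = list(rows)

    # 3. The Planning instance writes a plain text file alongside
    #    FILESYS.INI (per the cluster's filesys.ini path)
    path = os.path.join(filesys_ini_dir, "dlink_data.txt")
    with open(path, "w") as f:
        for row in transferred:
            f.write("\t".join(map(str, row)) + "\n")

    # 4./5. Analyst maps the temporary file and runs the dlink; note the
    # text file already exists at this point, which matches the observation
    # that the file arrives even when the dlink itself later fails.
    return path

# Usage with a dummy query and a throwaway directory:
tmp_dir = tempfile.mkdtemp()
print(run_package_dlink(lambda: [("Cost", 100), ("Revenue", 250)], tmp_dir))
```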

mrobby

I'm not sure if this is right for Cognos packages, or dlinks for that matter, but doesn't Analyst have to use RAM to transfer the data from the plain text temp file into the Analyst cube?

If so, I would think your limitations could be based on memory.

jan.herout

Not in this case. The problems I had were not linked to huge amounts of data - imagine a dcube with 7k cells being loaded from a database view via a package with approx. 800 records in the set.
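To put some rough numbers on that: using the figures quoted earlier in the thread (~200k records producing a ~500 MB temporary data file), a back-of-envelope calculation shows how small the failing data sets were. The linear-scaling assumption is mine, not anything Cognos documents.

```python
# Figures quoted earlier in the thread: ~200,000 records produced a
# ~500 MB temporary data file on the application server.
BYTES_PER_RECORD = 500 * 1024 * 1024 / 200_000  # ≈ 2.6 KB per record

def estimated_temp_file_mb(records: int) -> float:
    """Rough temp-file size in MB for a given record count, assuming the
    per-record footprint scales linearly (an assumption)."""
    return records * BYTES_PER_RECORD / (1024 * 1024)

print(round(estimated_temp_file_mb(800), 2))    # the ~800-record failing set: ≈ 2 MB
print(round(estimated_temp_file_mb(200_000)))   # sanity check against the quoted 500 MB
```

A ~2 MB temp file is nowhere near any plausible memory limit, which supports the point that these failures weren't volume-related.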