Analyst Table Publish Size issues

Started by gidster, 21 Jul 2008 11:43:09 AM


gidster

Hi Guys,
Have any of you got any suggestions? We have a number of very large (approx. 30 million cell) cubes that we want to run a Table Publish on from Analyst.

The published tables would then be used to run Generate Framework Manager Model, and the resulting model would be used for reporting in Cognos 8.

However, we are getting "stuck" at the Table Publish stage.  We are not getting any error messages...just 100% CPU usage, and no database tables created.

We have left it for ages and ages (still no result) but no error messages either.

Whereas I might have been tempted to simply "leave it running", some of our other applications (including Contributor applications) are being impacted, as they all use the same Cognos SQL Server database.  So letting it run over such a long timeframe is simply not an option.

So, have any of you dealt with similar issues?

Some (horrid!) ideas I have had are:

- to split out my 30-million-cell cubes across one of my (relatively!) static D-Lists.  This could mean each cube shrinks to less than 10 million cells.  And then use Framework Manager to bring all of the information back together (so I am back to the full 30 million)

- to (somehow!) link the data from Analyst into Contributor and then move away from an Analyst Publish and Generate FM Model to a Contributor Publish and Generate FM Model.  Again, not sure I like this idea either!

Please send me some inspiration, as I would hate to try my ideas out only to find a better one exists out there!

Thanks!

ykud

Is exporting the cubes as txt and using a simple DTS package (bcp, SQL*Loader) to import them into tables not an option?
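
For illustration, a minimal sketch of that kind of load on SQL Server, assuming a tab-delimited Analyst export; the staging table name, column list, and file path are all hypothetical:

-- Hypothetical staging table matching the layout of the exported txt file.
-- Column names and types are assumptions; adjust them to the real cube layout.
CREATE TABLE dbo.stg_mycube (
    CostCentre  varchar(50),
    Account     varchar(50),
    MonthYear   varchar(10),
    CubeValue   float
);

-- Bulk load the Analyst txt export (the T-SQL equivalent of a bcp in / simple DTS step).
BULK INSERT dbo.stg_mycube
FROM 'D:\exports\mycube_export.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2    -- skip the header row, if the export has one
);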

gidster

Hi Ykud... this is an interesting idea, and I could see how it might work once my initial publish tables were created.  However, it is vital that we use the Generate Framework Manager Model functionality...and this (I think) requires that all of our tables are published in the way that the Analyst Table Publish does automatically (with the it_ and et_ tables etc.) so that it can generate our FM model.

And at the moment, the Analyst Table Publish simply seems to "hang": no error, and no tables produced.

So I think I am a bit stuck.

Please let me know any other ideas you have!

Many thanks!

ykud

Just a quick question:

Do smaller cubes get published successfully?  Are the table names generated correctly (et_cubename, it_dlistname)? -- there used to be a huge problem with that.
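
One quick way to check from the database side whether the publish has created anything at all is to list the tables with the et_/it_ prefixes. This is just a generic catalog query (run it in whichever database your Publish Container points at), nothing Analyst-specific:

-- List any export/item tables the Analyst Table Publish has managed to create so far.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
  AND (TABLE_NAME LIKE 'et[_]%' OR TABLE_NAME LIKE 'it[_]%')
ORDER BY TABLE_NAME;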

gidster

Hi Ykud,

I managed to leave everything running for over 12 hours overnight.  It was still running in the morning, and I was forced to use Task Manager to kill Analyst.  This obviously killed the Publish.  But what it also did was unlock the SQL tables...and when I looked at my Publish Container, I could see that 2 out of my 6 30-million-cell cubes had published successfully.

So I am convinced that, with enough time, this would work.  It is just that we cannot leave it running for so long, as it impacts our other Cognos systems (presumably because they all share the same SQL box).

Is there anything I can do?

One other idea I have had is to change my Dimension for Publish... at present I have specifically created a single-item "Dummy Dimension" just to select as the Dimension for Publish.  I imagine this means the Publish process has to create 30 million rows (and hence the slow performance).  So I was thinking perhaps I could pick a relatively static alternative D-List as the Dimension for Publish...this might reduce the number of rows in the Publish and speed things up.

But I am worried about what it would mean for the way I would build my reports in Cognos 8 (ie after running the Generate Framework Manager Wizard).

Anyway, please keep your thoughts coming!

ykud

Hm. You should definitely change the dimension for publish, or you'll ruin your database server. 30 million rows is a really huge import chunk. The dimension for publish is used as the measures (the data columns), so if you've got an accounts D-List in the cube -- that's a candidate.
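
To illustrate the difference in table shape (the column names below are purely hypothetical; the real et_ table takes its data columns from whichever D-List you choose as the dimension for publish):

-- With a single-item "Dummy Dimension" as the dimension for publish:
-- one data column, so roughly one row per cell (~30 million rows).
CREATE TABLE dbo.et_mycube_dummy (
    CostCentre  varchar(50),
    Account     varchar(50),
    MonthYear   varchar(10),
    DummyItem   float          -- the single publish-dimension item
);

-- With a longer D-List (e.g. accounts) as the dimension for publish:
-- one data column per item in that D-List, and correspondingly fewer rows.
CREATE TABLE dbo.et_mycube_accounts (
    CostCentre      varchar(50),
    MonthYear       varchar(10),
    [Sales]         float,
    [Cost of Sales] float,
    -- one column per account item; only a few shown here
    [Net Profit]    float
);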

I'd create a Contributor application (containing this cube, split by some "long" D-List as the e.List), export the cubes as txt, import the txt data and then publish the application. You'll get a really fast export from Analyst, and the import & publish will use your cluster (parallel execution).

I don't trust the Analyst publish at all, to be honest, and always avoid using it, since it was randomly changing table names in pre-8.2 releases.

gidster

Hi Ykud... well I have finally made some progress:
By using my 144-item Mmm-YY dimension, I was able to get my Analyst Publish to work on all six of my 30-million-cell cubes.  And it completed in less than 1.5 hours.

So that is the good news.

Now the bad news: how do I actually write reports on this data?  I ask because all of my previous reports that interface with Cognos Planning have had a single-item dimension for publish.  So in the past I was easily able to simply use a prompt to select the Month-Years the users wished and have the report return the appropriate values.

And as things stand, I do not think I can do this.  This seems to be because I now have 144 columns of data...so I cannot (for example) dynamically select 5 years' worth of data.

Are there any options?

A couple of (probably very bad!) ideas I had were to:
- somehow use SQL to 'pivot' the data back into my more typical format (i.e. 30 million rows with a single column of data) -- see the sketch after this list

- maybe experiment with the sys_column tables
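
On the first idea, a rough sketch of what that 'pivot back' could look like in T-SQL. The table and month-column names are made up for illustration, UNPIVOT needs SQL Server 2005 or later (on SQL Server 2000 the same thing can be done with one SELECT per column glued together with UNION ALL), and all 144 month columns would need to be listed out or generated from the column metadata:

-- Turn the wide published table (one data column per Mmm-YY item) back into
-- the "typical" shape: one row per cell with a single value column.
-- Only three of the 144 month columns are shown here.
SELECT CostCentre, Account, MonthYear, CubeValue
FROM dbo.et_mycube_months
UNPIVOT (
    CubeValue FOR MonthYear IN ([Jan-00], [Feb-00], [Mar-00])
) AS unpvt;

Wrapping something like this in a view would give the reports a single value column to filter on again, at the cost of doing the unpivot work in the database at query time.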

But basically I am floundering, and would appreciate any pointers.

Thanks again for your help!

Regards.

mellyssa

Hi,

We have a similar problem. There is a very large cube (approx. 9 million cells) that we need to run a Table Publish on from Analyst.

The generated tables will be used to load other tables, and those will be used in the Framework Manager model and then in Cognos 8 for reporting.

Here, the same thing happens: "...not getting any error messages...just 100% CPU usage, and no database tables created".

This situation is being analyzed here, but we still do not have a solution.

Do you have any solution?

thanks.

PS: Sorry, my English is terrible.

gidster

This sounds very similar to my issue.

As you can see from this thread (there is also a similar thread on the Cognos Support Talk website...I think it is called something like "Reporting from Planning"), the options are limited.

You basically have to either:

1. Live with the slow performance, or
2. Pick another Dimension to Publish

If you pick another Dimension for Publish, then the fewer rows you can make the output (i.e. the more columns you have), the better.

Good luck!