
Performance issues in active reports

Started by AlexRC93, 08 Mar 2012 03:44:04 AM


AlexRC93

Hi there,
We have the problem that our first test active reports are VERY slow, and I am wondering what the reason for that is.
The Cognos samples look really good and my company wants to have something similar. But not with data like revenue per quarter (that would be easier because it is not that much data).
We have, for example, a list of clinical studies; each study has several countries and each country has several sites (see attachment).
Filtering these 3 steps is not the problem. But on the site level there is sometimes a lot of data (e.g. one entry for each date on which a change to the data occurred).

Does anybody have a similar data structure to display in active reports? A small report with 3 selects and one chart takes up to 30 min (!) to generate!
Perhaps it's a server issue as well (it's only a test server).
Smaller reports work as fast as expected (one value slider with ~8 values filtering 4 charts).

Is there a possibility to run an active report with a large amount of data, or does it only make sense with smaller data sets?
We can only use relational data; dimensional data is not available =/

Thanks for your help
Kind regards
Alex

Galadin

The performance issue is due to the size of the data set.  What is happening is that Cognos is generating a page for every possible value set in your prompts.  You said there are many countries and many sites, and you are using a data deck to hold this information.  This means that Cognos is gathering ALL of that data at that level and generating the information when you run this object.  I bet the size of this active report is also quite large.

While it sounds like Active Reports generates data sets and then sends them out, what it actually does is render the data decks at runtime, encapsulate them into an MHTML file and then send that out. When a user clicks on a filter value, it is actually pulling that pre-built deck out of the MHTML file. This is the main cause of your slowdown.
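To get a feel for why this adds up, here is a rough back-of-the-envelope sketch in Python (the counts are made-up placeholders, not your actual data): every combination of prompt values becomes a pre-rendered card, so the card count, and with it the MHT size and the generation time, grows multiplicatively.

# Rough sketch with hypothetical numbers: the deck pre-renders one card per
# prompt combination, so output size grows with the product of the levels.
studies = 20              # assumed number of clinical studies
countries_per_study = 10  # assumed countries per study
sites_per_country = 8     # assumed sites per country
entries_per_site = 50     # assumed entries (one per change date) per site

cards = studies * countries_per_study * sites_per_country
rows_embedded = cards * entries_per_site

print(f"cards pre-rendered into the MHT: {cards:,}")          # 1,600
print(f"data rows embedded in the file:  {rows_embedded:,}")  # 80,000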

funkigreendog

How long do the queries take on their own?

I found that a query which took less than a minute on its own appeared to add about 10 minutes to the report build. I think this was caused by the query being fired off multiple times, because it was being used as a source for decks, lists and drop-down lists.

Try setting the query's 'Use Local Cache' property to Yes; I found this brought my report back down to about 2 minutes.
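Not Cognos internals, just a rough Python analogy of that effect (the query name Q_Sites and the one-second delay are made up): the same query feeding several objects can run once per consumer unless its result is cached and reused.

import time
from functools import lru_cache

def run_query(name):
    time.sleep(1)                     # stand-in for a query that takes a while
    return [f"{name} row {i}" for i in range(3)]

@lru_cache(maxsize=None)              # rough analogue of 'Use Local Cache = Yes'
def run_query_cached(name):
    return tuple(run_query(name))

consumers = ["deck", "list", "drop-down list"]

start = time.time()
for _ in consumers:
    run_query("Q_Sites")              # re-executed for every consumer
print(f"no cache:   {time.time() - start:.1f}s")

start = time.time()
for _ in consumers:
    run_query_cached("Q_Sites")       # executed once, result reused afterwards
print(f"with cache: {time.time() - start:.1f}s")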

AlexRC93

Hi, thanks for your replies!
The query only takes about 10-15 seconds to execute (PDF output, 206 pages). But that output is already prefiltered to just 2 protocol numbers!

@Craig: Is there a possibility to display the data without a data deck, which could make it faster?
Right now I'm only testing; perhaps I don't need the data deck... most of the reports only display a list as the result. We only use graphs in a few reports. The only thing to do would be to delete the prompt page and add these selects to the main report.
But then all the data gets loaded =/

@funkigreendog:
I'll try the local cache thing. That will make it faster for sure, but the report is way too slow anyway due to the huge amount of data. I'll try the execution method 'concurrent' as well; that may also improve performance.

But I think I have to tell my company that we can use active reports only for small amounts of data, and therefore for fewer reports.

Thanks so far. If anyone else has a hint for me, please let me know =)

Amruta Gandhi

Hi,
I have also found that very large data can certainly be an issue for Active Reports. However, based on my experience with Active Reports, a few things can be done to improve performance.
Data decks should only be used for items with few values. We can limit those values by adding filters.
E.g. if you only need to show data for 5 products out of the 20 available, it is better to apply a filter for those 5 products when assigning that data item to the data deck. The deck will then create data sets only for those 5 products instead of all 20.
Also, the queries assigned to the data deck should contain only the required data items (in the example above, only products). Assigning fact queries for this purpose should be avoided, even if the required data items happen to be present in the fact query. This may increase the overall number of queries, but it certainly improves performance (a rough sketch follows below).
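Just to illustrate the idea (a sketch in Python/SQLite with made-up table and column names, not an actual Cognos query): the deck's source should be a small, pre-filtered query on the item itself rather than the fact query.

import sqlite3

# Hypothetical dimension table with 20 products.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE product_dim (product_name TEXT)")
con.executemany("INSERT INTO product_dim VALUES (?)",
                [(f"Product {i}",) for i in range(1, 21)])

needed = ("Product 1", "Product 2", "Product 3", "Product 4", "Product 5")

# Deck source: only the required data item, filtered to the 5 values that will
# actually be shown, so the deck builds 5 cards instead of 20.
deck_sql = ("SELECT DISTINCT product_name FROM product_dim "
            "WHERE product_name IN (?, ?, ?, ?, ?)")
deck_values = [row[0] for row in con.execute(deck_sql, needed)]
print(deck_values)   # 5 values -> 5 cards

# Avoid pointing the deck at the (much larger) fact query just to obtain the
# same handful of values, even if the data item exists there as well.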
Regarding formatting, define sizes for the objects on the report by means of the size and overflow properties. This reduces the size of the MHT file and helps it render faster. If possible, apply formatting to a parent object, such as a block or table containing the deck, instead of to the objects in each card of the deck. Try using the classes option wherever possible when formatting objects.

Please let me know if this information was of any help to you.

Thanks,
Amruta