Active Report performance issues

Started by onsteinj, 26 Aug 2015 10:57:56 PM

onsteinj

Hi guys,

Hope you can help me out... My client is having trouble with Active Reports.

First, the environment: Cognos 10.2.2 with the latest FP, Linux servers, 2 dispatchers, 1 IIS, 1 CS, 64-bit, lots of RAM, DQM enabled. The Framework is based on best practice: 5 layers, one of them being DMR (plus Import, Foundation and Presentation). Source: Teradata DB, Financial Industry model warehouse, a dimensionally modelled star-schema DM (the source of the Framework). (By the way, I haven't developed anything myself; I am just trying to figure out where it all goes wrong.)

A development team (hired by the client) was asked to develop some reports based on Excel examples. As we all know, Excel is very generous in its possibilities. IBM told my client that all those reports can be built in Cognos. Another company is doing the development; they chose the DMR approach (which I am not a fan of) and are using Active Reports to build the desired reports on top of that DMR.

Issue 1) The Active Reports take up to 2 hours to load.
Issue 2) Once loaded and then used online and/or offline, performance is bad (up to 20 seconds of lag when clicking a filter or slider).

I have checked the environment (installation/configuration). Nothing wrong there. I have checked the Framework. Some small issues there, but nothing really big. The only "big" thing was that they were not utilizing the DQM join type settings. Changing that didn't have any big results, though.

I think they should change to a relational model, as the source is already dimensional and sits on a very fast Teradata DB.
The Active Reports are also very, very complex, with lots of objects, decks, sliders, etc.

P.S. One of the reasons they chose DMR is the need for drill-down/up. But as I understand it, the regular drilling functionality of a cube and/or DMR isn't available in an Active Report? So they would have to use decks anyway?

Do you guys have any leads or advice on where to look? I am running out of options... Any help is greatly appreciated. Thank you in advance.

Cheers,
J.

MFGF

Quote from: Boom on 26 Aug 2015 10:57:56 PM

Hi,

The first and most obvious question is how big are these active reports? If you grab one of the rendered MHT files, what size is it? My guess is that the developers have crammed too much into the reports, making them big and slow. There might be ways of improving them (eg switching from charts to visualizations, ditching data decks and filtering visualizations directly etc) but there is still a limit to how much you can do in a single report.
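
If you want to see where the bulk is coming from, a rough sketch along these lines should work. A rendered .mht is a standard MIME multipart (MHTML) archive, so Python's email parser can walk its parts; the file name below is just a placeholder, not a real report name:

```python
import email
import os
import sys
from email import policy

# Rough inspection of a rendered Active Report .mht file.
# An MHT is a MIME multipart (MHTML) archive, so the standard library's
# email parser can walk its parts. The file name is just a placeholder.
path = sys.argv[1] if len(sys.argv) > 1 else "MyActiveReport.mht"
print(f"File size: {os.path.getsize(path) / (1024 * 1024):.1f} MB")

with open(path, "rb") as f:
    msg = email.message_from_binary_file(f, policy=policy.default)

# Count parts by content type - lots of image/png parts usually means
# charts pre-rendered onto data deck cards, which is what bloats the file.
counts = {}
for part in msg.walk():
    ctype = part.get_content_type()
    counts[ctype] = counts.get(ctype, 0) + 1

for ctype, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"  {ctype}: {n} part(s)")
```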

If you fire up one of the active reports in your browser and press <ctrl><shift><D> to toggle Debug mode on, then right-click and choose "Info", you can see details about how many controls, rows, images, etc. it contains. Can you post the results?

Cheers!

MF.
Meep!

onsteinj

Thank you MF for your quick reply.

I ran 2 reports:
Active Report 1: needs 6 minutes to run and takes up 42Mb.
     (Requesting the Information makes my browser crash. I will try again; if I get it to work I will add the result in another reply.)
Active Report 2: needs 25 minutes to run and takes up 15Mb.
     (Screenshots of the Information added.)

I would also like your opinion on using DMR on a star-schema (multidimensional source) model. I think they should switch to a relational Framework for their queries in the Active Reports. What do you think?

And my other question: does drill-up/down work in an Active Report when the package is DMR? Or, no matter the package (DMR/relational), are we stuck using decks?

Thanks again for all your help.

Cheers,
J.

MFGF

Quote from: Boom on 27 Aug 2015 10:14:05 AM

So... one report is 42Mb and one is 15Mb? For what it's worth, my opinion is that the first is WAY too big and the second is about at the limit of what I'd honestly expect an Active Report to be able to handle.

Using DMR as a source for Active Reports is ok, I guess. The real question is how does the DMR perform with "normal" reports run live? If performance there is good then it sounds like the model and database are optimal. If not, you might get better performance when rendering the Active Report MHTs by using a relational model - if the queries end up being simpler.

Drilling down and up is a concept that applies only to reports run "live" against an OLAP or DMR source. Active Reports are not run live - the MHT is rendered up front and includes all the necessary data, and when you view an Active Report you are simply interacting with that rendered MHT file. There are techniques you can add to the report to emulate drill-up and drill-down, but essentially these involve the report switching from one card in a data deck to another (and back again), so it's not really drilling down or up. It would make no difference whether the underlying package was relational or dimensional.

Ideally, an Active Report would be less than 10Mb in size. At that size it can generally be handled efficiently (without lagginess) in a browser or on a mobile device, and can be emailed without blowing the attachment size limit of a mail server.

Cheers!

MF.
Meep!

onsteinj

MF, thank you again for your quick reply! It is greatly appreciated!

Two last Q's:
1) I have now also added the screenshots for Active Report 1 (now in English). Could you quickly check whether the number of objects, etc. in the two ARs is indeed too much for an Active Report?

2) Could you explain why visualizations work better than charts?

That's it! You have helped me greatly. With these last questions answered, I think I have all I need to finalize my report and send my advice to my client.

Cheers,
J.

MFGF

Quote from: Boom on 27 Aug 2015 10:55:47 AM

By way of comparison, here are the stats from the most complex Active Report I have written to date:

[screenshots of the Debug Info stats attached]

It has seven different pages, all with multiple filtered visualizations and one including a map. It renders as a 4.4Mb MHT file. My old iPad 2 struggles to render it without some delays, but it runs fine on my iPad Air and my iPhone 5s. I have no issues with performance when running it in IE11 either (although IE9 really struggles).

Why are Visualizations better than charts? It's not a golden rule that they are, but understanding the differences is important.

- There is no embedded charting engine that can be added to an MHT to render charts interactively. This means any charts you add to your report have to be rendered as the MHT is being created, and they get embedded into the MHT as .png image files. Because the charts are static images, you can't filter them in the active report. Instead, you use a technique of pre-rendering them on the cards of a deck or data deck, and your control switches from one card to the next (making it appear to the user as though a single chart is changing). If you have a single chart on a data deck, and there are 10 different data values for the data item controlling the data deck, then the report has to generate, render and convert to .png 10 separate chart images - one for each card. If you have a single chart on a data deck controlled by two items, one having 10 values and one having 5 values, then it has to render 10x5 = 50 chart images. And so on. The more charts you add and the more combinations of data item values you define to control them, the more work has to be done when rendering the MHT and the bigger the MHT file becomes (because of all the images).

- Things are entirely different with Visualizations. These use RAVE technology, so each visualization is essentially a small piece of JSON code, and there is a RAVE engine which interprets the code and renders the visualization. When you add a visualization to an Active Report, there is an initial one-off overhead of adding just over 2Mb to the size of the rendered MHT - this is because the visualization engine gets included. Each visualization you add subsequently is perhaps a couple of Kb extra. Of course, the data to display in the visualization has to be added too, so you need to factor in a few more Kb for this. The benefit is, though, that you don't need to put visualizations on data decks to make them "react". Your Active Report control can directly filter a visualization within the MHT and it will get re-rendered on-the-fly. So, it is far quicker to render the MHT initially (no data decks to iterate through generating png image files) and the MHT often ends up much smaller and more efficient.
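
To put rough numbers on those two points, here is a back-of-the-envelope sketch. The ~2Mb one-off engine overhead and the couple of Kb per visualization definition come from the description above; the average PNG size per chart and the data payload per visualization are illustrative assumptions, not measured values:

```python
from math import prod

# Illustrative assumptions - adjust to what you actually measure.
AVG_CHART_PNG_KB = 200      # assumed average size of one pre-rendered chart image
RAVE_ENGINE_KB = 2 * 1024   # one-off overhead when the first visualization is added
PER_VIZ_KB = 2              # approximate JSON definition per visualization
PER_VIZ_DATA_KB = 50        # assumed data payload per visualization

def chart_deck_images(controlling_item_cardinalities):
    """PNGs rendered for one chart on a data deck: the product of the
    distinct-value counts of the items controlling that deck."""
    return prod(controlling_item_cardinalities)

def chart_approach_kb(decks):
    """decks: one cardinality list per chart-on-a-data-deck."""
    return sum(chart_deck_images(c) for c in decks) * AVG_CHART_PNG_KB

def viz_approach_kb(num_visualizations):
    return RAVE_ENGINE_KB + num_visualizations * (PER_VIZ_KB + PER_VIZ_DATA_KB)

# Example from above: a chart controlled by items with 10 and 5 values
# renders 10 x 5 = 50 images; three such charts quickly dominate the MHT.
decks = [[10, 5], [10, 5], [10, 5]]
print(f"Chart/data-deck approach: ~{chart_approach_kb(decks) / 1024:.1f} MB")
print(f"Visualization approach (3 vizzes): ~{viz_approach_kb(3) / 1024:.1f} MB")
```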

Cheers!

MF.
Meep!

onsteinj

Thank you very much for all your help. You, sir, deserve a beer. I'd recommend a Grolsch!!  :D

Cheers,
J.

GoKartMozart

Hi MF

Could I ask a follow-up question? Is there any way to handle lists and crosstabs other than using data decks? Our customer wants tables displaying the information shown in the charts. Based on your points above, I can see this generating a .mht file that's far too big if I use decks!

Cheers

James

MFGF

Quote from: GoKartMozart on 30 Oct 2017 11:51:37 AM

Hi,

With lists and crosstabs, you're basically just splitting up the data sets between different cards, so there's not much of an overhead in using data decks. The big difference comes with charts and maps. Each chart/map is included as a pre-rendered png image, and it's these that really bloat up the size of your rendered report.

It's possible to filter a list with controls rather than putting it onto a data deck, but in terms of runtime efficiency it's probably not as optimal, as you're having to filter the list when consuming the report rather than just jumping to a pre-rendered section of the list on a card.

Cheers!

MF.
Meep!

GoKartMozart