Report not showing up although the database has already finished the query

Started by kiran.timsina, 11 Sep 2013 04:41:16 PM


kiran.timsina

I have a report whose query is active for only about 30 seconds on the database server, but the report shows up after 7 minutes or, in the worst cases, throws a CCL out-of-memory error. The report pulls about 2,500 rows of data, and 10 of its 30 metrics are calculated metrics.
I have taken the following actions to try to get rid of the problem:
- Limited the number of rows retrieved to 100
- Tried the different query processing options (Default, Limited Local, Database Only)
Still no progress, although limiting the rows did gain about a minute. I still believe that something which takes 30 seconds to run should not take more than a minute to render on the screen.

I have also checked the server-side configuration, which according to IBM is all good. What other possible reasons could there be for the delay in between? Thanks.

MFGF

It depends on how your report is written. It may be that you have done things in the report that require a vast amount of local processing. You haven't told us anything about your report other than the fact that it has 10 calculated metrics...

MF.
Meep!

kiran.timsina

Actually, it is 30 columns with each column containing 5 metrics in blocks, so it's 150 metrics, and about 33% of them are calculated. Besides that, it's a list report and I have 3 subtotals and a grand total, so there does seem to be some local-processing overhead. What surprises me is that the report used to run within a minute a few months ago, and now it takes 6 or 7 minutes or, in the worst cases, gives a memory error. As I said earlier, the server configuration has remained the same and follows IBM's recommendations.
Is there a way to make it run faster other than removing the subtotals themselves?
Is it okay to leave all the query properties at their defaults?
Thanks for the reply, and please let me know if you need more information.

blom0344

Check the server where the temp files for local processing are stored. If these are very large, then you know what the cause is. Also check the Cognos BIBus process. If it bloats to very high memory values or starts going up and down, then the server is handling a lot..

kiran.timsina

Right now I don't have access to the server settings, but I'll request it. Once I have access, if I do see the unexpected numbers/sizes you mentioned, what's the solution? Do I need to cut calculated metrics or some of the subtotals, or can this be solved on the configuration side, by adding memory or changing settings?

blom0344

You still haven't told us anything about the report / calculations. We're not clairvoyant  ;)

kiran.timsina

- The report is a list report with 30 columns, each column containing 5 metrics in blocks; 33% of them are calculated metrics.
- The calculated metrics are mostly percentage calculations, e.g. (This Year Sales - Last Year Sales) * 100 / Last Year Sales.
- There are 3 subtotals and an overall total.
- The query pulls about 2,500 rows of data with the worst-case filters, and it runs in less than a minute.
I don't think there is anything else special about the report. Please let me know specifically if you need to know anything else  :).
Please also note that the report was running in about 1.5 minutes two months ago, and it is only now that it takes about 7 minutes or throws a memory error.
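To make the shape of those calculated metrics concrete, here is a minimal Python sketch of the percentage calculation quoted above (this is illustrative only, not the Cognos expression language, and the divide-by-zero guard is my addition):

```python
def yoy_growth_pct(this_year, last_year):
    """Year-over-year sales growth as a percentage of last year's sales."""
    if last_year == 0:
        return None  # guard added for illustration: growth is undefined with no base
    return (this_year - last_year) * 100 / last_year

print(yoy_growth_pct(120, 100))  # 20.0
```

Per row, each such metric is a trivial amount of arithmetic; the cost comes from how many of them the report engine evaluates.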

blom0344

I would take a copy of the report and remove the calculations step by step. Performance may stay the same, gradually improve, or improve sharply at a certain step. The 2,500 rows of data are probably the result of a much larger source set? But even then, 2,500 rows would be about 40-50 pages to render..
Monitoring the server may tell you more. We did this quite extensively by running monitors for the duration of a very complex report. A relatively short burst of DB activity was followed by many minutes of local processing, in that case due to repeaters. Is your list a plain one?

bdbits

2500 * 150 = 375,000 calculations, over 40-50 pages. I could see that taking some time, though 7 minutes does seem excessive if you have reasonable CPU and RAM available. Hmmm...
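As a rough back-of-envelope check of that arithmetic (a Python sketch, obviously nothing like the actual Cognos rendering engine, which also has to lay out and paginate the output):

```python
import time

rows, metrics = 2500, 150  # the figures from this thread

start = time.perf_counter()
# one simple percentage-style calculation per report cell
cells = [(r + 1 - r) * 100 / (r + 1) for r in range(rows) for _ in range(metrics)]
elapsed = time.perf_counter() - start

print(f"{len(cells):,} calculations in {elapsed:.3f} s")
```

Raw arithmetic at that volume is sub-second on any modern CPU, which supports the point that minutes of delay must be going into rendering, pagination, or memory pressure rather than the calculations themselves.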

There is a report trace facility you can invoke that will show you the various phases of the report generation and give you pretty detailed information. I would probably talk your server admin into a trace and see if it reveals anything useful. We ran one just the other day, and it was pretty interesting and helped narrow down our issue, which turned out to be database-related. Beware that it will trace all reports running, so you probably want to do it at a time nobody else is running anything. It makes the resulting logs kind of messy to read, though not impossibly so.

If your admin does not know how to do this, let me know and I will look it up, or ask our admin.

kiran.timsina

With all the wonderful feedback, it seems that one probable cause is the number of calculated metrics. I'll trace the server activity while reducing the metrics.

But I still don't get why I'm facing this problem only after 2 months. Cognos has put me in trouble one more time, by showing a performance issue two months after report delivery. If it could not handle that many metrics, I should have hit these issues while I was developing the report. Now I can't go back to the client and say we need to break up the report to reduce the number of metrics. Do you see any reason for the issue showing up so late? What could get worse over time from running a heavy report?

blom0344

No, no and no.. Performance in a production environment is the sum of many things, not just the raw report performance you see while developing. The load on the server may be quite different. The DB server may be quite a bit slower when it has to deal with many more queries/loads. We often have trouble within customer environments, and sometimes it's even a matter of congestion on the customer's network or faulty switches etc. that heavily influences the ultimate report performance. But even then, running the report in the background or setting up a good schedule may make things acceptable.

kiran.timsina

After some further tests and observations, I've come across several findings. I feel like I'm one step closer to a solution  :). It would be great if you could give me your feedback on each of them.

1. The subtotals seem to be the major culprit. I was able to drop the report run time from 7 minutes to 2 minutes by removing all subtotals. The CSV file size dropped from 4.1MB to 1.3MB, because the data for the subtotals was being repeated on every row. Can I tell Cognos to calculate the subtotals locally rather than send them to the database?

2. The other issue I found on the server is that it was consuming 3GB of memory to process a file of less than 10MB. What is Cognos doing with 3GB of memory, and why is it not released even 10 minutes after the report has rendered?

3. The server has two processors (2.2 and 2.9GHz) and 16GB of RAM, but the application server is allocated only 1GB. Is this enough?

Thanks.
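Regarding point 1 above, the CSV bloat can be pictured with a small Python sketch (a toy stand-in for the report's detail rows, not anything Cognos-specific): carrying the group subtotal on every detail row multiplies the data, whereas computing each subtotal once per group after the detail rows arrive does not.

```python
from itertools import groupby
from operator import itemgetter

# toy stand-in for the report's detail rows, already sorted by group
rows = [("East", 100), ("East", 150), ("West", 200)]

# What the 4.1MB CSV was effectively doing: the group subtotal repeated on every row
repeated = [(region, sales, sum(s for r, s in rows if r == region))
            for region, sales in rows]

# Computing each subtotal once per group instead, from the detail rows alone
subtotals = {region: sum(s for _, s in group)
             for region, group in groupby(rows, key=itemgetter(0))}

print(repeated)   # every row carries its subtotal
print(subtotals)  # one value per group
```

Since each subtotal is derivable from the detail rows already retrieved, there is no need for the query to bring the subtotal back on every row.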