BIBus process consuming more than 2GB RAM and report failing

Started by sunosoft, 22 Apr 2014 01:32:06 AM


sunosoft

Hi All,

I'm having an issue running a report with large data to Excel/PDF (around 2 lacs rows).

I monitored the related BIBus process for this report, and its memory consumption goes above 2 GB, at which point the report fails with an out-of-memory error, i.e. BIBus runs out of memory. And it's obvious that it will fail if BIBus consumes more than 2 GB.
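
For reference, a minimal sketch of this kind of process watch, assuming a Windows host with the psutil package installed (the BIBusTKServerMain.exe process name and the 90% warning threshold are assumptions, not anything from IBM documentation):

```python
# Watch resident memory of BIBus processes and flag any that approach
# the 2 GB ceiling of a 32-bit process. Sketch only; adjust names to taste.
import time

import psutil

PROCESS_NAME = "BIBusTKServerMain.exe"  # assumed Cognos report server process
LIMIT_BYTES = 2 * 1024 ** 3             # 2 GB ceiling for a 32-bit process


def watch(interval_seconds: int = 5) -> None:
    while True:
        for proc in psutil.process_iter(["name", "pid", "memory_info"]):
            if proc.info["name"] == PROCESS_NAME:
                rss = proc.info["memory_info"].rss
                warn = "  <-- nearing the 2GB limit!" if rss > 0.9 * LIMIT_BYTES else ""
                print(f"pid {proc.info['pid']}: {rss / 1024 ** 2:,.0f} MB{warn}")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    watch()
```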

Is there anything that can be done to execute this report successfully? Would any report-level tuning help? I think 2 lacs rows is really not that much data for a Cognos report to fail on.

The report consists of a crosstab.
Thanks
SK

MFGF

Quote from: sunosoft on 22 Apr 2014 01:32:06 AM
I'm having an issue running a report with large data to Excel/PDF (around 2 lacs rows).
I think 2 lacs rows is really not that much data for a Cognos report to fail on.

What does 2 lacs mean? It's not a word or abbreviation I recognise. I googled it and only got "League Against Cruel Sports", and I don't think that's what you are referring to here :)

It sounds like you may have a massive crosstab, though. The obvious question is "why"? What use is a huge crosstab like this to a user? Are you trying to use Cognos as a data dump utility?

MF.
Meep!

sunosoft

"League Against Cruel Sports" :) :) :)

Actually I meant to say "lakh": 10 lakhs = 1 million, that's the conversion I got from Google :)

Yeah, actually it's a monthly report which seems to be producing huge data. The users don't want any filter in it. As an admin, I gave the developer the same answer you gave above :).

But I was still trying to find something to make it work so that the users will be happy. :)
Thanks
SK

Grim

First off, is this a 32-bit or 64-bit install? If it's 64-bit then the 2GB memory limit doesn't really play into this. That was my first thought...
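
If it isn't obvious from the install media which one you have, here is a minimal sketch for checking whether the running process itself is 32-bit or 64-bit on 64-bit Windows, using the Win32 IsWow64Process call via ctypes (the process name and the psutil dependency are assumptions on my part):

```python
# Report whether each BIBus process runs as 32-bit (under WOW64) or 64-bit.
# Sketch for 64-bit Windows; on a 32-bit OS every process is 32-bit anyway.
import ctypes
import ctypes.wintypes

import psutil

PROCESS_QUERY_LIMITED_INFORMATION = 0x1000


def is_32bit(pid: int) -> bool:
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, False, pid)
    if not handle:
        raise OSError(f"cannot open pid {pid}")
    try:
        wow64 = ctypes.wintypes.BOOL()
        if not kernel32.IsWow64Process(handle, ctypes.byref(wow64)):
            raise OSError("IsWow64Process failed")
        return bool(wow64.value)  # True => 32-bit process on a 64-bit OS
    finally:
        kernel32.CloseHandle(handle)


for proc in psutil.process_iter(["name", "pid"]):
    if proc.info["name"] == "BIBusTKServerMain.exe":  # assumed process name
        bits = 32 if is_32bit(proc.info["pid"]) else 64
        print(f"pid {proc.info['pid']} runs as a {bits}-bit process")
```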


Now, as for the report itself, I'm with you guys. 2 million rows... WHO IN THEIR RIGHT MIND IS GOING TO LOOK AT 2 MILLION ROWS OF STUFF/JUNK/AND MORE CRAP!?¿ That's like saying "Why can't I get freakin' sharks with freakin' laser beams on their heads?". Filter the damn report. Honestly, I would just say to the dev, "Yeah dude, you're totally breaking the server limits... you're going to have to filter stuff out, yo! kthxbai".  ;)

PS. COGNOS IS NOT A *&%$#@'n ETL TOOL.  >:(
(Sorry, just a pet peeve of mine from past experiences with bad devs and worse end users dumping everything out of Cognos to Excel/CSV. Like dumping everything out of a cube. /facepalm)
"Honorary Master of IBM Links"- MFGF
Certified IBM C8 & C10 Admin, Gamer, Geek and all around nice guy.
<-Applaud if my rant helped! 8)

Suraj

2 lakh rows is 200,000 rows, not 2 million, and we have many reports around that size all the time.
It's not about going row by row, but about looking at the summary and doing more analysis, so they don't have to re-run the detail report if they need to focus on something.

As far as memory goes, check whether it's the BIBus memory or the temp space that is running out (see the sketch below).
If it's BIBus, either upgrading to 64-bit may help, or make the report smaller.
If it's temp space, increase the temp space or point it to a location with more room.
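
To tell the two apart, something like this minimal sketch can watch the free room in the Cognos temp directory while the report runs (the path is an assumption; use whatever 'Temporary files location' is set to in Cognos Configuration):

```python
# Poll free space in the Cognos temp directory during a report run.
# If free space collapses before the failure, temp space is the culprit.
import shutil
import time

TEMP_DIR = r"C:\Program Files\ibm\cognos\c10_64\temp"  # assumed location

while True:
    usage = shutil.disk_usage(TEMP_DIR)
    print(f"free: {usage.free / 1024 ** 3:.1f} GB of {usage.total / 1024 ** 3:.1f} GB")
    time.sleep(5)
```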

bdbits

<soapbox> Even 200,000 rows is more than you should be dumping to a spreadsheet, in my opinion. If you cannot efficiently drill into details (i.e. focus) with the Cognos tools, and do it better than you can with Excel, I would argue the modeling is inadequate or the users need training. Why even bother with Cognos if you are just dumping it out to Excel? Use an Excel data connection straight to the data source and skip Cognos. </soapbox>

All that said, if you have to do this... 200,000 rows could still be exhausting memory. There are so many variables that it is difficult to say what the source is here - it could be the size of the rows, the formatting, or the way the data is queried/fetched/filtered/sorted/etc. There is also overhead in outputting to Excel or PDF format. So as suggested, you either have to reduce the result set size or increase the memory available to BIBus by going 64-bit. The 2GB limitation is due to 32-bitness and you cannot work around it. Temp space could be a factor, but it sounds like a memory limitation from what was said.

sunosoft

Thanks all for your posts on this.

I'm checking with the users about reducing the data; let's see what happens.

Also, from this discussion it seems I have a different understanding of 32-bit and 64-bit installs; please let me know whether I am correct.

The install we have is 64-bit. However, my understanding was that even if the BI installation is 64-bit, the report server execution mode defined in Cognos Configuration is 32-bit (32-bit is needed so that CQM reports can be executed). Hence the related BIBus process is also 32-bit, and it should not go above 2GB of memory utilization.
In order to make BIBus use more than 2GB, we need to set the report server execution mode to 64-bit, and we will need to use DQM.

The above is my understanding of 64-bit and 32-bit Cognos. Let me know if this is correct. Sorry for moving a bit away from the original post, but I wanted to clear this up.

Thanks
SK

navissar

Right, let's start from the top.
The BiBus in 32-bit installations and 64-bit installations will behave the same, because the BiBus is a 32-bit process even in 64-bit installations: 2GB upper limit. After that, a dump is thrown.
Now, there could be several reasons the BiBus bloats up like this. The BiBus can handle 200K records as such; I just ran a report with well over a million rows to try it, with no problem. Of course, it may be that your data contains 200K HUGE rows, which would push the returned query size over 2GB, but I think that is highly unlikely.
So one might ask: how come the BiBus fails, then? Well, this has to do with many things; the most common explanations are the complexity of the query, the existence of master-detail relationships in the report, and local processing.
Also, since this is a crosstab report, check to see that you have free space on your Cognos hard drive. Trust me on that one, it could make a difference.
As any beginning business analyst knows, there are 3 ways to deal with insufficient resources, so there are 3 ways I see of dealing with this:
1. The "add more resources" solution: As any Lannister knows, any problem can go away if you pour enough resources on it. Make the whole thing DQM. Set up your data source for DQM access, publish the package as DQM, and when you run the report it will no longer use the BiBus at all; it will use the DQM engine, which isn't limited by puny 32-bit memory limitations (assuming you have a 64-bit environment).
2. The "cut costs" solution: Check your report. Simplify it. Create an aggregated table in the database which already does all the heavy lifting of joins etc. and pull your report from there (see the sketch after this list); as much as possible, get rid of any master-detail relationships, and be sure not to do any local processing.
3. The "move to another department" solution: Crosstab reports over relational data create a mini-cube on the Cognos server (which is why clearing hard drive space might help). So you could create a Transformer cube or dynamic cube and have the report run against that. A Transformer cube would save the BiBus the need to build the in-server cube and lighten the burden.
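
To illustrate item 2, a minimal sketch of the pre-aggregation idea: build the summary once in the database so the report no longer joins and aggregates detail rows at run time. All table and column names here are hypothetical, and sqlite3 merely stands in for your warehouse's own driver:

```python
# Build a pre-aggregated monthly summary table so the report reads a few
# thousand summary rows instead of crunching 200K detail rows per run.
import sqlite3  # stand-in; use your actual warehouse driver in practice

conn = sqlite3.connect("warehouse.db")  # hypothetical database
conn.executescript("""
    DROP TABLE IF EXISTS sales_monthly_agg;
    CREATE TABLE sales_monthly_agg AS
    SELECT region,
           product_line,
           strftime('%Y-%m', order_date) AS order_month,
           SUM(quantity)                 AS total_quantity,
           SUM(revenue)                  AS total_revenue
    FROM   sales_detail
    GROUP  BY region, product_line, order_month;
""")
conn.commit()
conn.close()
```

Point the report's query subject at sales_monthly_agg (or its equivalent in your warehouse) and the crosstab only ever sees pre-summarized rows.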

sunosoft

Hi Nimrod,

Thanks for such a nice explanation.

Free space on the Cognos server seems to be enough (around 25GB).

There is no master-detail relationship in the report. However, I just saw that local processing is enabled.

I will check on the DQM and cube options.
Thanks
SK

MMcBride

I run Cognos on a 64-bit AIX platform - 80GB of memory, terabytes of DASD - and yes, I occasionally see this error as well.

I had an open ticket with IBM for over 6 months with no resolution before I was forced to close the ticket (on my end, not IBM's).

What I did find was a rather quick and easy workaround.

First, a little conjecture on my part... the BIBusTKServerMain processes are not unique to individual sessions, so there may be some session information that isn't cleaned up properly - I have never been able to prove this...

But what I do when I see this error is restart the dispatcher via the Admin panel.
It takes seconds - "Stop dispatcher immediately" - and I make sure there are no active or background sessions running. Once I get the "Dispatcher stopped" message, I click "Start dispatcher".

I then rerun the offending report and it succeeds 100% of the time.

I have on very rare occasions seen the out-of-memory error be consistent for certain reports - usually ones converting large numbers of rows to Excel. Switching these reports to CSV usually takes care of that for me - if it doesn't, then I go back and force a change to the report: "You are out of bounds - limit the report or it will not run".

However, 99% of the time when I get the out-of-memory error, restarting the dispatcher solves it.
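
For anyone who wants to check the cleanup conjecture above, here is a minimal sketch that logs per-process BIBus memory to a CSV over time; if idle memory keeps ratcheting up between reports, a dispatcher recycle is due (psutil and the process name are assumptions):

```python
# Append a timestamped memory sample for every BIBus process once a minute,
# so growth between report runs can be graphed later.
import csv
import time
from datetime import datetime

import psutil

with open("bibus_memory_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        now = datetime.now().isoformat(timespec="seconds")
        for proc in psutil.process_iter(["name", "pid", "memory_info"]):
            if proc.info["name"] == "BIBusTKServerMain.exe":  # assumed name
                rss_mb = proc.info["memory_info"].rss // (1024 * 1024)
                writer.writerow([now, proc.info["pid"], rss_mb])
        f.flush()
        time.sleep(60)
```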

xplorerdev

Quote from: Nimrod Avissar on 23 Apr 2014 02:16:41 AM
Make the whole thing DQM. Set up your data source for DQM access, publish the package as DQM, and when you run the report it will no longer use the BiBus at all; it will use the DQM engine, which isn't limited by puny 32-bit memory limitations (assuming you have a 64-bit environment).

Hi Nimrod,

I have a 64-bit Cognos BI 10.2.1 FP4 setup on a Windows 2008 server. My Cognos installation is using the bundled Java. The Framework Manager model is DQM. The data source is DQM. I have set the 'Report Server Execution Mode' to 64-bit in Cognos Configuration. Based on your comment, my reports should not be using the BIBusTKServerMain.exe process at all, right? Am I correct in saying that? If so, then how do I explain the spikes in BIBusTKServerMain.exe every time my reports display results? Am I missing something?

Thanks.
Dev