Exporting to Excel from Report Studio

Started by pdaprem, 26 Mar 2012 02:22:42 PM

pdaprem

I have a Report Studio report which pulls in 21,760, 29,784, 42,050 or 52,750 rows based on a date setting. The report is set to display 20 rows per page. With all runs I can view every row via HTML and export the data to CSV within a window of 3.5 to 9.5 minutes. If I try to export any of these result sets to Excel, the export never completes; it times out against a time governor imposed at the system level. The report has 123 columns of data. It is unclear to me what is causing the inability to export the data to Excel 2007 format (also tried 2002). Are there settings I need to establish at either the Framework Manager model or report query level? I'm doing most of my testing in Firefox 3.6.2 but have also tried IE 8 with the same results: can't export to Excel.

Any suggestions/recommendations would be appreciated!

Grim

What about CSV, or PDF?

Here are the Cognos 10 documented Excel limitations.
http://publib.boulder.ibm.com/infocenter/cbi/v10r1m1/index.jsp?topic=%2Fcom.ibm.swg.ba.cognos.ug_cr_rptstd.10.1.1.doc%2Fc_excel_limitations.html

You could also try the following Parameter to force Cognos to spit out an actual "XLS" file. Questionable if it will help, but worth a shot.
http://www-01.ibm.com/support/docview.wss?uid=swg21342065
"Honorary Master of IBM Links"- MFGF
Certified IBM C8 & C10 Admin, Gamer, Geek and all around nice guy.
<-Applaud if my rant helped! 8)

pdaprem

PDF isn't a viable option since users need to be able to manipulate the data in a spreadsheet. CSV has an issue I can't seem to resolve: for a number of my data fields it drops the leading zeroes, even though all of those fields are defined as varchar. If I can figure out a systematic way to keep CSV from dropping the leading zeros, then it would be a viable option.

Hoping there is someone out there with a viable, workable solution. I'm trying to move my application from Brio to Cognos, but the move is in jeopardy if data pulls that fit within Excel's row limits can't be exported before hitting the timeout value. I feel fairly comfortable with the basic model since I can get up to 52K records to display via HTML and export to CSV. So the question is what to do to get it to play nicely with Excel?

Lynn

I suspect that your leading zeros are not dropped at all. If you open the file in a text editor you will probably see them. When you open the CSV in Excel it drops them from the display, but they are there.

When you open the CSV file in Excel and run the import wizard, you need to specify the type for those fields as Text rather than the default of General.

pdaprem

Lynn, as you indicated, the leading zeroes are indeed there if I open the file with a text editor. Opening the file in Excel, however, doesn't bring up any wizard that would allow changing formats for selected columns; it just opens the file straight into Excel.

Lynn

If you rename the extension to .txt instead of .csv the wizard will come up automatically. Otherwise, go to the Data tab and use the option to get external data from text to open the file using the wizard.

I have Excel 2007.

If there is an easier way I've not found it :) Certainly would appreciate anyone else's input if there is a better way.
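
If scripting is an option at your site, a small Python sketch like the one below (assuming the pandas and openpyxl libraries, and a hypothetical report.csv file name) would convert the CSV to .xlsx with every column kept as text, so the leading zeros survive and nobody has to walk the wizard across 123 columns:

# Minimal sketch, assuming pandas + openpyxl are installed and the
# Cognos CSV export has been saved as report.csv (hypothetical name).
import pandas as pd

# dtype=str keeps every column as text, so leading zeros in the
# varchar fields are not re-interpreted as numbers.
df = pd.read_csv("report.csv", dtype=str, keep_default_na=False)

# Write to .xlsx; the cells arrive in Excel already typed as text.
df.to_excel("report.xlsx", index=False, engine="openpyxl")

Just a sketch of the idea, not something I've run against your data.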

Arsenal

Is there a Cognos-based error message associated with this timeout, or is something killing your connection automatically if the report takes a while to return?

Try this: in your FM model's governor settings, restrict the maximum rows returned to, say, 20, republish, and then rerun the report to Excel. Does Excel open up now?

pdaprem

Arsenal, the timeout is based on a governor which automatically kills interactive runs that take more than 10 minutes to retrieve the data. I can pull in up to 52K rows of data and export them to CSV within the time limit, but starting with a pull of around 12K records it times out going to Excel. I just timed a pull of 4K records which took about 46 seconds to CSV but 4:50 minutes to Excel.

For all of the pulls I've tested so far I have set the max rows per page at the report level to 20. Given that, is it really necessary to change the FM model governor settings to reflect 20 rows as the max?

pdaprem

Lynn, I renamed the file to a .txt extension but the wizard does not open. If I go to Data > Import External Data > Import Data, the wizard does open. The frustration, though, is that one has to scan across each column and change the column data format to Text. On screen 3 of the wizard (see attached .jpg) it implies that you can indicate what identifies a numeric field. My data columns are varchar, so I'm really confused as to why Excel appears to be treating them as numeric.


Lynn

Overall my impression of your dilemma is that you are using a BI reporting tool in order to perform data extraction. Somewhat like pounding a nail in the wall using the heel of a shoe instead of a hammer. It might work, but perhaps not the best choice of tool for the job.

But sometimes we gotta do what we gotta do.

If you think about producing a CSV file...there is no datatype explicitly defined in a CSV. It doesn't matter what your database calls it nor what you specify in your report. It is just a text file and Excel decides that what looks like a number must be a number and what looks like a date must be a date and so forth.
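
As a tiny illustration (made-up values, nothing from your report), reading the file back in Python shows the text exactly as it was written; any loss of the zeros happens only when Excel guesses the types on open:

# Small illustration with made-up values: the CSV holds the text as-is.
import csv, io

sample = "account,amount\n00123,10\n00077,20\n"
for row in csv.reader(io.StringIO(sample)):
    print(row)   # ['account', 'amount'], then ['00123', '10'], ['00077', '20']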

Producing directly to Excel instead of CSV will preserve your leading zeros. You might examine the SQL being generated to see if you can tune things on the database side. Although the data volumes and number of columns you describe are high for a reasonably consumable business report, they don't sound so huge as to warrant the run times you describe. Can you see what it takes to pull that amount of data direct from the database without going through Cognos?

Grim

Quote from: Lynn on 28 Mar 2012 07:32:24 AM
Overall my impression of your dilemma is that you are using a BI reporting tool in order to perform data extraction. Somewhat like pounding a nail in the wall using the heel of a shoe instead of a hammer. It might work, but perhaps not the best choice of tool for the job.

But sometimes we gotta do what we gotta do.

I agree with Lynn. If your backend DB is MS SQL your DBA can export the data you are looking for direct to Excel format way faster. Might be an option you want to explore.
"Honorary Master of IBM Links"- MFGF
Certified IBM C8 & C10 Admin, Gamer, Geek and all around nice guy.
<-Applaud if my rant helped! 8)

pdaprem

Grim/Lynn, thanks! Yes, I'm using Cognos Report Studio, a BI reporting tool, because I need to give my end users the ability to pull interactive reports. Exporting data straight from the DB2 tables isn't an option. I have to figure out either (a) why it takes so long to export to Excel (another example: a pull of 4,114 rows takes 46 seconds to CSV but 4:50 minutes to Excel 2007) or (b) how to establish, as painlessly as possible, a process for users to run the CSV export and then properly use the wizard to change the specific column data types.

There might be things I can do in the FM model or in the Report Studio report to help with the Excel export. When I can go to CSV in 46 seconds and it takes 4:50 minutes to go to Excel, that doesn't sound like an FM model issue to me, but if I'm wrong I'm not too proud to accept suggestions.

Lynn

I'm not certain about this, but I think that CSV is basically a raw data pull from your native database query, whereas Excel output imposes local processing along with formatting and such.

Can you compare the native and Cognos SQL to see if there are any dramatic differences? Do you have any formatting specified in the layout?

I still say there are issues on the database side if a paltry 4 thousand rows takes 46 seconds!! Can you have a DBA grab the SQL that gets sent to the database and ask their assistance? You could do this for both output formats to see if the SQL being generated for one format vs. the other varies in some odd way.
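
If it would help to put a number on the raw database time, something along these lines would do it (a rough sketch assuming the Python ibm_db driver for DB2; the connection string and SQL are placeholders, so paste in whatever SQL Cognos actually generates):

# Rough sketch, assuming the ibm_db driver; connection details and the
# SQL statement are placeholders -- substitute the Cognos-generated SQL.
import time
import ibm_db

conn = ibm_db.connect(
    "DATABASE=MYDB;HOSTNAME=dbhost;PORT=50000;PROTOCOL=TCPIP;UID=user;PWD=pwd",
    "", "")

sql = "SELECT * FROM myschema.mytable"   # replace with the real query

start = time.time()
stmt = ibm_db.exec_immediate(conn, sql)
rows = 0
while ibm_db.fetch_tuple(stmt):
    rows += 1
print(rows, "rows fetched in", round(time.time() - start, 1), "seconds")

If the raw pull is quick, then the bottleneck is in the Excel rendering rather than the query.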

pdaprem

Lynn, I'll revisit the database side. Thanks!