
Cognos for a Fact Table with hundreds of millions of rows

Started by Enrique, 07 Jul 2017 06:37:40 AM


Enrique

Hi Experts,

I wonder whether Cognos can be competitive on a star schema whose fact table has hundreds of millions of rows. From my experience working with Cognos, I suspect it is not the right tool, but I would like to hear other opinions.
Up to what data volume would Cognos still have good performance?

Thanks in advance

Enrique

bdbits

You've not provided anywhere near enough information. And it is most likely to be a matter of database performance anyway, assuming a competent modeller with a well-defined model.

Assuming relational... Server resources? Server configuration? What database? Database configuration? Row size? Indexes? etc.

I have personally created data warehouses with multiple fact tables of 10s of millions of rows, biggest single table around 70M as I recall, and performance was fine. We did have decent hardware and a very competent DBA who knew how to take advantage of database features for performance.

dougp

bdbits nailed it. 
I have competent DBAs.  We have one fact that has about 70M rows.  That database has about a dozen facts.  It's running alongside 8 other, similar databases on newer hardware.  The DBAs have employed performance-enhancing methodologies like indexing and scheduled statistics rebuilding, as well as database partitioning.  The database server is not causing any slowdowns.
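To make the "indexing and scheduled statistics rebuilding" concrete, here is a minimal sketch in SQL Server syntax; the table and column names are hypothetical, not from any actual system mentioned in this thread:

```sql
-- Hypothetical fact table; names are illustrative only.
-- A covering index on the common filter column of a large fact.
CREATE INDEX IX_FactSales_DateKey
    ON dbo.FactSales (DateKey)
    INCLUDE (SalesAmount);

-- Keep the optimizer's row estimates current on a large fact table;
-- typically run from a scheduled maintenance job, not ad hoc.
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;
```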

Let me add a couple of items, though:

First, it depends on what kind of reporting you need.  If you're doing BI (dashboard visualizations, summary tables, etc.), Cognos should perform quite well.  If you are expecting Cognos to produce a CSV file containing 300 columns and 200,000 rows (an output that no human I have ever met could consume), there may be problems.

I have identified a bug in Cognos in the way it handles grouping in lists.  It can generate some horrendous SQL code that can take up to 40 times as long as it should to run (on the database server).  This usually involves detailed data and always involves either using the group span feature or a group header that contains related dimensional data (effectively, a group span).  There are ways around this, but it's irritating.  This has affected a handful of my 14,000 reports.
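For readers who haven't hit this, the shape of the problem looks roughly like the following (illustrative SQL only, not actual Cognos output; table and column names are hypothetical). The first query recomputes the group total once per detail row; the second produces the same result in a single pass:

```sql
-- Inefficient shape: the group total is recomputed via a
-- correlated subquery for every detail row.
SELECT d.OrderKey,
       d.SalesAmount,
       (SELECT SUM(s.SalesAmount)
          FROM dbo.FactSales s
         WHERE s.CustomerKey = d.CustomerKey) AS CustomerTotal
  FROM dbo.FactSales d;

-- Equivalent result in one pass, via a window aggregate.
SELECT OrderKey,
       SalesAmount,
       SUM(SalesAmount) OVER (PARTITION BY CustomerKey) AS CustomerTotal
  FROM dbo.FactSales;
```

On a large fact table the difference between these two shapes can easily account for the kind of 40x slowdown described above.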

CognosPaul

Just chiming in here.

With a decent architecture, competent DBAs, and good hardware, database size really becomes irrelevant. With a good partitioning and indexing strategy the database is only going to be looping through a small fraction of your billion row table.
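As a rough sketch of what such a partitioning strategy looks like (SQL Server syntax; the partition boundaries and table are hypothetical), monthly range partitioning lets queries filtered on the date key touch only the relevant partitions:

```sql
-- Monthly range partitioning: queries filtered on DateKey scan
-- only the matching partitions (partition elimination), not the
-- whole fact table.
CREATE PARTITION FUNCTION pfFactByMonth (int)
    AS RANGE RIGHT FOR VALUES (20170101, 20170201, 20170301);

CREATE PARTITION SCHEME psFactByMonth
    AS PARTITION pfFactByMonth ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    DateKey     int   NOT NULL,
    CustomerKey int   NOT NULL,
    SalesAmount money NOT NULL
) ON psFactByMonth (DateKey);
```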

Cognos is a reporting tool. It generates SQL and sends that to the database. There are some cases where the SQL is sub-par (one query calling multiple nonconformed facts with total(measure for dim) type data items springs to mind), but usually those can be fixed with changes to the metamodel or report.

The biggest database I've worked with had significantly more than 200m rows (200k rows per second at peak, retained for a 5-day window with one day's worth dropped every 24 hours), and the SLA was a maximum of 10 seconds per report.

The fact that Cognos was able to deliver the reports in under 10 seconds wasn't due to Cognos' awesome querying prowess; it was down to the incredibly intelligent DBA architects who designed the system.

Invisi

Totally agree. The biggest fact table at my current client has roughly 88 million rows. The core of its performance is the tuning of the database. The quality of the querying tool is secondary to the quality of the database model and the tuning of the database.