
What's the easiest way to add minutes/hours to a cube?

Started by Suraj, 04 Oct 2013 01:33:52 PM


Suraj

It's been a while since I last worked on Transformer cubes.
I'm creating a custom time dimension that includes minutes and hours.
The data source is a text file that contains a date-time field.
What's the easiest way to do this so we can analyze data by minute in Analysis Studio?
Thanks

cognostechie

You can run the metadata wizard in Framework Manager to bring in that text file, then use the extract function to separate hours and minutes into another column. Export that file again as CSV and use it in Transformer, putting that column under the 'Day' level of the Time Dimension. You will need to create unique keys in Transformer and assign them to the 'Source' and 'Category Code' properties, because the same hour (ex: 14) can belong to multiple dates; otherwise you will get a uniqueness violation error and the process will stop.

This also means that your fact queries need to have the hours and minutes in separate columns too, otherwise the data won't roll up properly.
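The preprocessing step above could also be done outside Cognos entirely. A minimal sketch, assuming the source text file is a CSV with a `timestamp` column in `YYYY-MM-DD HH:MM:SS` format (the column name and format are assumptions, not from the thread): split the date-time into hour/minute columns and build a unique key combining date and hour, so that the same hour on different dates stays distinct, as cognostechie describes.

```python
# Hypothetical preprocessing sketch: split a date-time column into
# hour/minute columns and add a unique "hour_key" (date + hour) so
# hour 14 on different dates never collides in Transformer.
import csv
from datetime import datetime

def split_datetime(in_path, out_path, dt_column="timestamp"):
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        fields = reader.fieldnames + ["hour", "minute", "hour_key"]
        writer = csv.DictWriter(fout, fieldnames=fields)
        writer.writeheader()
        for row in reader:
            dt = datetime.strptime(row[dt_column], "%Y-%m-%d %H:%M:%S")
            row["hour"] = dt.hour
            row["minute"] = dt.minute
            # Unique key, e.g. "2013-10-04-14" for 2 PM on 4 Oct 2013
            row["hour_key"] = f"{dt.date()}-{dt.hour:02d}"
            writer.writerow(row)
```

The `hour_key` value would go to the 'Source' and 'Category Code' properties mentioned above.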

Suraj

Thanks cognostechie.
The source file changes every 10 minutes, so using it in FM and exporting is not so straightforward.

cognostechie

In that case, you can just use a report/package as the source for Transformer which is actually a better way to build the cube. How often the cube should be refreshed is going to be another decision. It depends on how recent the data should be in the cube.

Suraj

The source file is generated by a different app every 10 minutes, and the cube also needs to be refreshed every 10 minutes.
Basically, there is no data source to create a package from, since the incremental CSV file needs to be appended, and Cognos is notorious for keeping files in cache for a long time.
I'm trying a few different approaches using calculations within Transformer.
Hopefully, something will work without involving a chain of apps.
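For the append problem, one option outside Cognos is a tiny script that runs after each 10-minute extract and folds the new rows into one growing cumulative file that Transformer reads. This is only a sketch of that idea, not something from the thread; file names and the single-header-row assumption are hypothetical.

```python
# Hypothetical append step for the 10-minute incremental file: add the
# rows from the latest extract to a cumulative CSV, writing the header
# only once, so Transformer always sees a single growing source file.
def append_increment(increment_path, cumulative_path):
    with open(increment_path, newline="") as inc:
        lines = inc.readlines()
    header, body = lines[0], lines[1:]
    try:
        with open(cumulative_path, "r", newline="") as cum:
            has_data = bool(cum.readline())   # header already present?
    except FileNotFoundError:
        has_data = False                      # first run: file not yet created
    with open(cumulative_path, "a", newline="") as cum:
        if not has_data:
            cum.write(header)
        cum.writelines(body)
```

Scheduling it every 10 minutes (cron, Task Scheduler) would keep the cumulative file current without a chain of apps inside Cognos itself.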

bdbits

Does it have to be a cube? Have you considered DMR?

I say this because of your refresh requirements. It seems to me you would want an ETL process to run and populate a data warehouse, over which you have a DMR model. This can be used in Analysis Studio.

Just a thought. I realize there may be other reasons you need a cube.

navissar

I agree with bdbits: I would try my best not to use Transformer for this, for the following reason:
1. Size of cube: A day has 60 × 24 = 1,440 minutes. That's 1,440 new rows every day. After one month (30 days), your cube will have to generate measures for 30 days × 1,440 minutes = 43,200 rows before we even consider any other dimensions. After a year, your cube will generate over half a million rows just from date-time. If you add just one more dimension with 5 categories, you're already over 2 million. Transformer cubes tend to cough and hiccup when built and be slow to query once they reach these sizes. Remember that traditional OLAP is meant for aggregates, not for row-level data.
I'd go with one of the following solutions:
1. Create an ETL process that will load the time dimension along with the fact data into a star schema, and model it as a dynamic cube or DMR (that assumes an OLAP structure is an absolute requirement; if it isn't, you could simply use a relational model).
2. If you happen to have access to MS SQL Server 2012 and up, you could use SSAS Tabular mode for this. Being in-memory and tabular, it will hold your data nicely and build at a reasonable pace.
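navissar's size arithmetic can be checked with a quick back-of-the-envelope calculation (the 30-day month and the single extra 5-member dimension are the figures assumed in the post):

```python
# Category-count growth at minute grain, per navissar's estimate.
minutes_per_day = 60 * 24             # 1,440 new time categories per day
per_month = minutes_per_day * 30      # 43,200 after a 30-day month
per_year = minutes_per_day * 365      # 525,600 -- over half a million
with_extra_dim = per_year * 5         # 2,628,000 -- over 2 million with
                                      # one extra 5-category dimension
print(per_month, per_year, with_extra_dim)
```

The point is that the category count grows linearly with elapsed time and multiplicatively with each added dimension, which is exactly the workload classic Transformer cubes handle worst.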

nybble

Have you tried creating a calculated column in Transformer to extract the hours/minutes?