Live DataModule from flat files

Started by sdf, 30 Jul 2018 03:41:56 PM

sdf

Hi,

Would it be possible to set up a structure where the source flat files are updated in a repository every morning, which would then refresh the data module and thus update the report?

I'm looking to automate the process. Is there a job that can run this setup?


MFGF

Quote from: sdf on 30 Jul 2018 03:41:56 PM

Hi,

If you are uploading files to Cognos Analytics, there is no process to automate this - it is viewed as an ad-hoc data discovery task rather than a robust, governed method of delivering data. You would be much better off loading the files into a database and pointing your data module at the database tables - updates would then flow through automatically. You would need some sort of ETL process to load the files into your database - that will be your biggest challenge.
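The ETL step described above could be sketched roughly as follows - a minimal example using Python's built-in csv and sqlite3 modules, doing a full refresh of one table per flat file. The file and table names are made up for illustration; a real setup would target whatever database the data module points at.

```python
# Minimal sketch of the ETL step: load a morning flat file into a
# database table so the data module queries the table, not an upload.
# Uses SQLite for brevity; names here are hypothetical.
import csv
import sqlite3

def load_csv_to_table(conn: sqlite3.Connection, table: str, csv_path: str) -> int:
    """Full refresh: recreate `table` from the CSV and return the row count."""
    with open(csv_path, newline="") as fh:
        reader = csv.reader(fh)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        placeholders = ", ".join(["?"] * len(header))
        conn.execute(f'DROP TABLE IF EXISTS "{table}"')  # replace yesterday's load
        conn.execute(f'CREATE TABLE "{table}" ({cols})')
        rows = [tuple(r) for r in reader]
        conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', rows)
    conn.commit()
    return len(rows)
```

A script like this could then be scheduled each morning (cron, Windows Task Scheduler, or an existing ETL tool) so the database tables - and hence the data module and reports - pick up the new data automatically.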

Cheers!

MF.
Meep!

sdf

That was what I thought.
The thing is, I previously wrote a script for API scraping, combined it with an SQL export, and loaded it in a TI process to create/update a cube. I was hoping to do the same with data modules.

But I get it now - it clicked after you mentioned that uploaded files are ad hoc.
For now, I'm guessing the way to do it would be to upload the files again and use the relink feature of the data module.

Thanks MF!

MFGF

Quote from: sdf on 31 Jul 2018 06:49:06 AM

Provided the structure and format of the file remains constant, you can upload new versions which overwrite the original, and there's no need to re-link your data module. If the structure changes, then you need to upload a new file rather than overwriting the original - in this case relinking would be needed.
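The "structure remains constant" condition above could be verified before each re-upload with a quick header comparison - a hedged sketch, assuming the flat files are CSVs with a header row (file names here are hypothetical):

```python
# Check that a new file's structure matches the previous version before
# overwriting the upload: same header means overwrite is safe with no
# relink; a changed header means a new file (and a relink) is needed.
import csv

def same_structure(old_csv: str, new_csv: str) -> bool:
    """True when both files have identical column names in the same order."""
    def header(path: str) -> list[str]:
        with open(path, newline="") as fh:
            return next(csv.reader(fh))
    return header(old_csv) == header(new_csv)
```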

Cheers!

MF.
Meep!

sdf

Oh, so the data module does query the uploaded files even after creation. I thought that after creating the data module it would be standalone.
Good, that'll do, since the flat files will retain the same columns and format and will just be updated from time to time.

I have not gone through data modules yet, but I'm getting there. I just started the migration plan, and everything here has been informative.

Cheers Meep!


MFGF

Quote from: sdf on 31 Jul 2018 07:57:13 AM

When you upload a file, by default it gets stored in the content store database. When you want to access the uploaded file, it is retrieved from the content store and made available as a Parquet file (columnar) to the data module, dashboard, storyboard or report that is using it. When you re-upload the file, the content store is updated with the new data, which is retrieved into Parquet the next time you access it.
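A toy illustration of why a columnar layout like Parquet suits this kind of retrieval - this is pure Python, not Cognos internals or real Parquet, just the idea that data stored one array per column lets a query over one column skip the others:

```python
# Row-oriented records, as they might arrive from a flat file.
rows = [
    {"region": "EMEA", "revenue": 100},
    {"region": "APAC", "revenue": 250},
    {"region": "AMER", "revenue": 175},
]

# Pivot into a column-oriented (Parquet-style) layout: one list per column.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# Summing revenue touches only the "revenue" list, never "region".
total_revenue = sum(columns["revenue"])
print(total_revenue)  # 525
```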

Cheers!

MF.
Meep!
