CA - JDBC Connection to .CSV files

Started by adam_mc, 02 Mar 2017 02:41:15 PM


adam_mc

I am in the process of converting all my data sources to be JDBC rather than ODBC.
This is being driven by two parallel tracks:

1. We are taking advantage of IBM's Bridge to Cloud (but I am told this will only support DQM sources).
2. I have a requirement to incorporate an Amazon Redshift source which is DQM only.

I need packages to contain data from multiple sources, so I cannot have multiple FM models.

I have converted most of them over, but I am stuck on which JDBC driver to use for .CSV flat files and how to configure the source in Cognos Administration.
I have downloaded SourceForge's CsvJdbc driver into my <install directory>\drivers folder and stopped/restarted the services, but I am not seeing what I hoped to see in the JDBC drivers list.

Any advice, help, or thoughts would be greatly appreciated.
Thanks in advance,
Adam

MFGF

Quote from: adam_mc on 02 Mar 2017 02:41:15 PM
I am in the process of converting all my data sources to be JDBC rather than ODBC. [...]

Hi,

I've not seen anything in the supported connections list that mentions JDBC connections to csv being supported. Do you have the option to upload the csv files to Cognos Analytics and create a data module over the uploaded files, linking them to your other data?

MF.
Meep!

adam_mc

Thanks MFGF, that's what I was fearing!

I think rather than creating data modules, I am going to load the .CSV source directly into our Netezza data warehouse using DataStage.
That's probably the overall best direction as it will enable as much functionality as possible to be pushed to the database rather than in the reporting layer.
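The load-then-query approach described above can be sketched generically. The following is a minimal illustration in Python, using an in-memory SQLite database standing in for the Netezza warehouse (the real load would be a DataStage job); the table name, columns, and CSV content are all hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical CSV content standing in for the real flat file.
csv_text = """id,region,amount
1,North,100.50
2,South,200.25
"""

# In-memory SQLite database standing in for the Netezza warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")

reader = csv.DictReader(io.StringIO(csv_text))
conn.executemany(
    "INSERT INTO sales (id, region, amount) VALUES (?, ?, ?)",
    [(int(r["id"]), r["region"], float(r["amount"])) for r in reader],
)
conn.commit()

# Once the data is in the database, filtering, joins, and aggregation
# run in the database engine rather than in the reporting layer.
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 300.75
```

The payoff is exactly the one described above: once the CSV rows live in a real database table, query processing can be pushed down to the database instead of being done in the reporting layer.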

I know it's more work, but I think it's the preferred solution.

Thanks again for your feedback,
Adam.

MFGF

Quote from: adam_mc on 03 Mar 2017 07:55:07 AM
Thanks MFGF, that's what I was fearing! [...]

That's a WAY better solution! :)

MF.
Meep!

adam_mc

I am encountering some additional problems now that my sources are defined as JDBC and use a DQM FM model.
This is not necessarily related to the .csv sources, but other data sources too.

Query subjects that previously worked without issue still validate, but fail when I run a test.

I'm not sure if this is the result of ODBC vs. JDBC or CQM vs. DQM (or both), but I am wondering if there is any way, other than trial and error on every query subject, to determine whether I have problems (and where they exist).

Thanks in advance,
Adam.

MFGF

Quote from: adam_mc on 06 Mar 2017 01:00:54 PM
I am now encountering some additional problems now that my sources are defined as JDBC and using a DQM FM model. [...]

Hi,

What errors are you seeing? Is this a new FM model using DQM from the outset, or is it an old CQM model you have switched to using DQM?

MF.
Meep!

adam_mc

This is an existing CQM model that is being switched over to DQM.

Some of the errors I am seeing involve having to cast "flag" fields/columns from an Amazon Redshift source as character.
Also, I am getting errors on what primarily seem to be Date fields/columns from multiple sources (including IBM AS400/IBMi and even SQL Server).
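The kind of explicit cast described above can be illustrated generically. This is a sketch in Python using an in-memory SQLite database in place of Redshift (the table, column, and values are hypothetical); the point is the explicit `CAST ... AS` character conversion that a DQM model may require where CQM previously coerced types implicitly:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical table with an integer "flag" column, standing in for a
# Redshift source where the query engine rejects implicit type coercion.
conn.execute("CREATE TABLE orders (id INTEGER, active_flag INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1), (2, 0)])

# Explicit cast of the numeric flag to character, so comparisons and
# display treat it as a string rather than relying on implicit coercion.
rows = conn.execute(
    "SELECT id, CAST(active_flag AS TEXT) FROM orders ORDER BY id"
).fetchall()
print(rows)  # [(1, '1'), (2, '0')]
```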

My primary DW source (Netezza) seems fine although I have not completed all testing.
However, run times for testing query subjects seem to have increased dramatically.
My test fetches only 25 rows, but it takes so long that I eventually cancel, and then the results appear!

Thanks in advance,
Adam.

the6campbells

You can only use JDBC drivers and data servers which Dynamic Query is certified with.

Attempts to use a JDBC driver which is not supported by Dynamic Query will result in a runtime error being thrown.