Transformer error while checking data in data source viewer - TR3311 / RQP-DEF-0177

Started by damzI, 04 Feb 2017 10:35:59 AM


damzI

Hello Guys,

This is frustrating. I have a model using a few dimensions and two facts (f1 and f2). I developed and tested the model in the dev environment, using Report Studio reports as data sources, with one report serving as the data source for each dimension/fact.
Now the model has moved to prod, and when I check the data source viewer for each data source, ALL of them work EXCEPT the one for f2.

Here is the error that I get...

TR3311 An error occurred during data retrieval from the database
RQP-DEF-0177 An error occurred while performing operation 'sqlOpenResult' status='-237'.
UDA-SOR-0005 Unable to write the file

I have tried cleaning up the temp files and have checked the data source in Framework Manager and Report Studio (it runs with no issues in both), but it is in Transformer that I get this error. I am able to view the SQL and native SQL tabs BUT not able to preview the data; it returns the same error as above!

Help!
Thanks, D

bdbits

I suspect this is the key: UDA-SOR-0005 Unable to write the file

Sounds like maybe permissions on ... something.

Have you checked the logs to see what it is doing when the error is encountered?
If you log on to Cognos as the user Transformer is using, does your report run successfully?

prikala

Where do you run Transformer? Does that machine have enough temp space?
I believe data source reports might be processed locally by Transformer.

damzI

Thanks for your consistent responses. I have checked the space, and it does look like I ran out of space, which explains the error.
However, I used the same model with the same number of dimensions/measures (regular and calculated) and the same data load earlier this month, and nothing seems to have changed in the configuration. I have built the cube for 4 years of data without any temp space issues.
The only difference in the current model is an additional column in one of my facts, with a CASE statement and a concatenation with another column (to resolve uniqueness issues). Now I am unable to load the cube with 4 years of data; it runs out of space. I can only load 1 year without hitting the space issue. How can it be that just one calculation has impacted the space? We have about 270 GB of space available, and I think that should be enough. Roughly, the new column looks like the sketch below.
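(Names changed for posting, so treat this as an illustrative sketch rather than the exact production expression.)

    -- Hypothetical version of the added fact column: a CASE plus a
    -- concatenation used to build a unique key. '||' is the ANSI/Oracle/DB2
    -- concatenation operator; SQL Server would use '+' or CONCAT().
    CASE
        WHEN f.region_code IS NULL THEN 'UNK'
        ELSE f.region_code
    END || '-' || f.branch_code AS region_branch_key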
Are there any benchmarks for temp space that I could reference to figure this out? Are there any configurations in Cognos that might help?

Thanks, D!

prikala

Quote from: damzI on 10 Feb 2017 08:47:18 AM
How can it be that just one calculation has impacted the space?
Is it possible that before the new calculation the query was executed entirely by the database, and that after the calculation Cognos is forced to use local processing? For example, loading the whole fact table (or tables) into temp folders and then doing the processing there.

I am not sure how to reliably identify local processing. One sign is if the native SQL contains more than one SELECT.
You could also compare the native SQL queries (as shown by the Transformer data source viewer) before and after adding the new calculation, and see if anything looks suspicious.
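To illustrate (simplified, hypothetical SQL; the table and column names are made up):

    -- Pushed down: one SELECT, the database does the join and aggregation
    SELECT d.year_key, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year_key

    -- Possible sign of local processing: the native SQL contains more than
    -- one SELECT, e.g. Cognos pulls raw rows and joins/aggregates them itself
    SELECT f.date_key, f.amount FROM fact_sales f
    SELECT d.date_key, d.year_key FROM dim_date d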

damzI

Thanks, prikala, and all my friends who responded to my issue,

I wanted to share that I resolved the issue by creating a custom SQL query for each of my data source queries, using the package as my data source, and pushing the processing to the database only. The build time has dropped from 5 hours (for 1 year of data) to 3 hours for 4 years of data. I have 3 facts, the largest being about 130 million rows. The custom SQL looked roughly like the sketch below.
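For anyone hitting the same problem, here is an illustrative version (names changed; the point is that the CASE/concatenation runs inside the single statement the database executes, so nothing has to be spooled to local temp files):

    -- Hypothetical custom SQL query subject in Framework Manager:
    -- the calculation is pushed to the database in one statement.
    SELECT
        f.order_key,
        f.amount,
        CASE
            WHEN f.region_code IS NULL THEN 'UNK'
            ELSE f.region_code
        END || '-' || f.branch_code AS region_branch_key
    FROM fact_sales f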

Thanks! :) - d