Fact build does not deliver any data?

Started by larus, 20 Aug 2012 07:46:36 AM


larus

As a Data Manager noob, I created a fact build based on the Data Manager DS_Lessons sample catalog. When I execute this fact build (named 'Additional'), no data is delivered from the source into the data mart tables (fact or dimension tables). I tried logging this behaviour, but the logs give no hints about where the problem lies. Any ideas how to track this down?

eknight

There should be log output going to standard out as well as logs produced by the build and the job. See if you can find these and post the contents.

larus

Thanks, here is the output from the log file (standard out):

Data Manager v10.1.6235.143 -- (C) Copyright IBM Corp. 1999, 2011
databuild -- start run on WIN-SHGCN3OLSCR (20-Aug-2012 16:08:29)
[PROGRESS   - 16:08:30] Accessing catalog
[INTERNAL   - 16:08:30] Exported 1 Catalog component(s)
[INTERNAL   - 16:08:30] Exported 1 Preference component(s)
[INTERNAL   - 16:08:30] Exported 4 Database Alias component(s)
[INTERNAL   - 16:08:30] Exported 2 User Defined Function component(s)
[INTERNAL   - 16:08:30] Exported 4 Reference Dimension component(s)
[INTERNAL   - 16:08:30] Exported 26 Template component(s)
[INTERNAL   - 16:08:30] Exported 4 Hierarchy component(s)
[INTERNAL   - 16:08:30] Exported 3 Lookup component(s)
[PROGRESS   - 16:08:30] Exporting required components
[PROGRESS   - 16:08:30] Catalog banner -> COGNOS INTERNAL USE ONLY
[PROGRESS   - 16:08:30] Build 'Additional' - run id 19, audit id 26 (pid 3912)
[VARIABLE   - 16:08:30] TRACE_VALUES = 'PROGRESS,DETAIL,INTERNAL,SQL,EXECUTEDSQL,USER,VARIABLE'
[VARIABLE   - 16:08:30] TRACE_FREQUENCY = 5000
[VARIABLE   - 16:08:30] OTRACE_FREQUENCY = 50000
[VARIABLE   - 16:08:30] MEMBER_FREQUENCY = 10000
[VARIABLE   - 16:08:30] DM_CST_NEW_COOKIE_PASSPORT = '1'
[VARIABLE   - 16:08:30] DS_LOB_MAXMEM_SIZE = '65536'
[VARIABLE   - 16:08:30] DS_LOB_BUF_SIZE = '8000'
[VARIABLE   - 16:08:30] DS_DBMS_TRIM = 'SPACE'
[VARIABLE   - 16:08:30] DS_UDA_FETCH_ROWS = '50'
[VARIABLE   - 16:08:30] DS_UDA_BULKWRITE_ROWS = '50'
[VARIABLE   - 16:08:30] DS_RUN_TIMESTAMP = 2012-08-20 16:08:30
[VARIABLE   - 16:08:30] DS_MAX_RECURSION = '100'
[VARIABLE   - 16:08:30] DS_BUILD_NAME = 'Additional'
[VARIABLE   - 16:08:30] DS_BUILD_FNAME = 'Additional'
[VARIABLE   - 16:08:30] DS_AUDIT_ID = 26
[VARIABLE   - 16:08:30] DS_RUN_ID = 19
[VARIABLE   - 16:08:30] DS_LOG_DIR = '..\datamanager\log'
[VARIABLE   - 16:08:30] DS_LOG_NAME = '..\datamanager\log\Build_Additional_0019.log'
[VARIABLE   - 16:08:30] . REJECT_FILE = 'Additional.rej'
[VARIABLE   - 16:08:30] . TRACE_VALUES = 'PROGRESS,DETAIL,INTERNAL,SQL,EXECUTEDSQL,USER,VARIABLE'
[VARIABLE   - 16:08:30] . TRACE_FREQUENCY = 5000
[VARIABLE   - 16:08:30] . OTRACE_FREQUENCY = 50000
[VARIABLE   - 16:08:30] . MEMBER_FREQUENCY = 10000
[VARIABLE   - 16:08:30] . AUDIT_VALUES = 'TIMING,ACQUIRE,TRANSFORM,DELIVER,INTERNAL,ALERT,USER'
[VARIABLE   - 16:08:30] . PAGE_SIZE = 16384
[VARIABLE   - 16:08:30] . HASH_TABLE_SIZE = 200000
[PROGRESS   - 16:08:30] Fact Build 'Additional'
[PROGRESS   - 16:08:30] Parsing Reference Definition
[PROGRESS   - 16:08:30] Validating Build Specification
[SQL        - 16:08:30] Preparing Data Source 'DataSource1'

SELECT b.`ORDER_DATE`,
a.`PRODUCT_NUMBER`,
a.`QUANTITY`,
a.`UNIT_COST`,
a.`UNIT_PRICE`,
a.`UNIT_SALE_PRICE`
FROM `ORDER_DETAILS` a, `ORDER_HEADER` b
WHERE a.`ORDER_NUMBER` = b.`ORDER_NUMBER`

[SQL        - 16:08:30] Preparing Data Source 'Dates'

SELECT "YearId",
"YearCaption",
"QuarterId",
"QuarterCaption",
"MonthId",
"MonthCaption",
"WeekId",
"WeekCaption",
"DayId",
"DayCaption"
FROM "Fiscal"

[SQL        - 16:08:30] Preparing Data Source 'ProductLine'

SELECT DISTINCT `PRODUCT_LINE_CODE`,
`PRODUCT_LINE_EN`
FROM `PRODUCT_LINE`

[SQL        - 16:08:30] Preparing Data Source 'ProductType'

SELECT `PRODUCT_TYPE_CODE`,
`PRODUCT_LINE_CODE`,
`PRODUCT_TYPE_EN`
FROM `PRODUCT_TYPE`

[SQL        - 16:08:30] Preparing Data Source 'Product'

SELECT a.`PRODUCT_NUMBER`,
b.`PRODUCT_NAME`,
a.`PRODUCT_TYPE_CODE`
FROM `PRODUCT` a, `PRODUCT_MULTILINGUAL` b
WHERE a.`PRODUCT_NUMBER` = b.`PRODUCT_NUMBER`

[PROGRESS   - 16:08:30] Checking Fact Data existence
[DETAIL     - 16:08:30] Fact Data found
[PROGRESS   - 16:08:30] Processing Reference Data
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Product/AllProduct'
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Product/ProductLine'
[DETAIL     - 16:08:30] Cached 6 members
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Product/ProductType'
[DETAIL     - 16:08:30] Cached 22 members
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Product/Product'
[DETAIL     - 16:08:30] Cached 115 members
[INTERNAL   - 16:08:30] 0.09MB used to cache 'Product' reference data
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Dates/Year'
[DETAIL     - 16:08:30] caching multiple levels
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Dates/Quarter'
[DETAIL     - 16:08:30] Cached 13 members
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Dates/Month'
[DETAIL     - 16:08:30] Cached 37 members
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Dates/Week'
[DETAIL     - 16:08:30] Cached 160 members
[DETAIL     - 16:08:30] Caching Hierarchy/Level 'Dates/Day'
[DETAIL     - 16:08:30] Cached 1096 members
[INTERNAL   - 16:08:30] 0.12MB used to cache 'Dates' reference data
[PROGRESS   - 16:08:30] Processing Dimension Domains
[INTERNAL   - 16:08:30] Dimension 'ORDER_DATE' - max member domain size 2048
[INTERNAL   - 16:08:30] Dimension 'PRODUCT_NUMBER' - max member domain size 256
[PROGRESS   - 16:08:30] Processing Fact Data
[SQL        - 16:08:30] deleting all rows from table

DELETE FROM `F_Additional`

[DETAIL     - 16:08:30] 5000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:30] 10000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:30] 15000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:30] 20000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:30] 25000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:30] 30000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:30] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:31] 35000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:31] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:31] 40000 input (so far), 0 accepted, 0 delivered
[INTERNAL   - 16:08:31] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[DETAIL     - 16:08:31] 43063 input (in all), 0 accepted, 0 delivered
[INTERNAL   - 16:08:31] Mem(M): 0.2 [Peak=0.2] (Ref=0.2 Domain=0.0 Pool=0.0)
[PROGRESS   - 16:08:31] Delivery 'F_Additional': no rows delivered to 'F_Additional'
[INTERNAL   - 16:08:31] Dimension 'ORDER_DATE' - actual domain size 0 (peak 0)
[INTERNAL   - 16:08:31] Dimension 'PRODUCT_NUMBER' - actual domain size 0 (peak 0)
[DETAIL     - 16:08:31] Data Stream row limits: sample rate 1, max rows 0
[DETAIL     - 16:08:31] Data Source 'DataSource1': 43063 physical (43063 logical) rows read, 43063 used
[PROGRESS   - 16:08:31] Acquisition: 0 accepted, 43063 rejected
[PROGRESS   - 16:08:31] Delivery   : 0 direct, 0 summary/merge, 0 total
[SQL        - 16:08:31] deleting all rows from table

DELETE FROM `D_Additional_Product`

[PROGRESS   - 16:08:31] Table 'D_Additional_Product': no changes
[SQL        - 16:08:31] deleting all rows from table

DELETE FROM `D_Additional_Dates`

[PROGRESS   - 16:08:31] Table 'D_Additional_Dates': no changes
[PROGRESS   - 16:08:31] Done - 0 00:00:02 elapsed
databuild -- completed (20-Aug-2012 16:08:31)

Press Enter to continue

larus

It seems the reason is that the format of the date field from the time dimension does not match the format of the ORDER_DATE field.

Date field from the time dimension:

2004-01-01 00:00:00

ORDER_DATE

2004-01-01 00:00:00 +0:00

What is this +0:00 in this field? ORDER_DATE is a derived field using a computation like this:

ToDate(ToChar( ORDER_DATE, 'yyyy-mm-dd' ))
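
To picture why the log reports 0 accepted out of 43063 input rows, here is a minimal sketch in plain Python (an analogy only, not Data Manager code; the sample values are made up) of the mismatch I suspect: the derived key carries a trailing offset that the cached Dates members do not, so the lookup never matches.

# Member keys as cached from the Dates lookup (illustrative values)
cached_dates = {"2004-01-01 00:00:00", "2004-01-02 00:00:00"}

# Derived ORDER_DATE value as it arrives in the fact data
order_date = "2004-01-01 00:00:00 +0:00"

print(order_date in cached_dates)                          # False -> row rejected
print(order_date.removesuffix(" +0:00") in cached_dates)   # True once the offset is stripped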


eknight

I would agree with your diagnosis. What type of database are you using?

The +0:00 looks like a timezone offset, which I imagine would require a slightly wider column than the base timestamp datatype.
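
As an illustration of what I mean, here is how the two kinds of values differ, using Python's datetime purely as an analogy (this is not Data Manager's internal representation): a timezone-aware timestamp carries an extra offset suffix that a plain timestamp does not.

from datetime import datetime, timezone

naive = datetime(2004, 1, 1)                        # plain timestamp
aware = datetime(2004, 1, 1, tzinfo=timezone.utc)   # timestamp with a zone offset

print(naive)   # 2004-01-01 00:00:00
print(aware)   # 2004-01-01 00:00:00+00:00  <- the extra offset suffix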

larus

Thanks for the reply. The database is MS Access, since these are the Data Manager sample databases (DM_Marts is the destination database). Is there a way to remove this offset from the data?

eknight

Where are you calling the ToDate() function? Is there maybe a ToTimestamp() option or something similar?

larus

It is in the fact build: ORDER_DATE from the data source is used to create a derived column OrderDate using the function I provided. This is mapped into a DataStream as ORDER_DATE.

In the Transformation Model, the Dates dimension (using the Time hierarchy) is used. This is exactly the example from the 'Additional' fact build in the DS_Lessons catalog (Data Manager samples).

MFGF

What data type is the original ORDER_DATE column in your source data?

If you look at the help for the ToDate() function, this is what it says:

ToDate( string|integer [, format] )
Either converts a string (of optional 'date format') to a date or 'date with time zone'; or an integer, as a Julian date, to a date. If no format is specified, 'syyyy-mm-dd hh:mi:ss.fffffffff stzh:tzm' is assumed.

You are not specifying a format for your ToDate() function (but you are for ToChar()), so your resulting date has a timezone added - as per the help.

If your ORDER_DATE column is already a date, you do not need to convert it at all - just map it straight through. If it isn't a date, then try this expression:

ToDate(ToChar( ORDER_DATE, 'yyyy-mm-dd' ),'yyyy-mm-dd hh:mi:ss');
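
As a rough analogy in plain Python (illustration only - the formats and functions below are Python's, not Data Manager's), the round trip amounts to formatting the value as text and then parsing it back with an explicit format, so no timezone offset is carried along:

from datetime import datetime, timezone

source_value = datetime(2004, 1, 1, tzinfo=timezone.utc)   # ORDER_DATE with its offset
as_text = source_value.strftime("%Y-%m-%d")                 # like ToChar(ORDER_DATE, 'yyyy-mm-dd')
order_date = datetime.strptime(as_text, "%Y-%m-%d")         # like ToDate(..., explicit format)

print(order_date)   # 2004-01-01 00:00:00 - a plain timestamp that matches the Dates keys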

Cheers!

MF.
Meep!