
DB2 Load Delivery

Started by eknight, 17 Aug 2012 04:48:23 AM


eknight

Hello,

Our system has grown into some nice performance problems lately, so we've started looking at using the DB2 LOAD delivery option at specific stages. After adding the new delivery method and a filter option to switch between the two, we tried out the DB2 LOAD, and it keeps crashing.

The problem is not in the LOAD itself but in the EXPORT step, which usually fails around 1.2 to 1.3 million rows with this message:

DS-FILE-E202: Error writing to data file '/xxxxxxx/xxxxxxxx/data/DB2LOAD_029a0044_349_4.dat'

There is definitely enough space on the disk, and it can't be an authorization problem, because a partial .dat file does get written to that location.
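
For what it's worth, these are the quick checks I ran before ruling those out (the path is just a placeholder for our anonymized data directory):

    # Free space on the filesystem holding the .dat file (AIX, GB blocks)
    df -g /data
    # Per-process file size limit -- a finite fsize ulimit can abort a
    # write at a fixed size even when the disk has plenty of room
    ulimit -f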

So here are my questions:

1) I've checked the option for "Dump log file", but I can't seem to find the log anywhere. Any idea where it's supposed to end up?

2) I noticed the export is of type DEL, which creates a .dat file. Is there any way I can force it to use IXF instead?

Thanks,

MFGF

My guess is that the delivery creates a file on disk containing the data to load, then fires up the bulk loader to suck in this file. I'd venture that the build is running on a 32-bit OS and the size of the file is exceeding 2 GB, which a 32-bit Windows OS will not like at all!

Try splitting the delivery into two separate deliveries in the build, each with an output filter that limits it to about half of the rows. Mark the first delivery as Exclusive for efficiency. Let us know how you get on!
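
If there's a numeric key to hand, the two output filters can be as crude as something like this (the column name is invented, and do check the exact expression syntax your version expects):

    -- output filter on delivery 1
    MOD(ORDER_KEY, 2) = 0
    -- output filter on delivery 2
    MOD(ORDER_KEY, 2) = 1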

Cheers!

MF.


Meep!

eknight

Hi MF,

Thanks for the response. The system is a 64-bit AIX machine that houses all the databases and also runs the DS jobs. File sizes > 2 GB are used regularly. In fact, my current workaround is to simulate the DS delivery with a ksh script, a 40 GB export file, and a direct load into the database. This is sub-optimal, however, because the SQL is now duplicated: once in DS and once in the scripts. I would rather have everything in DS, but for that I need a way to load data faster than APPEND mode.
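
For the curious, the script boils down to something like the following (database, table, and file names here are placeholders, not our real job):

    #!/usr/bin/ksh
    # Export the source rows to a flat file, then bulk load them,
    # bypassing the slow row-by-row APPEND path.
    db2 connect to MYDWH
    db2 "EXPORT TO /data/stage/fact.del OF DEL SELECT * FROM STAGE.FACT_SRC"
    db2 "LOAD FROM /data/stage/fact.del OF DEL REPLACE INTO DWH.FACT_TGT"
    db2 connect reset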

There should be more detailed logging information available somewhere. Do you have any idea where the log file associated with the "Dump log file" checkbox ends up?

Thanks again,

eknight

eknight
It turns out it was a disk problem. I was saving to a folder which, unbeknownst to me, was mounted on a different disk with much less free space.

It's working now, though, and amazingly fast, although I wish it defaulted to UTF-8 :o

Nope, I take that back. It's not working, due to an encoding problem: DS seems to have trouble with special characters. Does anyone know how I can force it to use IXF? It automatically exports the data in the DEL format (which is terrible). There must be a way to just say OF IXF, right?
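
For comparison, doing this by hand in the DB2 CLP is a one-liner either way, which is exactly what I'd like the delivery to generate (names are placeholders again, and I'm going from memory on the modifier syntax):

    # IXF carries column types and code page information with the data,
    # so special characters survive the round trip
    db2 "EXPORT TO /data/stage/fact.ixf OF IXF SELECT * FROM STAGE.FACT_SRC"
    # or stay with DEL but force the file to UTF-8 (code page 1208)
    db2 "EXPORT TO /data/stage/fact.del OF DEL MODIFIED BY CODEPAGE=1208 SELECT * FROM STAGE.FACT_SRC"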

Thanks,