If a DataStage job has aborted after, say, 1000 records, how can the load be continued from the 1000th record after fixing the error?


Dharmendra

  • Aug 25th, 2006
 

After fixing the error, we can run the job again by looking up the target table through a Hashed File stage.


sachin

  • Nov 21st, 2006
 

Hi,

Could anyone explain this process exactly? As far as I know, we can only set a limit so that the job stops or aborts after loading a particular number of records.


baglasumit21

  • May 17th, 2007
 

It is not possible to start loading directly at the 1000th row, but there are ways to achieve the same result:
1) If the target table has a current-timestamp column, we can delete the rows having the max timestamp and reload all the rows from the start.
2) We can perform a lookup of the source table against the target table to check whether each row already exists in the target, and insert only the non-existing rows.
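Approach 2 can be sketched with Python's built-in sqlite3 module. The table and column names (`emp_src`, `emp_tgt`, `emp_id`) are hypothetical; in DataStage the same idea is a lookup against the target followed by an insert of the misses only:

```python
import sqlite3

# Source table with 10 rows (hypothetical names, for illustration only).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE emp_src (emp_id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO emp_src VALUES (?, ?)",
                [(i, f"name{i}") for i in range(1, 11)])

# Target table simulating an aborted load: only the first 6 rows made it in.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE emp_tgt (emp_id INTEGER PRIMARY KEY, name TEXT)")
tgt.executemany("INSERT INTO emp_tgt VALUES (?, ?)",
                [(i, f"name{i}") for i in range(1, 7)])

# Build the lookup set of keys already present in the target ...
loaded = {row[0] for row in tgt.execute("SELECT emp_id FROM emp_tgt")}

# ... and insert only the rows whose key is not in the lookup.
missing = [r for r in src.execute("SELECT emp_id, name FROM emp_src")
           if r[0] not in loaded]
tgt.executemany("INSERT INTO emp_tgt VALUES (?, ?)", missing)

count = tgt.execute("SELECT COUNT(*) FROM emp_tgt").fetchone()[0]
print(count)  # all 10 source rows are now present, none duplicated
```

Rerunning the whole load this way is idempotent: rows already in the target are skipped, so it does not matter where the previous run aborted.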


Pranav

  • Jun 21st, 2007
 

I am not sure how you have designed the jobs. Assuming the following job/table details:

Table: EMP_TAB
Job 1: Hash file extraction from EMP_TAB (with key details, e.g. month date if it is a monthly load)
Job 2: Cleanse/Transform (apply all business logic)
Job 3: Load job with a lookup on the hash file from Job 1 (into EMP_TAB)

Description:
The extract job should be executed first in the sequence. For all the jobs except the first hash-creation job, check the option "Do not checkpoint run".

Now suppose your source had 1000 rows and the job failed on the 200th row. On simply restarting the sequence, the hash file would already contain all the rows inserted by the earlier run, and earlier jobs like Job 2 would not run unnecessarily. The load job would ensure that where a match is found, no action is performed.
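The restart pattern described above can be sketched in plain Python, with the hash file modeled as a dict keyed on EMP_TAB's key column and rebuilt from the target at the start of every sequence run (all names here are illustrative, not DataStage internals):

```python
def load_with_hash_lookup(source_rows, target, hash_file):
    """Insert only rows whose key is absent from the hash-file lookup."""
    for key, value in source_rows:
        if key in hash_file:      # match found -> no action, already loaded
            continue
        target[key] = value       # insert into the target table
        hash_file[key] = value    # keep the lookup in step with the target

target = {}
rows = [(i, f"rec{i}") for i in range(1, 6)]

# First run aborts after only 2 of the 5 rows have been loaded.
load_with_hash_lookup(rows[:2], target, dict(target))

# Restarted run: the extract job rebuilds the hash file from the target,
# then the load job skips the 2 rows already present and loads the rest.
load_with_hash_lookup(rows, target, dict(target))
print(sorted(target))  # [1, 2, 3, 4, 5]
```

The key point is that the lookup is rebuilt from the target on each restart, so the load job never needs to know at which row the previous run failed.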

Thanks
Pranav

It is possible to restart the job after the aborted rows.
You have to select the "checkpoint" property in the sequence.
There you have to give the record number at which the job aborted.

This applies only to DataStage PX (parallel) jobs.



Hi, if your job uses an OCI stage as the target stage, then you can do it. Just specify the number of records per transaction in the Transaction Handling tab of the OCI stage. So if the job aborts after 1000 records, say at record 1220, then 1000 records will still be committed in the target table.
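The effect of a rows-per-transaction setting can be sketched with sqlite3: commit every N rows, so that after an abort only whole batches survive in the target. The commit interval and table name are illustrative, not actual OCI stage internals:

```python
import sqlite3

COMMIT_EVERY = 1000  # hypothetical rows-per-transaction setting

conn = sqlite3.connect(":memory:")
conn.isolation_level = None            # manage transactions explicitly
conn.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY)")

conn.execute("BEGIN")
aborted_at = None
try:
    for i in range(1, 1221):           # the job "aborts" at record 1220
        conn.execute("INSERT INTO tgt VALUES (?)", (i,))
        if i % COMMIT_EVERY == 0:
            conn.execute("COMMIT")     # this batch of 1000 is now durable
            conn.execute("BEGIN")
        if i == 1220:
            raise RuntimeError("simulated abort")
except RuntimeError:
    conn.execute("ROLLBACK")           # rows 1001-1220 are discarded
    aborted_at = 1220

count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print(count)  # 1000 rows survive; the reload can resume from row 1001
```

This shows why the answer says exactly 1000 records remain: the in-flight batch (rows 1001-1220) is rolled back with the abort, leaving a clean boundary to restart from.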
