Single Context for Multiple Jobs (that aren't subjobs)

I know that I can create a single context for a parent job and pass that to different subjobs. However, I'm wondering if I can have a separate job that loads the context (e.g. with tContextLoad) and then have several other jobs that *aren't* subjobs reference that context.

 

The motivation is that I have several ETL jobs that can be run independently or in sequence (as needed). Each job can be run against an arbitrary customer file. In practice, each customer file will need most of the jobs, but probably not all of them, and perhaps not in the same order; for this reason, it won't work to create a master job with each ETL job as a subjob, because I need to be able to schedule the ETL jobs independently.

 

For example, say I have files for customer1 and customer2. I also have 5 ETL jobs labelled etl1, etl2, etc. The file for customer1 needs jobs 1-3 and job 5, but not job 4; the file for customer2 usually needs all 5 jobs, but they just sent a revised file that only needs job 2. I'd like to have a single job at the start of the ETL workflow that loads a context containing the filename, folders, etc. for each customer, so that each ETL job can be used for each customer. Is that straightforward/possible to do, perhaps using globalMap or tJavaFlex? I'm using the licensed version of Data Integration, if that makes a difference.
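To illustrate the kind of setup I mean, here is a sketch (the names are only illustrative, not from any existing job): every ETL job starts with the same small flow, a tFileInputDelimited reading a per-customer key/value file into tContextLoad, so the job's context variables are populated at runtime. A tJava right after that flow could then use the loaded values:

```java
// Hypothetical tJava snippet placed after a tFileInputDelimited --> tContextLoad
// flow at the start of each ETL job (etl1 ... etl5). It assumes each job declares
// context variables named customerFile, inputFolder and outputFolder (empty
// defaults) and that tContextLoad has just filled them from a shared
// per-customer file such as /contexts/customer1.csv -- all names illustrative.
// "context" and "globalMap" are provided by Talend's generated job code.
System.out.println("Customer file : " + context.customerFile);
System.out.println("Input folder  : " + context.inputFolder);
System.out.println("Output folder : " + context.outputFolder);

// If a value also needs to be visible through globalMap (as mentioned above),
// it can be copied across explicitly:
globalMap.put("customerFile", context.customerFile);
```

Because every job would load the same file at startup, etl2 could be run on its own against customer2's revised file simply by pointing it at that customer's context file.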

 

Thanks,

 

David

1 Reply

Hello,

Context variables centrally stored in the Repository can be reused across various Jobs. 

Please refer to this user guide article: TalendHelpCenter:How to centralize context variables in the Repository.
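For example, a minimal sketch assuming a Repository context group (here called "CustomerEnv", an illustrative name) with variables customerFile and workFolder has been imported into each of the independent ETL jobs; every job then resolves the same variable names, and built jobs can be given per-customer values when they are launched:

```java
// Minimal sketch inside a tJava of any one of the independent ETL jobs,
// assuming the Repository context group "CustomerEnv" (illustrative name)
// defines customerFile and workFolder and has been imported into this job.
// "context" is provided by Talend's generated job code.
String customerFile = context.customerFile;
String workFolder = context.workFolder;

System.out.println("Running against " + customerFile + " under " + workFolder);

// When the job is built and scheduled, the shared values can be overridden
// per customer on the command line, e.g. (job name and path are illustrative):
//   etl2_run.sh --context=Default --context_param customerFile=/data/customer2_revised.csv
```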

Let us know if this is what you are looking for.

Best regards

Sabrina