Anonymous
Not applicable

Dynamic data loading from a single Talend job

Hi,
I have the following need: I have 200+ source tables and corresponding target tables (one-to-one mapping). On a daily basis, I need to read each source table and update/insert the target. The source is a custom database (reachable via ODBC) and the target is MSSQL.
One way to achieve this is to create 200 individual jobs, each reading from its own source table and loading its specific target table.
The other way I am considering is to write a single job that loads any table from the source, and run that job 200 times for the 200 different tables. For this option, the source and target schemas need to be dynamic: one table may contain 5 columns, another 100, and with different data types.
Can we do something like this with a single Talend job? If so, any suggestions or examples from past scenarios?
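The generic-job idea above can be sketched outside Talend as well. This is a minimal illustration of the pattern, assuming a DB-API style connection on both sides; sqlite3 stands in for the ODBC source and the MSSQL target, and the upsert syntax would be a MERGE statement on MSSQL. The point is that the column list is discovered at run time, so the same code handles a 5-column table and a 100-column table alike.

```python
import sqlite3

def sync_table(src, dst, table, key):
    """Copy all rows of `table` from src to dst, upserting on `key`.

    The schema is discovered dynamically from the source cursor, so no
    per-table job is needed; the same function serves every table.
    """
    cur = src.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]   # column names discovered at run time
    col_list = ", ".join(cols)
    placeholders = ", ".join("?" for _ in cols)
    # Create the target on first run (MSSQL would normally pre-exist).
    dst.execute(
        f"CREATE TABLE IF NOT EXISTS {table} ({col_list}, PRIMARY KEY ({key}))"
    )
    # SQLite upsert; on MSSQL this would be a MERGE statement instead.
    updates = ", ".join(f"{c}=excluded.{c}" for c in cols if c != key)
    sql = (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON CONFLICT({key}) DO UPDATE SET {updates}"
    )
    dst.executemany(sql, cur.fetchall())
    dst.commit()
```

In Talend itself the equivalent mechanism is the Dynamic schema type (a single "Dynamic" column on the input/output components), combined with context variables for the table name and a loop that iterates over the 200 table names.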
Thx,
Siva
11 Replies
_AnonymousUser
Specialist III

Hi Sabrina,
I am badly stuck. Is there a component that retrieves information such as which tables are used in a job?
For example:
I have three table input components in a job; those three components are joined with a tMap, and finally the results are written to a table.
In this process I need to dynamically capture all the tables used in the job. Is that possible in Talend Big Data 6.0?
Thanks,
Blessing
Anonymous
Not applicable
Author

Hello,
"I have three table input components in a job; those three components are joined with a tMap, and finally the results are written to a table.
In this process I need to dynamically capture all the tables used in the job. Is that possible in Talend Big Data 6.0?"

Could you please elaborate on your case with an example of input and expected output values?
Best regards
Sabrina