If you are dealing with a huge volume of data, I would suggest separating extraction and loading into two different jobs.
The commercial version offers a simple way to build generic jobs using the "Dynamic" datatype.
On TOS (Talend Open Studio), I think we can find a more complex way to build something similar.
You can use context variables to handle some of the configuration.
That is what we did in a past case where we were importing data from 150 sources, with around 50 different schemas and a total of about 1 billion records per day.
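To make the context idea concrete: Talend jobs compile down to Java, and a context group behaves roughly like a key/value map whose values change per environment while the job logic stays the same. The sketch below is illustrative only; the keys, hostnames, and `loadContext` helper are invented for the example and are not Talend APIs.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of how context variables let one generic job serve
// many environments: same keys, different values per context group.
// All names here (db_host, source_dir, the environments) are made up.
public class ContextDemo {
    static Map<String, String> loadContext(String env) {
        Map<String, String> ctx = new HashMap<>();
        if (env.equals("prod")) {
            ctx.put("db_host", "prod-db.example.com");
            ctx.put("source_dir", "/data/prod/incoming");
        } else {
            ctx.put("db_host", "localhost");
            ctx.put("source_dir", "/tmp/incoming");
        }
        return ctx;
    }

    public static void main(String[] args) {
        // The job logic only ever reads the keys; switching the context
        // group (prod vs. dev) retargets every connection at once.
        System.out.println(loadContext("prod").get("db_host"));
        System.out.println(loadContext("dev").get("db_host"));
    }
}
```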
Hi,
Since you mention the count details, I assume the schema for your joblet will remain the same for multiple tables. In this case, you can pass the other details as parameters to both the source and target components (table name, the query's WHERE clause, etc.).
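As a rough illustration of that parameter-passing idea (not Talend's actual API): inside a joblet, the source component's query can be assembled from the table name and WHERE clause handed in as parameters, so one joblet serves every table that shares the schema. The helper below and its method names are hypothetical.

```java
// Hypothetical sketch of a joblet assembling its source query from
// parameters (table name, optional WHERE clause) passed in by the
// calling job. Only the string concatenation is the point here.
public class QueryBuilder {
    public static String buildQuery(String tableName, String whereClause) {
        String sql = "SELECT * FROM " + tableName;
        if (whereClause != null && !whereClause.isEmpty()) {
            sql += " WHERE " + whereClause;
        }
        return sql;
    }

    public static void main(String[] args) {
        // Same joblet, two tables: only the parameters differ.
        System.out.println(buildQuery("customers", "updated_at > '2024-01-01'"));
        System.out.println(buildQuery("orders", ""));
    }
}
```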
A joblet is simply a part of a job that you split out as a separate entity, either because of its complexity or to make it reusable across your jobs. Please create your jobs, and if there are any errors, share the details of the job along with the job flow and other screenshots for further analysis.
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂