Hi,
I have a job in which I use the same table around 60 times, as shown in the attachment. Is there a way to replicate the table without repeating the same select 60 times, and thereby improve the job's performance?
Thank you in advance!!
Hi @Emanuele89 ,
Did you try the tReplicate component for your use case? With it you only have to read the data from the database once, and the resulting flow is replicated as many times as you want. See the screenshot below.
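Conceptually, tReplicate works like the plain-Java sketch below (this is illustrative code, not Talend's generated code; the `Replicator` class and its names are made up): each row from the single database read is pushed to every registered downstream flow.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch of the tReplicate idea: read the source once,
// forward every row to all registered downstream flows.
public class Replicator<T> {
    private final List<Consumer<T>> outputs = new ArrayList<>();

    public void addOutput(Consumer<T> output) {
        outputs.add(output);
    }

    // Push one row from the single database read to all outputs.
    public void push(T row) {
        for (Consumer<T> output : outputs) {
            output.accept(row);
        }
    }

    public static void main(String[] args) {
        Replicator<String> rep = new Replicator<>();
        List<String> flowA = new ArrayList<>();
        List<String> flowB = new ArrayList<>();
        rep.addOutput(flowA::add);
        rep.addOutput(flowB::add);
        // Simulate a single pass over the source table.
        for (String row : List.of("r1", "r2", "r3")) {
            rep.push(row);
        }
        // Both downstream flows received all three rows.
        System.out.println(flowA.size() + " " + flowB.size()); // prints "3 3"
    }
}
```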
Regards,
Pratheek Manjunath
If you have to reuse the same dataset, have a look at tHashOutput / tHashInput.
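The pattern behind tHashOutput / tHashInput can be sketched in plain Java (again illustrative, not Talend's actual implementation; `HashCache` and its method names are made up): one subjob writes the rows into an in-memory cache, and later subjobs read the cached copy by name instead of re-querying the database.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of the tHashOutput / tHashInput pattern:
// cache a dataset in memory once, then read it back any number of times.
public class HashCache {
    private static final Map<String, List<String>> CACHE = new HashMap<>();

    // tHashOutput: store the rows once, under a cache name.
    public static void write(String name, List<String> rows) {
        CACHE.put(name, new ArrayList<>(rows));
    }

    // tHashInput: any number of later reads reuse the cached copy.
    public static List<String> read(String name) {
        return CACHE.getOrDefault(name, List.of());
    }

    public static void main(String[] args) {
        write("customers", List.of("alice", "bob"));
        // Two separate consumers, zero extra database reads.
        System.out.println(read("customers"));
        System.out.println(read("customers"));
    }
}
```

Note this trades memory for speed: the whole dataset must fit in the job's heap.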
I already tried tHashOutput/tHashInput, but nothing changed.
I already tried it and it works fast. But if I have too many components, I get a Java error.
I can't connect the same tReplicate to 60 components, so I need to create several tReplicate components and connect each of them.
Hi,
It is not advisable to extract the data for 60 tables in a single job. You should use 5 to 10 tables at most per job (not a hard and fast rule, but a common recommendation). You can orchestrate these jobs in parallel, and if you want to combine the datasets later, you can do it downstream in another job using the tUnite component.
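The orchestration described above can be sketched in plain Java (illustrative only; `extractTable` stands in for one Talend child job, and the class name is made up): several smaller extraction jobs run in parallel, and their outputs are then merged, which is what tUnite does.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative sketch: run smaller extraction "jobs" in parallel,
// then combine their outputs (the tUnite step).
public class ParallelExtract {
    // Stand-in for one Talend child job extracting one table.
    static List<String> extractTable(String table) {
        return List.of(table + ":row1", table + ":row2");
    }

    public static void main(String[] args) throws Exception {
        List<String> tables = List.of("t1", "t2", "t3");
        ExecutorService pool = Executors.newFixedThreadPool(3);
        List<Future<List<String>>> futures = new ArrayList<>();
        for (String t : tables) {
            futures.add(pool.submit(() -> extractTable(t)));
        }
        // tUnite-style merge of the parallel results.
        List<String> combined = new ArrayList<>();
        for (Future<List<String>> f : futures) {
            combined.addAll(f.get());
        }
        pool.shutdown();
        System.out.println(combined.size()); // prints "6"
    }
}
```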
Yes, this means creating more jobs instead of a single job for the same process, but job readability and maintainability are equally important.
Warm Regards,
Nikhil Thampi
Hi,
I agree with @nthampi: there's likely something wrong with your job structure if you need to do the same thing 60 times.
Could you not feed the 60 destination table names in via a flow, use a tFlowToIterate, and then perform the DB output 60 times with a much smaller set of components?
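The tFlowToIterate suggestion amounts to the loop sketched below (illustrative plain Java, not Talend code; `writeTo` stands in for the single DB-output subjob): instead of 60 copies of the same subjob, one subjob runs once per destination table name.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the tFlowToIterate idea: loop over the
// destination table names, running one shared output step per name.
public class IterateTables {
    // Stand-in for the single DB-output subjob; returns a log entry
    // so the effect is visible.
    static String writeTo(String table) {
        return "wrote to " + table;
    }

    // tFlowToIterate-style loop: one pass per destination table.
    static List<String> run(List<String> destinations) {
        List<String> log = new ArrayList<>();
        for (String table : destinations) {
            log.add(writeTo(table));
        }
        return log;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("dest01", "dest02", "dest03")));
    }
}
```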