Hello everyone
I have more than 70 tables in my MySQL database, and I want to extract all of them in the same job.
I used to extract them with tDBInput (one component per table), but since the number of tables keeps growing, I'm looking for a more scalable solution.
Thanks
Which product are you using (Talend Open Source or a paid Talend edition)? Where are you intending to send the data?
I'm using Talend Open Source.
What is the target of the data?
You are going to struggle with this. You may as well just take a database backup and load it onto your target db.
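If both source and target are MySQL, that can be as simple as the standard dump-and-restore tools; a minimal sketch (user and database names below are placeholders):

    mysqldump -u etl -p --single-transaction sourcedb > sourcedb.sql
    mysql -u etl -p targetdb < sourcedb.sql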
Unfortunately, you really cannot ignore schemas with Talend. The Enterprise Edition comes with dynamic schema functionality, which would likely have helped here. But since you do not have that, and you need to keep your column types (you can't arbitrarily dump everything out as Varchar to a flat file), you will have to create a job for each table. However, this shouldn't take too long if you are mapping like to like with no transformation.
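To make that trade-off concrete, here is a rough sketch of what the "loop over every table and dump it as text" route would look like in plain JDBC (e.g. inside a tJava component). The connection details are placeholders, and there is no real CSV escaping; note that every column comes out as a string, which is exactly the type-loss problem described above.

    // Sketch only: loop over all tables in a MySQL schema and dump each to CSV.
    // Assumes MySQL Connector/J on the classpath; connection details are placeholders.
    import java.io.PrintWriter;
    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    public class DumpAllTables {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://localhost:3306/mydb"; // placeholder
            try (Connection conn = DriverManager.getConnection(url, "etl", "secret")) {
                // Read the table list from MySQL's metadata instead of hard-coding 70+ names.
                List<String> tables = new ArrayList<>();
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT table_name FROM information_schema.tables WHERE table_schema = ?")) {
                    ps.setString(1, "mydb");
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) tables.add(rs.getString(1));
                    }
                }
                // Dump each table to its own CSV file; headers come from the result metadata.
                for (String table : tables) {
                    try (Statement st = conn.createStatement();
                         ResultSet rs = st.executeQuery("SELECT * FROM `" + table + "`");
                         PrintWriter out = new PrintWriter(table + ".csv", "UTF-8")) {
                        ResultSetMetaData md = rs.getMetaData();
                        int cols = md.getColumnCount();
                        StringBuilder header = new StringBuilder();
                        for (int i = 1; i <= cols; i++) {
                            if (i > 1) header.append(',');
                            header.append(md.getColumnName(i));
                        }
                        out.println(header);
                        while (rs.next()) {
                            StringBuilder row = new StringBuilder();
                            for (int i = 1; i <= cols; i++) {
                                if (i > 1) row.append(',');
                                String v = rs.getString(i); // every type flattened to a string
                                row.append(v == null ? "" : v); // no CSV escaping; illustration only
                            }
                            out.println(row);
                        }
                    }
                }
            }
        }
    }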
Okay, then I'll go with the traditional solution.
Thanks a lot.
My colleague @groupproductmanagement has written a good article on dynamic ingestion for an Oracle database. If you change it to read from MySQL configuration tables instead of Oracle, you should be able to do dynamic ingestion for your use case too.
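For reference, the main change is usually the metadata query that discovers the table list. Assuming the article drives its loop from Oracle's data dictionary, the MySQL counterpart reads from information_schema (the schema name below is a placeholder):

    -- Oracle: list the tables to ingest
    SELECT table_name FROM user_tables;

    -- MySQL equivalent
    SELECT table_name
    FROM information_schema.tables
    WHERE table_schema = 'mydb';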
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving kudos when they share their time on your query. If your query is answered, please mark the topic as resolved 🙂
@nthampi Thank you very much. I'll try it now and see if it works.
Cool solution, @nthampi, but it still requires the data to be retrieved from the source system before it is loaded into the target. By the time the jobs have been written to export the data from all those tables, you may as well simply push it to the target DB.