thomas_2583
Contributor III

Large Data Transfer

Hi,

I have created a job design to move data from one database into another. This particular job is trying to transfer around 500k rows of data. I have run the job (it took around 5 hours) and it appears to have completed fine; however, the target database (Postgres) only shows around 8k records.

Any idea why this could be, please? I am new to Talend so if there is any more information that is needed, please let me know.

(Screenshots of the job design attached.)

1 Solution

Accepted Solutions
gjeremy1617088143

The "Update or insert" action performs very poorly on a large data set. Since you have checked the inner-join reject option, just use the "Insert" action instead. You could also raise the "Commit every" value to, for example, 10000. That will greatly reduce your execution time.
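To see why "Commit every" matters, here is a minimal sketch of the batching idea, outside of Talend. It uses Python's built-in sqlite3 module as a stand-in for Postgres (the table name `target` and the column names are made up for illustration): rows are inserted one at a time, but the transaction is committed only once per batch rather than once per row.

```python
import sqlite3

def batch_insert(conn, rows, commit_every=10000):
    # Insert rows one by one, but commit only once per batch of
    # `commit_every` rows. Committing per row (or using an
    # update-or-insert check per row) is what makes large
    # transfers crawl.
    cur = conn.cursor()
    count = 0
    for row in rows:
        cur.execute("INSERT INTO target (id, val) VALUES (?, ?)", row)
        count += 1
        if count % commit_every == 0:
            conn.commit()
    conn.commit()  # flush the final partial batch
    return count

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
inserted = batch_insert(conn, ((i, f"v{i}") for i in range(50000)),
                        commit_every=10000)
print(inserted)  # 50000
```

In Talend terms, the "Commit every" field on tDBOutput plays the role of `commit_every` here, and switching the action from "Update or insert" to "Insert" removes the per-row existence check.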


3 Replies
gjeremy1617088143

Hi, could you show the configuration of your tDBOutput (Basic and Advanced settings)?

thomas_2583
Contributor III
Author

I hope this is OK; I have obviously had to remove the connection details.

(Screenshots of the tDBOutput Basic and Advanced settings attached.)
