Anonymous

Jobs getting stuck while loading large data tables to target

Hi Experts,

 

I am using Talend Open Studio for Data Integration, Version 6.3.1 (Open Source edition). I am trying to load data from a PostgreSQL table (10 million records) into a PostgreSQL table in my target database. When I trigger the job, it starts but never loads any data into the target table, and it does not fail either. Even if I try to load the same data into a flat file, the job does not start.

Can someone please help?
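For context on why a job like this can look stuck: if the entire 10M-row result set is fetched client-side before anything is written, the job spends a long time (or exhausts memory) before the first row ever reaches the target. A minimal sketch of chunked fetching follows; the `fetch_in_chunks` helper and the list-backed stand-in cursor are illustrative, not Talend internals (with a real driver such as psycopg2, a named cursor would keep the result set server-side):

```python
def fetch_in_chunks(cur, chunk_size=100_000):
    """Yield rows from an executed DB cursor in fixed-size chunks,
    so the client never holds all 10M rows in memory at once."""
    while True:
        rows = cur.fetchmany(chunk_size)
        if not rows:
            return
        yield rows

# Stand-in cursor for a quick self-check. A real one would come from the
# driver, e.g. psycopg2's conn.cursor(name="stream") for a server-side cursor.
class _ListCursor:
    def __init__(self, rows):
        self._rows = list(rows)

    def fetchmany(self, n):
        out, self._rows = self._rows[:n], self._rows[n:]
        return out

batches = list(fetch_in_chunks(_ListCursor(range(7)), chunk_size=3))
```

With a chunked fetch, rows start flowing to the target almost immediately instead of after a long silent buffering phase, which is the symptom described above.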

15 Replies
Anonymous

@suvi - It is in the advanced settings of the output component.

Any luck using the cursor option?

Anonymous

@snishtala I am not seeing that option in the Advanced settings. I am using the Open Source 6.3 version.

I also do not see any progress with the cursor option.


[Screenshot: Advance option for Target.png]
Anonymous

@suvi - My bad, I totally forgot that it's PostgreSQL; you won't see that option. You are right.

Anonymous

@snishtala I made some changes to my job: set the cursor to 100K, the commit level to 100K, and the batch size to 100K; enabled the "Trim all spaces" option; set the tMap to ignore trailing zeros while loading; and finally fetched the data year by year. I am now able to insert 5M records in 20 minutes using the Open Source edition and a Python script on a Linux server.
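The batching settings above can be sketched in plain Python. This is only an illustration of the pattern (fixed-size batches, one commit per batch), not Talend's actual implementation; the `write` and `commit` callables stand in for whatever your target driver provides (e.g. psycopg2's `execute_values` and `connection.commit`):

```python
def load_in_batches(rows, write, commit, batch_size=100_000):
    """Insert rows in fixed-size batches, committing once per batch,
    mirroring the 100K cursor/commit/batch settings described above."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            write(batch)
            commit()
            batch = []
    if batch:  # flush the final, possibly short batch
        write(batch)
        commit()

# Quick self-check with list-backed stand-ins for the DB driver.
written, commits = [], []
load_in_batches(range(10), written.append, lambda: commits.append(1), batch_size=4)
```

Committing per batch rather than once at the end keeps the driver's memory and the open transaction bounded, which is why the job stops appearing "stuck" while still making steady progress.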

Anonymous

@suvi - glad you found a solution.