Anonymous
Not applicable

Jobs getting stuck while loading large data tables to target

Hi Experts,

 

I am using Talend Open Studio for Data Integration, version 6.3.1 (the open-source edition). I am trying to load data from a Postgres table with 10 million records into a Postgres table in my target database. When I trigger the job, I can see that it has started, but it never loads any data into my target table and it never fails either. Even if I try to load the same data into a flat file, the job does not get going.

Can someone please help?

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

@snishtala I made some changes to my job: I set the cursor to 100K, the commit level to 100K, and the batch load to 100K, checked the "Trim all spaces" option, set the tMap to ignore trailing zeros while loading, and finally fetched the data year-wise. I am now able to insert 5M records in 20 minutes using the open-source edition together with a Python script on a Linux server.
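For anyone wanting to reproduce this outside of Talend, below is a minimal sketch of the kind of Python script referred to above, assuming psycopg2 and hypothetical table, column, connection, and year values (source_table, target_table, created_at, 2010-2020); the original script is not posted here, so this only illustrates the combination of a server-side cursor, batched commits, and year-wise fetching.

```python
# Minimal sketch (not the original poster's script): copy rows from a source
# Postgres table to a target table year by year, using a server-side cursor
# (roughly Talend's "Use cursor" option) and batched inserts with a commit
# per batch (roughly the batch size / commit interval settings).
# Table, column, connection, and year values are hypothetical.
import psycopg2
from psycopg2.extras import execute_values

BATCH = 100_000  # mirrors the 100K cursor/batch/commit settings above

src = psycopg2.connect("dbname=source_db user=etl host=src-host")
tgt = psycopg2.connect("dbname=target_db user=etl host=tgt-host")

for year in range(2010, 2021):  # "fetch year-wise data"
    # A named cursor is a server-side cursor, so the full table is never
    # pulled into memory at once.
    with src.cursor(name=f"copy_{year}") as read_cur:
        read_cur.execute(
            "SELECT id, name, amount, created_at FROM source_table "
            "WHERE EXTRACT(YEAR FROM created_at) = %s",
            (year,),
        )
        with tgt.cursor() as write_cur:
            while True:
                rows = read_cur.fetchmany(BATCH)
                if not rows:
                    break
                # Multi-row INSERTs instead of one statement per row.
                execute_values(
                    write_cur,
                    "INSERT INTO target_table (id, name, amount, created_at) VALUES %s",
                    rows,
                    page_size=10_000,
                )
                tgt.commit()  # commit per batch instead of one huge transaction

src.close()
tgt.close()
```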


15 Replies
Anonymous
Not applicable
Author

Hello @suvi 

 

Can you post a screenshot of the output component's configuration window, or of the job itself?

Anonymous
Not applicable
Author

Hi Snishtala,

 

Thanks for the response. I have attached a screenshot of my job. There is nothing complicated in it; it just dumps data from a table into a flat file.


Talend issue.png
Anonymous
Not applicable
Author

@suvi - What is the error on the input component?

Is it related to the query given in the component? Can you try putting a LIMIT on your query to see whether it loads any data into the file at all?
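As a side note, a quick way to sanity-check the source query outside of Talend is something along these lines (hypothetical connection and table names, using psycopg2); if the LIMIT-ed query comes back quickly, the problem is more likely volume or commit handling than connectivity or the query itself:

```python
# Quick sanity check (hypothetical table/connection details): confirm that
# the source query returns rows at all when capped with LIMIT.
import psycopg2

conn = psycopg2.connect("dbname=source_db user=etl host=src-host")
with conn.cursor() as cur:
    cur.execute("SELECT * FROM source_table LIMIT 1000")
    print(len(cur.fetchall()), "rows fetched")
conn.close()
```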

Anonymous
Not applicable
Author

@snishtala Yes, after applying the filter at the SQL query level I am able to insert data into the table as well as the file, but when I try to insert the entire data set in one flow, nothing happens.

Is there something that can be done to achieve it in one go? I tried increasing the batch size and also the commit limit, but I am still facing the issue.

The reason for wanting it done in one go is that some of the tables have around 60M records, and if I split them by applying filters, I will have to create multiple flows.

Anonymous
Not applicable
Author

@suvi - Okay, in that case what batch size are you using in the output component?

Also, are you using a cursor in the DB input component's advanced settings?

Anonymous
Not applicable
Author

@snishtala I tried different batch sizes (1000, 10000, 100000) and the same values for the commit size.

No, I have not tried the cursor in the input component.

Anonymous
Not applicable
Author

@suvi - Okay, rather than the batch, try using the extended insert in the input component (set it starting from 1000) and leave the commit at 10000. Also try setting the cursor value to the same value you used for the extended insert.
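To make the terminology concrete, here is a small sketch (in Python with psycopg2, with hypothetical table and column names, not Talend's actual generated code) of what an extended insert boils down to: many rows packed into one multi-row INSERT statement, with a commit only after every 10,000 rows rather than per statement.

```python
# Illustration only (hypothetical table/column names): an "extended insert"
# packs many rows into a single multi-row INSERT instead of sending one
# INSERT per row, and the transaction is committed every COMMIT_EVERY rows.
import psycopg2
from psycopg2.extras import execute_values

conn = psycopg2.connect("dbname=target_db user=etl host=tgt-host")
cur = conn.cursor()

rows = [(i, f"name_{i}") for i in range(50_000)]  # stand-in for the incoming flow

EXTENDED_INSERT_SIZE = 1000   # rows per INSERT statement
COMMIT_EVERY = 10_000         # rows per transaction

pending = 0
for start in range(0, len(rows), EXTENDED_INSERT_SIZE):
    chunk = rows[start:start + EXTENDED_INSERT_SIZE]
    # Produces one statement like:
    # INSERT INTO target_table (id, name) VALUES (...), (...), ...
    execute_values(
        cur,
        "INSERT INTO target_table (id, name) VALUES %s",
        chunk,
        page_size=EXTENDED_INSERT_SIZE,
    )
    pending += len(chunk)
    if pending >= COMMIT_EVERY:
        conn.commit()
        pending = 0

conn.commit()  # flush whatever is left
cur.close()
conn.close()
```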

Anonymous
Not applicable
Author

@snishtala I tried to look for the "Extended Input" option, but I am not able to locate it. Could you please let me know where it is or how to set it?

I am pretty new to the Talend tool.

Anonymous
Not applicable
Author

@snishtala Sorry, typo: I meant "Extended Insert".