Anonymous
Not applicable

Job completed with exit code=1 with less data in DB.

Hi Team,

 

I am loading 90 million rows from DB2 to PostgreSQL with a simple table-to-table mapping. On the PostgreSQL target, batch loading is active with a batch size of 100,000 (1 lakh), the commit interval is the same 100,000, and the action on data is Insert only.
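For concreteness, here is roughly what those target settings correspond to at the JDBC level, as a minimal sketch only: the connection details, table, and column names below are placeholders, not the actual Talend-generated code.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URLs/credentials; DB2 and PostgreSQL JDBC drivers must be on the classpath.
            try (Connection src = DriverManager.getConnection("jdbc:db2://db2host:50000/SRCDB", "user", "pass");
                 Connection tgt = DriverManager.getConnection("jdbc:postgresql://pghost:5432/TGTDB", "user", "pass")) {
                tgt.setAutoCommit(false);          // commit manually at the commit interval
                final int BATCH_SIZE = 100_000;    // "1 Lk" batch size
                final int COMMIT_EVERY = 100_000;  // "1 Lk" commit interval
                long rows = 0;
                try (Statement read = src.createStatement()) {
                    read.setFetchSize(10_000);     // stream source rows instead of buffering 90M in memory
                    try (ResultSet rs = read.executeQuery("SELECT id, payload FROM SRC_TABLE");
                         PreparedStatement ins = tgt.prepareStatement(
                                 "INSERT INTO tgt_table (id, payload) VALUES (?, ?)")) {
                        while (rs.next()) {
                            ins.setLong(1, rs.getLong(1));
                            ins.setString(2, rs.getString(2));
                            ins.addBatch();
                            rows++;
                            if (rows % BATCH_SIZE == 0) ins.executeBatch(); // flush one batch to PostgreSQL
                            if (rows % COMMIT_EVERY == 0) tgt.commit();     // make flushed batches durable
                        }
                        ins.executeBatch(); // flush the final partial batch
                        tgt.commit();
                    }
                }
                System.out.println("Inserted " + rows + " rows");
            }
        }
    }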

When we executed the job, it loaded about 10 million records in the first hour and then disconnected with exit code 1.

There is no error logged in the console.

 

Please suggest what the issue might be. Load time is not a concern, but the data must be loaded correctly in the end.

 

Also, please share your input on a bulk insert strategy, as I will be trying to avoid it: Talend and the target PostgreSQL DB are on different servers. I know I could use tFTPPut to move the bulk output file to the DB server, but I want to avoid the extra complexity and the uncertainty of possible file-write issues. There is no time-related constraint, and without bulk insert I hope it will finish within 8 hours at most :-) ... In short, we are loading data for all DB2 tables into PostgreSQL with table-to-table mappings.
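For reference, if bulk loading is revisited later: PostgreSQL's COPY can be streamed over the JDBC connection itself via the pgJDBC CopyManager API, so no bulk file has to be written on, or shipped to, the DB server (no tFTPPut step). A minimal sketch, with placeholder connection details, file path, and table name:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import org.postgresql.PGConnection;
    import org.postgresql.copy.CopyManager;

    public class CopyStreamSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL/credentials and local file path.
            try (Connection conn = DriverManager.getConnection("jdbc:postgresql://pghost:5432/TGTDB", "user", "pass");
                 InputStream data = new FileInputStream("/local/path/tgt_table.csv")) {
                CopyManager copy = conn.unwrap(PGConnection.class).getCopyAPI();
                // The file is read locally and streamed through the existing JDBC connection;
                // nothing is written on the database host, so no FTP step is needed.
                long rows = copy.copyIn("COPY tgt_table (id, payload) FROM STDIN WITH (FORMAT csv)", data);
                System.out.println("Copied " + rows + " rows");
            }
        }
    }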

I am also looking for any other suggestions to improve performance besides bulk exec.

 

3 Replies
Anonymous
Not applicable
Author

Hello,

A probable cause is insufficient available memory. The parameters controlling memory allocation are configured in the Studio configuration file.
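For reference, the JVM memory arguments live after the -vmargs line of the Studio launcher .ini file, found next to the Studio executable (for example Talend-Studio-win-x86_64.ini; the exact file name depends on your platform and version). Illustrative values only:

    -vmargs
    -Xms512m
    -Xmx4096m

Note that a job executed outside the Studio does not read this file: its JVM arguments come from the job's Run view (Advanced settings > Use specific JVM arguments) or from the exported launch script.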

Let us know if this helps.

Best regards

Sabrina

Anonymous
Not applicable
Author

Thanks for your reply, Sabrina.

 

Could you please tell me where this configuration file is stored? I see there is a configuration folder, but I am not sure whether we are allowed to change the parameters. If changes are allowed, please suggest which parameters we can adjust to prevent this issue.

 

I have also sometimes noticed weird behaviour. We transferred all 90 million rows from DB2 to PostgreSQL with Talend; the monitored row counts in Talend show successful completion, the pipeline is green, and the exit code is 0. But when we then query the table to validate, the query runs for a long time without returning any result, even for a simple count(*) or a filter on the table data through a WHERE clause. This is what worries us: Talend reports successful completion, yet we are unable to fetch the data. Please suggest.
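One way to narrow this down, as a sketch: if a plain count(*) hangs right after a very large load, the table may simply not have been analyzed yet, or another session may still hold locks on it. The standard PostgreSQL catalog queries below read the planner's row estimate and list active sessions without scanning the table (placeholder connection details and table name):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class LoadValidationSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL/credentials and table name.
            try (Connection conn = DriverManager.getConnection("jdbc:postgresql://pghost:5432/TGTDB", "user", "pass");
                 Statement st = conn.createStatement()) {
                // Planner's row estimate from the catalog: instant, no table scan
                // (accurate only after autovacuum/ANALYZE has run).
                try (ResultSet rs = st.executeQuery(
                        "SELECT reltuples::bigint AS approx_rows FROM pg_class WHERE relname = 'tgt_table'")) {
                    if (rs.next()) System.out.println("Approximate rows: " + rs.getLong("approx_rows"));
                }
                // Any sessions still running or waiting, e.g. a writer that never committed.
                try (ResultSet rs = st.executeQuery(
                        "SELECT pid, state, wait_event_type, query FROM pg_stat_activity WHERE state <> 'idle'")) {
                    while (rs.next()) {
                        System.out.println(rs.getInt("pid") + " | " + rs.getString("state")
                                + " | " + rs.getString("query"));
                    }
                }
            }
        }
    }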

Anonymous
Not applicable
Author

Also, just to add: I have set both the batch size and the commit interval to 100,000 (1 lakh).