et123
Contributor III

Talend Job Hangs without any errors

My Talend job hangs without producing any errors. I have a MySQL input (with 10 million rows) and a MySQL output. After about 5,000 rows, the job just hangs (the row count on the output stops increasing). Does anyone have an idea of what is wrong with it?

I have already changed Xms and Xmx and enabled streaming, but nothing changed. Here are some screenshots:

[Three screenshots attached]

4 Replies
Anonymous
Not applicable

Can you describe what your job is doing here? This is nothing to do with memory. I suspect it is caused by table locking in the DB. Are any of the input queries using the table being inserted into or updated? What is taking place at the output (update or insert)? Have you checked MySQL for table locks?
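For example, a quick check like the sketch below, run while the job is executing, will list every session and any open InnoDB transactions. The connection details (host, database, user, password) are placeholders for your own server, and it assumes MySQL Connector/J is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MysqlLockCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- point this at the same MySQL server the job uses.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement stmt = conn.createStatement()) {

            // What is every session doing right now?
            try (ResultSet rs = stmt.executeQuery("SHOW FULL PROCESSLIST")) {
                while (rs.next()) {
                    System.out.printf("id=%s state=%s query=%s%n",
                            rs.getString("Id"), rs.getString("State"), rs.getString("Info"));
                }
            }

            // Which InnoDB transactions are currently open, and what are they running?
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT trx_mysql_thread_id, trx_state, trx_started, trx_query"
                    + " FROM information_schema.INNODB_TRX")) {
                while (rs.next()) {
                    System.out.printf("thread=%s state=%s started=%s query=%s%n",
                            rs.getString(1), rs.getString(2), rs.getString(3), rs.getString(4));
                }
            }
        }
    }
}

A session sitting for a long time in a state like "Waiting for table metadata lock" or a transaction that stays open while the row count stops moving would point to locking rather than memory.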

et123
Contributor III
Author

Hi, thanks for replying! Do you mean that the tables are getting locked during the execution?

Well, I tried deleting the lookups and just loading data from "i_invoice" (MySQL input) to "dim_flux" (MySQL output), and it is a plain insert to make the process faster. But as you can see, it hangs without errors again. I really don't understand the problem!

 

[Screenshot attached]

Anonymous
Not applicable

Last time you said it was only getting about 5000 rows through. This time it is a lot more. Can you tell me whether you are inserting or updating? Have you checked your DB for locks? Google how to do that in MySQL if you are not sure. You will need to run that query while the job is running.
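As a sketch of the kind of query I mean (again with placeholder connection details, and assuming a MySQL version that ships the sys schema, i.e. 5.7+ or 8.0), this will show which session is blocking which while the job is running:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MysqlLockWaits {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- run this while the Talend job is executing.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement stmt = conn.createStatement()) {

            // sys.innodb_lock_waits exists on MySQL 5.7+/8.0; on older versions join
            // information_schema.INNODB_LOCK_WAITS with INNODB_TRX instead.
            String sql = "SELECT waiting_pid, waiting_query, blocking_pid, blocking_query"
                       + " FROM sys.innodb_lock_waits";
            try (ResultSet rs = stmt.executeQuery(sql)) {
                boolean any = false;
                while (rs.next()) {
                    any = true;
                    System.out.printf("session %s (%s) is blocked by session %s (%s)%n",
                            rs.getString("waiting_pid"), rs.getString("waiting_query"),
                            rs.getString("blocking_pid"), rs.getString("blocking_query"));
                }
                if (!any) {
                    System.out.println("No lock waits right now.");
                }
            }
        }
    }
}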

et123
Contributor III
Author

I'm inserting, and yes, when I deleted the other lookups it got further, but then it gets stuck again. There are 11 million rows that should be inserted. OK, I will check the DB for locks. Thank you.