Anonymous
Not applicable

java.lang.OutOfMemoryError: Java heap space SQOOP

I built a job that ingests a list of tables from Oracle into Hadoop, using a standard job. I do the ingestion through tSqoopImport. As far as I know, with Sqoop the job server acts only as the driver: data is ingested directly from source to target, not through the job server. There are no other components in my job (tMap, tSort, etc.), just tSqoopImport. I tried increasing the memory up to 50 GB, but that didn't solve the problem.
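For reference, here is roughly how I allocated that memory (a sketch; in my Studio this goes under the Run view > Advanced settings > "Use specific JVM arguments" option, and the -Xms value below is just an example):

    -Xms4g
    -Xmx50g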

 

Does anyone have a solution for this case?

Accepted Solution
Anonymous
Not applicable
Author

Hi all. As a simple mitigation step I ran the job and checked memory usage: heap usage stayed well under the allocated memory, so the driver JVM was not the bottleneck. The problem was that the map and reduce memory allocation in YARN was too small for the job. I solved it by manually setting the map and reduce memory allocation for the job, in the Sqoop component -> Advanced settings -> set the memory for map and reduce as the job needs.
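For anyone hitting the same wall, this is roughly what the settings look like (a sketch with example sizes; the exact field in tSqoopImport's Advanced settings depends on your Studio version, and the right values depend on your cluster). These are the standard MRv2/YARN properties; a common rule of thumb is to keep the java.opts heap at about 80% of the container size:

    # container size and heap for each map task (example values)
    mapreduce.map.memory.mb=4096
    mapreduce.map.java.opts=-Xmx3276m
    # same for reduce tasks
    mapreduce.reduce.memory.mb=4096
    mapreduce.reduce.java.opts=-Xmx3276m

On a plain Sqoop command line the equivalent would be passed as generic options, e.g. sqoop import -D mapreduce.map.memory.mb=4096 ... before the tool-specific arguments.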

Replies
Anonymous
Not applicable
Author

Hello,

What's your source DB? Could you please show us the full error stack trace? Did you allocate more memory for the current active job or for the whole Studio?

Have you already checked this article: https://community.talend.com/t5/Migration-Configuration-and/OutOfMemory-Exception/ta-p/21669?
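For reference, those are two different settings (a sketch; file and option names can vary a little by Studio version and platform):

    # Whole Studio: edit the Talend-Studio-*.ini file next to the Studio executable
    -Xms512m
    -Xmx4096m

    # Current job only: Run view > Advanced settings > "Use specific JVM arguments"
    -Xms1024m
    -Xmx8192m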

 

Best regards

Sabrina
