Anonymous
Not applicable

Insufficient memory for the Java Runtime Environment to continue in Talend Open Studio for DI 6.3

Hi everyone,

 

My requirement is to load Excel data into a SQL Server table. The Excel file contains 38,000 records, and when I run the job I get the following error:

There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1058864 bytes for Chunk::new
[thread 4800 also had an error]
# An error report file with more information is saved as:
# E:\Mathi\Dowloads\TOS_DI-20161216_1026-V6.3.1\hs_err_pid700.log
#
# Compiler replay data is saved as:
# E:\Mathi\Dowloads\TOS_DI-20161216_1026-V6.3.1\replay_pid700.log

 

[Screenshot: 0683p000009Lv4e.png]

 

 

Can someone please help me resolve this issue?

 

Thanks in advance

7 Replies
Anonymous
Not applicable
Author

When I divide the Excel data into three files of 10K records each, the job runs successfully.

I have also tried adding -Xmx4096M in the Run view's Advanced settings.
Please give me a solution that resolves this completely!
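[Editor's note] The splitting workaround described above can be scripted rather than done by hand. Below is a minimal sketch, assuming the workbook is first exported to CSV; the chunk size, file names, and helper names are illustrative, not anything Talend provides:

```python
import csv
import os

def _write_chunk(out_dir, idx, header, rows):
    """Write one chunk file, repeating the header row."""
    path = os.path.join(out_dir, f"part_{idx:03d}.csv")
    with open(path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
    return path

def split_csv(src_path, out_dir, chunk_rows=10000):
    """Split a CSV (header + data rows) into files of at most
    chunk_rows data rows each, so each piece stays small enough
    for the metadata wizard / job to handle."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header for every chunk
        chunk, idx = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == chunk_rows:
                paths.append(_write_chunk(out_dir, idx, header, chunk))
                chunk, idx = [], idx + 1
        if chunk:                      # flush the final partial chunk
            paths.append(_write_chunk(out_dir, idx, header, chunk))
    return paths
```

The resulting part_000.csv, part_001.csv, ... files can then each be fed to the job (or the first small part used just for metadata sampling).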
Anonymous
Not applicable
Author

You're not running a job here; you're defining metadata. Do this with your cut-down file. When you use this metadata in a job with your full file, you should be fine. Talend prefers smaller files when sampling for metadata creation.

Anonymous
Not applicable
Author

Hi,

@tal00000 Yes, I did cut down the source file. FYI, I have previously retrieved an Excel file with 400,000 (4 lakh) records, but recently I couldn't retrieve one with even 30,000 records.

While running my job I'm getting errors like the one below:

[Screenshot: 0683p000009Lv6a.png]

Is this because Talend occupies more memory?

My system is Windows 10 64-bit.

 

Sometimes the job works fine when I hardcode -Xmx4096M.

 

Can anyone give me a permanent solution?

Anonymous
Not applicable
Author

You are not being clear with your issue.

 

Your screenshot is of metadata definition. That is not running a Job.

 

If you need more memory for your data, -Xmx is the correct way to provide it; or, of course, review your design to use less memory. You can increase this setting either for the design tool (Studio) or for your running Job.
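[Editor's note] For reference, in Talend Open Studio 6.x these are two separate settings; exact file names and menu paths may vary slightly by version. The Studio's own heap lives in the Eclipse-style .ini file next to the executable (e.g. TOS_DI-win-x86_64.ini), in the lines after -vmargs; the values below are illustrative:

```ini
; TOS_DI-win-x86_64.ini -- everything after -vmargs is passed to the JVM
-vmargs
-Xms512m
-Xmx4096m
```

A running Job's heap is set separately, per job, under Run view > Advanced settings > "Use specific JVM arguments" (e.g. -Xms256M -Xmx4096M), which is where the -Xmx4096M mentioned earlier in this thread goes.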

 

Why do you need to sample large data volumes for metadata definition?

Anonymous
Not applicable
Author

Hi,

The very first time, when I retrieved an Excel file with 30K records, I got the "out of memory" error shown in the first screenshot. Later I divided the Excel file into three files of 10K records each; then I was able to retrieve it, but while running the job it threw this error:

Error Message -
"Exception in thread "Thread-0" java.lang.OutOfMemoryError: Java heap space..."

Then I increased the JVM heap size to -Xmx4096M, and at that point it worked.

 

After some time, if I run the same job with a newly arrived Excel file, I get the error shown in screenshot 2.

 

Screenshot 1: [attachment 0683p000009Lv7T.png]

 

Screenshot 2: [attachment 0683p000009LuyY.png]

If I then run the same job after a system restart, it works fine.

 

My confusion is: is this issue caused by my system's RAM specification, or by the Talend tool itself?

 

Anonymous
Not applicable
Author

In that case, I think you need to look at your Job design and the size of your input data: columns, column lengths, etc.

 

What are you doing with this data?

 

30K rows in itself does not sound like very much.

Anonymous
Not applicable
Author

I'm supposed to load those records into a staging table that another team needs.

So splitting the Excel file is the only solution!

Thank you! @tal00000