Hi everyone,
My requirement is to load Excel data into a SQL Server table. The Excel file has 38,000 records, and when I run the job I get the following error:
There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1058864 bytes for Chunk::new
[thread 4800 also had an error]
# An error report file with more information is saved as:
# E:\Mathi\Dowloads\TOS_DI-20161216_1026-V6.3.1\hs_err_pid700.log
#
# Compiler replay data is saved as:
# E:\Mathi\Dowloads\TOS_DI-20161216_1026-V6.3.1\replay_pid700.log
Can someone please help me resolve this issue?
Thanks in advance
You're not running a job, you're defining metadata. Do this with a cut-down file. When you use this metadata in a job with your full file, you should be OK. Talend prefers smaller files when sampling for metadata creation.
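If you also need more headroom in the Studio itself: Talend Studio is Eclipse-based, so the Studio's own heap is set in the launcher .ini file next to the executable (on 64-bit Windows this is typically TOS_DI-win-x86_64.ini, though the exact name varies by version). The lines after -vmargs are passed straight to the JVM, so a typical adjustment looks like this:

```
-vmargs
-Xms512m
-Xmx2048m
```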
Hi,
@tal00000 Yes, I did cut down the source file. FYI, I have previously retrieved an Excel file with 400,000 (4 lakh) records, but recently I couldn't retrieve one with even 30,000 records.
While running my job I'm getting errors like the ones below.
Is this because Talend occupies too much memory?
My system is Windows 10, 64-bit.
Sometimes the job works fine when I hardcode -Xmx4096M.
Can anyone give me a permanent solution?
You are not being clear with your issue.
Your screen shot is of metadata definition. That is not running a Job.
If you need more memory for your data, -Xmx is the correct way; or, of course, review your design to use less memory. You can increase it either for the design tool or for your running Job.
Why do you need to sample large data volumes for metadata definition?
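For the record, for a running Job the setting lives in the Run view: under Advanced settings there is a JVM arguments option (labelled "Use specific JVM arguments" in recent Studio versions; the exact wording may differ by release), where you can add, for example:

```
-Xms256M
-Xmx4096M
```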
Hi,
The very first time, when I retrieved an Excel file with 30K records, I got the "out of memory" error shown in the first screenshot. Later I divided the Excel file into three files of 10K records each; then I was able to retrieve it, but while running the job it threw the error below.
Error Message -
"Exception in thread "Thread-0" java.lang.OutOfMemoryError: Java heap space..."
Then I increased the JVM size to -Xmx4096M, and at that point it worked.
After some time, if I run the same job with a newly arrived Excel file, I get the error shown in screenshot 2.
Screenshot 1:
Screenshot 2:
Then, if I run the same job after a system restart, it works fine.
My confusion is: is this issue caused by my RAM specification or by the Talend tool?
In that case, I think you need to look at your Job design and the size of your input data - columns, column lengths etc.
What are you doing with this data?
30K rows in itself does not sound like very much.
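To illustrate what "using less memory" means at the design level: a load like this can stream rows and flush them to SQL Server in batches instead of buffering the whole file in the heap. Below is a minimal plain-JDBC sketch of that pattern, not Talend-generated code; the connection string, credentials, file, table, and column names are all hypothetical placeholders:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class StagingLoader {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection string and credentials.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=staging";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             BufferedReader in = new BufferedReader(new FileReader("records.csv"))) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO stg_records (col1, col2) VALUES (?, ?)")) { // hypothetical table
                String line;
                int batch = 0;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.split(",", -1);
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.addBatch();
                    if (++batch % 1000 == 0) { // flush every 1,000 rows so heap use stays flat
                        ps.executeBatch();
                        conn.commit();
                    }
                }
                ps.executeBatch();             // flush the final partial batch
                conn.commit();
            }
        }
    }
}
```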
I'm supposed to load those records into a staging table that is needed by another team.
So splitting the Excel file is the only solution! I'll do something like the sketch below.
Thank you! @tal00000
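If splitting is the route, it needn't be done by hand. Here is a throwaway sketch that splits a large export into 10K-row chunk files, repeating the header row in each chunk (the file names are hypothetical, and it assumes the Excel sheet is first saved as CSV):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.PrintWriter;

public class ExcelSplitter {
    public static void main(String[] args) throws Exception {
        int chunkSize = 10_000;                         // rows per output file
        try (BufferedReader in = new BufferedReader(new FileReader("records.csv"))) {
            String header = in.readLine();              // repeat the header in every chunk
            PrintWriter out = null;
            String line;
            int row = 0;
            while ((line = in.readLine()) != null) {
                if (row % chunkSize == 0) {             // start a new chunk file
                    if (out != null) out.close();
                    out = new PrintWriter("records_part" + (row / chunkSize + 1) + ".csv");
                    out.println(header);
                }
                out.println(line);
                row++;
            }
            if (out != null) out.close();
        }
    }
}
```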