
Memory error while dealing with a lot of Excel files
Hi,
Apologies in advance for my English. I'll try my best.
I need help: I have tried a lot of ideas but my job still doesn't work.
Context:
I have about 600 Excel files.
Each file contains data (~40,000 rows) for one supermarket in one city, so I have 600 supermarkets and 15 cities.
My problem is that I have to group those files into 16 files: one file per city, with one sheet per supermarket.
I'm facing memory errors due to the amount of data I have to read and put into the same workbook.
I tried with the tFileExcel components, but after 6 files... I get the memory error.
I can't use the memory saving mode because in tFileExcelSheetOutput I need to use the "Create sheet as copy" option.
I hope I was clear and understandable.
Thanks
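As a rough illustration of the task only (not the actual Talend job, and outside the tFileExcel components entirely), here is a minimal sketch using Apache POI's streaming writer. Everything in it is an assumption: the input/output folders, and especially the hypothetical "Paris_Shop042.xlsx" file-name pattern used to derive the city and sheet names. The point is that a streaming writer keeps only a small window of rows per sheet on the heap while the merged workbook is written.

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;
import org.apache.poi.ss.usermodel.*;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

// Hypothetical sketch: merge per-supermarket .xlsx files into one workbook per city,
// one sheet per supermarket, using POI's streaming writer to limit heap usage.
public class MergeByCity {

    public static void main(String[] args) throws Exception {
        File inputDir = new File("input");   // assumption: the 600 source files live here
        File outputDir = new File("output"); // assumption: one merged file per city goes here
        outputDir.mkdirs();

        // Group source files by city; city/shop names are assumed to be encoded in the
        // file name, e.g. "Paris_Shop042.xlsx" -> city "Paris", sheet "Shop042".
        Map<String, List<File>> filesByCity = new TreeMap<>();
        for (File f : Objects.requireNonNull(inputDir.listFiles((d, n) -> n.endsWith(".xlsx")))) {
            String city = f.getName().split("_")[0];
            filesByCity.computeIfAbsent(city, k -> new ArrayList<>()).add(f);
        }

        DataFormatter fmt = new DataFormatter();
        for (Map.Entry<String, List<File>> e : filesByCity.entrySet()) {
            // SXSSFWorkbook keeps only ~100 rows per sheet in memory; older rows are
            // flushed to temporary files on disk instead of staying on the heap.
            try (SXSSFWorkbook out = new SXSSFWorkbook(100)) {
                for (File src : e.getValue()) {
                    String sheetName = src.getName().replace(".xlsx", "").split("_")[1];
                    try (Workbook in = WorkbookFactory.create(src)) {
                        Sheet inSheet = in.getSheetAt(0);
                        Sheet outSheet = out.createSheet(sheetName);
                        int r = 0;
                        for (Row row : inSheet) {
                            Row outRow = outSheet.createRow(r++);
                            for (Cell cell : row) {
                                // Copy everything as formatted text to keep the sketch short.
                                outRow.createCell(cell.getColumnIndex())
                                      .setCellValue(fmt.formatCellValue(cell));
                            }
                        }
                    }
                }
                try (OutputStream os = Files.newOutputStream(
                        new File(outputDir, e.getKey() + ".xlsx").toPath())) {
                    out.write(os);
                }
                out.dispose(); // delete the temporary files created by the streaming writer
            }
        }
    }
}
```

The reading side still loads one ~40,000-row source workbook at a time, so peak memory stays around one input file plus the streaming window, rather than a whole city's worth of sheets.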

Hello
Take a look at this article and try to optimize the job design and allocate more memory to the job execution.
Let me know if that fixes the error.
Regards
Shong

Could you share the error message?

Hi,
Here is my job design. The names of the cities and supermarkets are part of my file names,
so I created an Excel file listing all of the cities.
The first subjob processes the 600 files city by city.
The second subjob merges my Excel files, with one sheet per supermarket.
I keep getting "Native memory allocation (malloc) failed to allocate 4088 bytes for AllocateHeap", "java.lang.Exception: createCopy from source failed: GC overhead limit exceeded" or "java.lang.OutOfMemoryError: Java heap space".

Please try adding the JVM parameters below to your job.
Go to Run -> Advanced settings -> Add JVM parameter:
1) -Xms4096M
2) -Xmx4096M
3) -XX:-UseGCOverheadLimit
Let me know if this helps or not.
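If you want to double-check that the job actually picked up the larger heap, a small snippet (for example in a tJava component; the placement is just a suggestion, not part of the answer above) can print the limit the JVM is running with:

```java
// With -Xmx4096M this should report roughly 4096 MB (often slightly less,
// because part of the heap is reserved by the garbage collector).
long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
System.out.println("Max heap available to the job: " + maxHeapMb + " MB");
```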

I tried with 3 files and it worked.
Let me try with 10, then 50.
Thanks

Yes, sure, let me know.

Thank you so much. It worked.

Great, you can select it as the solution.
