CBIJOU1610318143
Contributor

Memory error while dealing with a lot of Excel files

Hi,

Apologies in advance for my English; I'll do my best.

I need help. I've tried a lot of ideas, but my job still doesn't work.

Context:

I have about 600 Excel files.

Each file contains the data (~40,000 rows) for one supermarket in one city, so there are 600 supermarkets across 15 cities.

My problem is that I have to group those files into 16 files: one file per city, with one sheet per supermarket.

I'm running into memory errors because of the amount of data I have to read and write into the same workbook.

I tried with the tFileExcel components, but after 6 files I get the memory error.

I can't use the memory-saving mode because in tFileExcelSheetOut I need the "Create sheet as copy" option.

I hope that was clear and understandable.

Thanks

1 Solution

Accepted Solutions
Prakhar1
Creator III

Please try adding the JVM parameters below to your job.

Go to Run -> Advanced settings -> Add JVM Parameter:

1) -Xms4096M

2) -Xmx4096M

3) -XX:-UseGCOverheadLimit

Let me know whether this helps.
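As an aside for readers who run the job outside the Studio: the same settings can be passed directly on the launcher's java line of a built/exported job. A sketch, where the jar name is a placeholder and not from this thread:

```shell
# Same heap settings as in the Studio's Advanced settings tab.
# "my_job.jar" is a placeholder; use your own exported job's launcher/jar.
java -Xms4096M -Xmx4096M -XX:-UseGCOverheadLimit -jar my_job.jar
```

-Xms/-Xmx fix the initial and maximum heap at 4 GB, and -XX:-UseGCOverheadLimit disables the "GC overhead limit exceeded" safeguard so long GC phases don't abort the job.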


8 Replies
Anonymous
Not applicable

Hello

Take a look at this article, try to optimize the job design, and allocate more memory to the job execution.

Let me know if that fixes the error.

 

Regards

Shong

Prakhar1
Creator III

Could you share the error message?

CBIJOU1610318143
Contributor
Author

Hi,

Here is my job design:

[screenshot: 0693p00000BBNM8AAP.png, first subjob]

The names of the cities and supermarkets are part of my file names, so I created an Excel file listing all of the cities. This first subjob processes the 600 files city by city.

[screenshot: 0693p00000BBNR3AAP.png, second subjob]

This is the second subjob, which merges my Excel files with one sheet per supermarket.

I keep getting "Native memory allocation (malloc) failed to allocate 4088 bytes for AllocateHeap", "java.lang.Exception: createCopy from source failed: GC overhead limit exceeded", or "java.lang.OutOfMemoryError: Java heap space".
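For readers with a similar task: since the city and supermarket names are encoded in the file names, the grouping step itself can be sketched with the Python standard library alone. The `City_Supermarket.xlsx` naming pattern below is an assumption for illustration, not the poster's actual convention:

```python
from collections import defaultdict
from pathlib import Path

def group_by_city(filenames):
    """Group supermarket workbook names by city.

    Assumes each file is named "City_Supermarket.xlsx"; the real naming
    convention is not shown in the thread, so adjust the split as needed.
    """
    groups = defaultdict(list)
    for name in filenames:
        city, supermarket = Path(name).stem.split("_", 1)
        groups[city].append(supermarket)
    return dict(groups)

files = ["Paris_Store1.xlsx", "Paris_Store2.xlsx", "Lyon_Store1.xlsx"]
print(group_by_city(files))  # one entry per city, one sheet name per supermarket
```

Each city's list then corresponds to the sheets of one output workbook, which is essentially what the two subjobs above do inside Talend.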

 

 

 

Prakhar1
Creator III

Please try adding the JVM parameters below to your job.

Go to Run -> Advanced settings -> Add JVM Parameter:

1) -Xms4096M

2) -Xmx4096M

3) -XX:-UseGCOverheadLimit

Let me know whether this helps.

CBIJOU1610318143
Contributor
Author

I tried with 3 files, and it worked.

Let me try with 10, then 50.

 

Thanks

Prakhar1
Creator III

Yes, sure. Let me know.

CBIJOU1610318143
Contributor
Author

Thank you so much. It worked.

 

 

Prakhar1
Creator III

Great! You can mark it as the solution.