Anonymous
Not applicable

Out of memory issue when trying to read large CSV files

I have 4 CSV files containing 500,000 records. When I try to read them as metadata, I get an out of memory error. I have an i7 processor with 16 GB of RAM.

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

I solved the issue by applying the option below.

 

[screenshot: 0683p000009M5j4.png]


7 Replies
Anonymous
Not applicable
Author

Hi @sandeephakki ,

 

Can you try increasing the memory allocated to the Studio by modifying the Xms and Xmx values as described in the article below, and let us know if it helps?

 

https://community.talend.com/t5/Migration-Configuration-and/Allocating-more-memory-to-Talend-Studio/... 
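For reference, the Studio heap settings live in the .ini file next to the Studio executable (on Windows it is typically named something like Talend-Studio-win-x86_64.ini; the exact file name and default values depend on your version). Everything after the -vmargs line is passed to the JVM, so the change is roughly:

    -vmargs
    -Xms1024m
    -Xmx4096m

Restart the Studio after saving the file so the new heap size takes effect.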

Anonymous
Not applicable
Author

Hi,

I tried all these options but it's not working.

[screenshot: 0683p000009M605.png]

Anonymous
Not applicable
Author

Hi,

 

It seems you have memory issues. Why don't you add a tMap and map all columns as pass-through fields? Make sure that you enable the option to store temporary data on disk; this will help, since the data is written temporarily to disk instead of being kept entirely in memory.

 

https://help.talend.com/reader/EJfmjmfWqXUp5sadUwoGBA/J4xg5kxhK1afr7i7rFA65w

 

Warm Regards,
Nikhil Thampi

Please show appreciation for our Talend community members by giving kudos when they share their time on your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

Hi,

 

Are you trying to create repository metadata after reading the large file?

 

In that case, you can copy a few sample rows from that file into a new file and create the metadata from it.
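If you need to produce such a sample file quickly, a small standalone Java snippet like the one below does it. This is just a convenience sketch, not something Talend requires, and the file names are placeholders:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    public class SampleCsv {
        public static void main(String[] args) throws IOException {
            // Placeholder paths: point these at your real files.
            Path source = Paths.get("big_input.csv");
            Path sample = Paths.get("sample_input.csv");

            // Copy the header line plus the first 500 data rows into the sample file.
            try (Stream<String> lines = Files.lines(source)) {
                List<String> head = lines.limit(501).collect(Collectors.toList());
                Files.write(sample, head);
            }
        }
    }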

 

Thanks and regards,

Subhadip

Anonymous
Not applicable
Author

Hi Subhadip,

I tried creating the metadata with 500 rows from each sheet, and it works fine. I was able to create a build from it, but when I replace the 500-row data with the large files and run the .bat file, it fails.

Anonymous
Not applicable
Author

Hi,

 

In the .bat file, you can update the Xms and Xmx values to -Xms1024M and -Xmx4096M and run the job once.
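For context, the exported .bat usually boils down to a single java launch line; the classpath and class name below are placeholders for whatever your build generated:

    java -Xms1024M -Xmx4096M -cp <generated classpath> <project>.<jobname>_<version>.<JobClass> %*

Only the -Xms and -Xmx values need to change; leave the rest of the line exactly as it was generated.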

 

Thanks and Regards,

Subhadip
