Hi
I have an input text file of roughly 200 MB whose entire content is on a single line.
That file contains a header, and the header can appear at any point in the file: at the beginning, somewhere in the middle, or at the very end.
Because memory issues can occur, I can't load the whole file into Talend at once.
Is there a way to read chunks of data from the file, load each chunk into Talend, and match it against my lookup file?
Chunks would keep flowing into Talend until the header is found;
once it is found, no further chunks should be read.
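To make the chunked idea concrete, here is a minimal sketch of what I have in mind; it could sit in a tJavaFlex or a custom routine. The file path, header token, and chunk size are only placeholders for this question, not values from my actual job.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ChunkedHeaderScan {
    public static void main(String[] args) throws IOException {
        final String HEADER = "MY_HEADER";      // placeholder header token
        final int CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB per chunk (placeholder)

        char[] buffer = new char[CHUNK_SIZE];
        // Tail of the previous chunk, kept so a header split across
        // two chunk boundaries is still detected.
        String carry = "";

        try (BufferedReader reader = new BufferedReader(new FileReader("input.txt"))) {
            int read;
            while ((read = reader.read(buffer, 0, CHUNK_SIZE)) != -1) {
                String chunk = carry + new String(buffer, 0, read);

                // ... each chunk would be matched against the lookup here ...

                if (chunk.contains(HEADER)) {
                    System.out.println("Header found; stopping the read.");
                    break; // no further chunks are read
                }
                // Keep the last HEADER.length()-1 characters for the next round.
                carry = chunk.substring(Math.max(0, chunk.length() - HEADER.length() + 1));
            }
        }
    }
}

The small carry-over is only there so a header that straddles two chunks is not missed.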
Alternatively, is it fine to use the tFileInputRaw component to read the whole file (200 to 300 MB) in one go with the read-as-a-stream option enabled, then convert that Object to a String and use that String with my lookups?
Can memory issues occur in Talend with a file of this size, where a single line contains all the data?
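For clarity, by converting the Object to a String I mean roughly the sketch below (the helper name and buffer size are only illustrative); my worry is exactly that the full content then sits in heap at once.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamToString {
    // Reads the whole stream into one String. For a 200-300 MB file the
    // byte buffer and the resulting String coexist in heap while the
    // String is built, so peak memory use is well above the file size.
    public static String readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return new String(out.toByteArray(), StandardCharsets.UTF_8);
    }
}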