Sriram151515
Contributor

Split Output JSON files based on size

Hello,

I am quite new to Talend. I am trying to load a large volume of data from a Snowflake DB into Talend and convert it to JSON in a specified format, using Talend Data Mapper, before loading it into downstream systems. The catch is that each output file must not exceed 15 MB, so the output needs to be split by size. Could you please guide me on how this could be implemented?

Thanks

3 Replies
Anonymous
Not applicable

Hello,

Could you tell us how large this file is likely to be?

Is it possible to use tFileInputRaw to load it entirely into memory? The workflow would be: tFileInputRaw --> tExtractJSONFields.

Alternatively, you could try a Camel-based route using cFile and a streaming Splitter.
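The size-splitting idea behind both suggestions can be sketched in plain Java (the language Talend jobs compile to), for example inside a tJavaFlex or tJavaRow. This is a minimal sketch, assuming the output is already a stream of serialized JSON records that may be grouped by byte size; `JsonSplitter` and `splitBySize` are hypothetical names for illustration, not Talend components:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class JsonSplitter {

    /**
     * Group serialized JSON records into chunks so that each chunk,
     * when written as newline-delimited records, stays at or under
     * maxBytes. A single record larger than maxBytes still gets its
     * own chunk (it cannot be split further at this level).
     */
    public static List<List<String>> splitBySize(List<String> records, long maxBytes) {
        List<List<String>> chunks = new ArrayList<>();
        List<String> current = new ArrayList<>();
        long currentSize = 0;

        for (String rec : records) {
            // +1 accounts for the trailing newline written per record.
            long recSize = rec.getBytes(StandardCharsets.UTF_8).length + 1;
            if (!current.isEmpty() && currentSize + recSize > maxBytes) {
                chunks.add(current);        // close the current file's worth of records
                current = new ArrayList<>();
                currentSize = 0;
            }
            current.add(rec);
            currentSize += recSize;
        }
        if (!current.isEmpty()) {
            chunks.add(current);
        }
        return chunks;
    }
}
```

In a real job each chunk would be written to its own output file (e.g. `out_0001.json`, `out_0002.json`, ...) with a limit of 15 * 1024 * 1024 bytes; measuring the encoded byte length, rather than counting records, is what keeps every file under the hard cap.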

Best regards

Sabrina

Sriram151515
Contributor
Author

Hi Sabrina,

The overall file size is 10 to 15 GB, but the maximum the downstream system will accept is only 15 MB. Could you please help with an example or some screenshots of the components?

MdFazil
Partner - Contributor III

Hello Sriram,

Did you find a solution? I am currently facing the same problem. If there is a solution, please let me know.

Thanks in advance

Regards,

Fazil M