ankushd
Contributor

Parquet performance optimization

Hi all,

 

We have a use case where we need to convert 17 TB of data (CSV files) into Parquet, and we are using an EMR Spark cluster for the conversion. We have designed a Big Data job with the tFileOutputParquet component to create the files, but the job is currently taking a very long time. Has anyone achieved this kind of Parquet conversion with an alternative approach or a more optimized design? Kindly share any inputs if known.
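For reference, the plain PySpark equivalent of what we are trying to do would look roughly like the sketch below. The bucket paths, header option, and the ~128 MB target output file size are placeholders/assumptions, not our actual job settings; the Spark part is guarded by a flag since it only makes sense when submitted to the cluster.

```python
import math

# Assumption: sizing output files to roughly 128 MB each, a common Parquet target.
def target_partitions(total_input_bytes: int, target_file_bytes: int = 128 * 1024**2) -> int:
    """Rough repartition count so each output Parquet file is ~target_file_bytes."""
    return max(1, math.ceil(total_input_bytes / target_file_bytes))

RUN_ON_CLUSTER = False  # flip to True when submitting via spark-submit on EMR

if RUN_ON_CLUSTER:
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

    df = (spark.read
          .option("header", "true")           # assumes the CSVs have a header row
          .csv("s3://my-bucket/csv-input/"))  # hypothetical input path

    (df.repartition(target_partitions(17 * 1024**4))  # ~17 TB of input
       .write
       .mode("overwrite")
       .option("compression", "snappy")
       .parquet("s3://my-bucket/parquet-output/"))    # hypothetical output path
```

Explicitly repartitioning (instead of keeping the partitioning that falls out of the CSV read) is what controls the number and size of the Parquet files written, which is often where conversion jobs lose time.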

 

Thanks.

0 Replies