JackStrong
Contributor II

[AWS S3] How to put/upload thousands of files into a bucket

Hi All.

Can anyone help me and propose the best implementation (from a performance point of view) for uploading many thousands of files into an S3 bucket (say 100,000 JSON files; the files are small, max 1 KB each)?

I implemented a solution based on the tS3Put component, but it takes a very long time (many hours) to complete. My tests showed that the tS3Put component (and generally all tS3* components) is not a good choice for processing many files, because an iterate link has to be added and the files are uploaded one by one.

I need a solution where the whole upload process takes only a few minutes.

Do you know if there is a different, more suitable approach for my scenario?

Regards,

1 Reply
Anonymous
Not applicable

Hi

tS3Put uploads the files one by one; checking the 'Enable parallel execution' box on the iterate link might improve the performance a bit. Or you can try another S3 client tool, as mentioned on this page.

If you can execute a command to upload the files, then use tSystem to run the command in Talend.
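Another option, if the AWS SDK for Java is available to your job (for example loaded with tLibraryLoad and called from a tJava component), might be the SDK's TransferManager, which uploads a whole directory using an internal thread pool instead of one file per iteration. Below is a minimal sketch only, assuming SDK v1, credentials from the default provider chain, and placeholder values for the bucket name, key prefix, and local folder (replace them with your own):

```java
import java.io.File;

import com.amazonaws.services.s3.transfer.MultipleFileUpload;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

public class BulkS3Upload {
    public static void main(String[] args) throws InterruptedException {
        // TransferManager keeps its own thread pool, so many small files
        // are uploaded concurrently instead of one by one.
        TransferManager tm = TransferManagerBuilder.standard().build();
        try {
            // Upload every file under the local folder to s3://my-bucket/json/
            // ("my-bucket", "json/" and "/data/json" are placeholders).
            MultipleFileUpload upload = tm.uploadDirectory(
                    "my-bucket",            // target bucket (example name)
                    "json/",                // key prefix inside the bucket
                    new File("/data/json"), // local folder with the JSON files
                    false);                 // do not include subdirectories
            upload.waitForCompletion();     // block until all uploads finish
        } finally {
            tm.shutdownNow();               // release the thread pool
        }
    }
}
```

The same idea works from the command line: tSystem could run something like `aws s3 cp /data/json s3://my-bucket/json/ --recursive` (again with placeholder paths and bucket), which also transfers files in parallel.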

 

Regards

Shong