Anonymous

Put into Amazon S3 repeatedly

Hi There,

I hope someone can give me some pointers. I have a directory containing thousands of files, and a csv file that lists the names of the files I need to upload to S3. This csv file changes on a regular basis.
I tried to read the csv file using tFileInputDelimited, then use tFlowToIterate, where I define a "fileName" key whose value points to the column in the csv file, and iterate to tS3Put:
tFileInputDelimited --(Main)-- tFlowToIterate --(Iterate)-- tS3Put
In tS3Put's File field, I enter the file location path and ((String)globalMap.get("fileName")).
When I run it, I get null (repeatedly), as if the key is not getting the value.
Did I set up the job design and components correctly? What mistake did I make?
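To show what I mean, here is a minimal standalone Java sketch of how I understand the globalMap lookup to behave (the row name "row1" and the sample path are just illustrative assumptions, not taken from my actual job):

import java.util.HashMap;
import java.util.Map;

public class GlobalMapSketch {
    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();

        // By default, tFlowToIterate stores each incoming column under
        // a key of the form "<rowName>.<columnName>".
        globalMap.put("row1.fileName", "C:/tmp/file1.csv");

        // A lookup on a key that was never stored returns null,
        // which matches the symptom I am seeing:
        System.out.println(globalMap.get("fileName"));      // null

        // The lookup only succeeds with the exact key that was stored:
        System.out.println(globalMap.get("row1.fileName")); // C:/tmp/file1.csv
    }
}

So my guess is that if the key stored by tFlowToIterate does not exactly match the string passed to globalMap.get(), the lookup returns null, which is what I am seeing.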


Thanks in advance for your help!

2 Replies
Anonymous

Hi,
What does the file name in your .csv file look like? Is it a relative path? Could you please check the "Die on error" option on tS3Put to see if an error message is printed on the console?
Screenshots of your job settings would also help.
Best regards
Sabrina
Anonymous

Hi. I have the same question with regard to uploading many files to an S3 bucket from my local hard drive. My files are all csv formatted, with 3 columns (id, filepath, ownerid). The file names all match file[0-9]|[0-9][0-9].csv, and the path is C:\tmp. My screenshot isn't uploading, so to describe the job: it is 3 components, tFileList_1 > iterate connector > tS3Connection > tS3Put_20. I didn't know what to define the file key as in the Put component, so I tried __FILE__. The job runs in a second or two, nothing gets copied to my S3 bucket, and I don't get any errors.
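From reading other examples, I am wondering whether the fields on tS3Put should instead reference the global variables that tFileList_1 publishes on each iteration, something like the expressions below (which expression goes in which tS3Put field is my guess, not something I have confirmed):

// Full local path of the file currently being iterated by tFileList_1,
// for the field that names the local file to upload:
((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))

// Bare file name of the current file, for the S3 object key:
((String)globalMap.get("tFileList_1_CURRENT_FILE"))

Is that the right idea, or should the key be defined some other way?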

Thank you for your help.

Cathy