Anonymous
Not applicable

Moving data from an Amazon S3 bucket to a Postgres DB on RDS

My current pipeline consists of tS3Connection --> tS3Get --> tFileInputDelimited --> tMap. When I run the job, the data is successfully pulled from my S3 bucket and stored in a local file on my computer; however, when I use tFileInputDelimited to pull the data into tMap for transformation, every row comes through empty. Do I need to use a different component for this process? I've checked all of my settings and they appear to be in order.

[Screenshot of the job attached]
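
For reference, the flow this job reproduces looks roughly like the sketch below when done outside of Talend: download the object from S3, parse the delimited file, apply a transform, and insert into Postgres on RDS. This is only an illustration; the bucket, key, table name, column names, and connection details are placeholders.

# Sketch of the S3 -> delimited file -> Postgres flow outside Talend.
# Bucket, key, table name, column names, and connection settings are placeholders.
import csv

import boto3
import psycopg2

BUCKET = "my-bucket"
KEY = "exports/data.csv"
LOCAL_PATH = "/tmp/data.csv"

# tS3Connection + tS3Get: fetch the object into a local file
boto3.client("s3").download_file(BUCKET, KEY, LOCAL_PATH)

# tFileInputDelimited: parse the local file; the header row supplies column names
with open(LOCAL_PATH, newline="") as f:
    rows = list(csv.DictReader(f, delimiter=","))

# tMap + a Postgres output: trivial transform, then load into the RDS instance
conn = psycopg2.connect(host="my-rds-endpoint.amazonaws.com",
                        dbname="mydb", user="myuser", password="secret")
with conn, conn.cursor() as cur:
    for row in rows:
        cur.execute(
            "INSERT INTO target_table (id, name) VALUES (%s, %s)",
            (row["id"], row["name"].strip()),   # example transform
        )
conn.close()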

 

6 Replies
Anonymous
Not applicable
Author

Please post some screenshots of your job / component configuration.

Anonymous
Not applicable
Author

Updated my post
manodwhb
Champion II

@SAI_Chief, what type of file is it?

manodwhb
Champion II

@SAI_Chief, have you defined the schema in tFileInputDelimited? I don't think you have.
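
If no columns are declared on the tFileInputDelimited schema, nothing meaningful reaches tMap, which would explain the empty rows. As a rough illustration of what "defining the schema" amounts to (in Python rather than Talend, with a hypothetical three-column layout and a placeholder file path):

# Rough illustration (not Talend): reading a delimited file against an
# explicitly declared schema. Column names and types are hypothetical.
import csv

SCHEMA = ["id", "name", "amount"]             # the columns you would declare on the component

with open("/tmp/data.csv", newline="") as f:
    reader = csv.reader(f, delimiter=",")
    next(reader)                              # skip the header row (Header = 1)
    for raw in reader:
        row = dict(zip(SCHEMA, raw))          # map positional fields to named columns
        row["amount"] = float(row["amount"])  # apply the declared type
        print(row)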

Anonymous
Not applicable
Author

It is a CSV file. Is there a way to define the schema without having to manually type in all of the column names, especially for CSV files with 40-50 columns?

manodwhb
Champion II

@SAI_Chief, if you're using the licensed Talend edition, you can try the Dynamic data type. If you're using Open Studio, you need to define the schema.
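
If you do end up defining the schema by hand in Open Studio, one way to avoid typing 40-50 names is to let the header row supply them; Studio's File delimited metadata wizard can also retrieve a column list from a sample file, if memory serves. A rough header-driven sketch in Python, with a placeholder path:

# Sketch: derive the column list from the CSV header instead of typing it by hand.
# The file path is a placeholder.
import csv

with open("/tmp/data.csv", newline="") as f:
    reader = csv.reader(f, delimiter=",")
    columns = next(reader)                    # header row becomes the column list
    print(len(columns), "columns detected:", columns)
    for raw in reader:
        record = dict(zip(columns, raw))      # every value keyed by its column name
        # ...transform / load as needed...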