vvazza10
Contributor III

Unloading from Snowflake to AWS S3 using Talend

Hello guys! I am trying to unload data from a Snowflake table to an AWS S3 external stage using the COPY command. I tried the tDBInput component and entered the COPY command in the Full SQL query string field, but the value seems to be NULL while running the job: tDBInput_1 null

Please advise how we can unload data from a Snowflake table and load it to an AWS S3 external stage.

3 Replies
Anonymous
Not applicable

Hello,

Are you able to read data from a Snowflake table into the data flow of your Job, based on your SQL query, by using the tSnowflakeInput component?

Best regards

Sabrina

vvazza10
Contributor III
Author

Yes, I am able to read the data. The execution fails when I use the Full SQL Query String:

"COPY INTO @my_ext_unload_stage/table.txt.gz
FROM table
single=true
max_file_size=4900000000;"


JohnRMK
Creator II

Hello,

tDBInput in Talend is a data read (input) component, so the query it executes must return data.

What you're trying to do is run a command that transfers data from Snowflake to Amazon S3; it doesn't return a result set (you're not actually reading any data locally).

You have to use the tDBRow component to execute your query instead, and it should work.

Don't forget the commit/rollback operation.
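Building on that, a minimal sketch of the tDBRow approach, using the stage and table names from the original post (in the Talend component the statement would be wrapped in a Java string in the Query field):

```sql
-- Runs in a tDBRow (Snowflake) component: the statement is executed
-- for its side effect and returns only a load status, not a result set.
-- @my_ext_unload_stage and "table" are the names from the original post.
COPY INTO @my_ext_unload_stage/table.txt.gz
FROM table
SINGLE = TRUE
MAX_FILE_SIZE = 4900000000;

-- Assumption: only needed if auto-commit is disabled on the connection;
-- otherwise handle commit/rollback with tDBCommit/tDBRollback in the job.
COMMIT;
```

With Snowflake's default unload format (gzip-compressed CSV), the `.txt.gz` file name in the stage path matches the output without an explicit FILE_FORMAT clause.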