Anonymous
Not applicable

S3 to Snowflake directly

Hello,

 

Is there a way to move files from S3 into Snowflake tables directly, without first copying the files to a local drive?

 

Regards,

Gopi

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

Hi @Gopik ,

 

Sorry for the delayed response; I totally missed this. Please see my job design and the configuration of the tSnowflakeBulkExec component. It works fine without any input flow.

 

[screenshot: job design]

 

I have used a tSnowflakeRow component (tDBRow_1 in the image) just to execute a "commit" statement after the bulk exec, as tSnowflakeConnection does not provide an auto-commit option.
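For reference, the statement that tSnowflakeRow component runs is simply:

```sql
-- Executed after the bulk load, because tSnowflakeConnection
-- offers no auto-commit option in this setup.
COMMIT;
```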

 

[screenshot: tSnowflakeRow commit configuration]

 

Please make sure the folder name you provide is the folder in the S3 bucket where you have placed the file to be loaded.

 

[screenshot: tSnowflakeBulkExec S3 settings]

In the advanced settings, change the Copy Command option to Manual if you want to specify a delimiter other than the default, or modify other properties such as GZIP compression.
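As a rough sketch, the manual copy command corresponds to a Snowflake COPY statement along these lines; the table name, bucket path, and credentials below are placeholders, not values from the job above:

```sql
-- Hypothetical manual copy command with a non-default delimiter
-- and gzip-compressed files; all names here are illustrative.
COPY INTO MY_SCHEMA.MY_TABLE
FROM 's3://my-bucket/my-folder/'
CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' COMPRESSION = GZIP);
```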

 

Please try configuring your component as above and let me know if you run into any issues.

 

Regards,

Pratheek Manjunath.


10 Replies
manodwhb
Champion II

Anonymous
Not applicable
Author

@manodwhb - I did go through that; however, that was a year back. I was wondering whether Talend had come up with something new in the meantime.

Anonymous
Not applicable
Author

Hi,

Talend version 7.1.1 does provide a bulk component with which you can load data directly from S3 into Snowflake. Please refer to the linked help document.

 

https://help.talend.com/reader/_LlTNckRRxzJvsZX2F88sA/Sjlijhp46B~z39MRgtUaVw

 

Regards,

Pratheek Manjunath

Anonymous
Not applicable
Author

@groupproductmanagement 

 

The bulk component uses the S3 stage to move on-premise data to Snowflake. However, in my case the data is already present in S3 and I need to move it from S3 to Snowflake.

 

Will this work in that scenario as well?

Anonymous
Not applicable
Author

Hi @Gopik ,

 

Yes, the tSnowflakeBulkExec component is meant for exactly that. If you already have a file in Amazon S3 and want to move that data into a Snowflake table, this component can be used. It mimics Snowflake's COPY command, so no on-premise files are involved.
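In other words, the component issues something like the following directly against Snowflake, so the data moves server-side from S3 to the table; the bucket, table, and credential values here are examples only:

```sql
-- Loads files already sitting in S3 straight into a Snowflake table;
-- nothing is staged on a local machine. Names are illustrative.
COPY INTO SALES_DB.PUBLIC.ORDERS
FROM 's3://example-bucket/orders/'
CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```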

manodwhb
Champion II

@Gopik , the tSnowflakeBulkExec component is available in the enterprise version, not in Open Studio.

Anonymous
Not applicable
Author

@groupproductmanagement 

 

I am going to try it out and will confirm. By the way, I know tSnowflakeBulkExec has the stage options. In the Job, do I not need any source component for the S3 bucket, and can I give the bucket and S3 information directly in tSnowflakeBulkExec?

Anonymous
Not applicable
Author

@groupproductmanagement 

 

The tSnowflakeBulkExec component is looking for an input. When I use a dummy input and configure the S3 details in tSnowflakeBulkExec, it throws an error and the files are not loaded into Snowflake.

 

Regards,

Gopi
