Hello,
Is there a way to load files from S3 into Snowflake tables directly, without first moving the files to a local drive?
Regards,
Gopi
@Gopik, I believe the link below will answer your question.
https://support.snowflake.net/s/question/0D50Z00007ZB4eOSAT/loading-data-from-s3-using-talend
@manodwhb - I did go through that; however, it was from a year back. I was checking whether Talend had come up with something new during that time.
Hi,
Talend 7.1.1 does provide a bulk component with which you can load data directly from S3 into Snowflake. Please refer to the help document linked below.
https://help.talend.com/reader/_LlTNckRRxzJvsZX2F88sA/Sjlijhp46B~z39MRgtUaVw
Regards,
Pratheek Manjunath
The bulk component uses the S3 stage to move on-premise data into Snowflake. In my case, however, the data is already present in S3 and I need to move it from S3 into Snowflake.
Will this work in that scenario as well?
Hi @Gopik ,
Yes, the tSnowflakeBulkExec component is meant for exactly that. If you already have a file in Amazon S3 and want to load that data into a Snowflake table, this component can be used. It mimics Snowflake's COPY command, so no on-premise files are involved.
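For reference, this is roughly the kind of statement the component issues under the hood — a minimal sketch only; the database, schema, table, bucket path, and credentials below are all placeholders, not values from this thread:

```sql
-- Hypothetical COPY from an external S3 location into a Snowflake table.
-- Replace the placeholders with your own bucket, folder, and credentials.
COPY INTO my_db.my_schema.target_table
FROM 's3://my-bucket/my-folder/'
CREDENTIALS = (AWS_KEY_ID = '<aws_key_id>' AWS_SECRET_KEY = '<aws_secret_key>')
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```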
@Gopik, note that the tSnowflakeBulkExec component is available in the Enterprise version only, not in Open Studio.
I am going to try it out and will confirm. By the way, I know tSnowflakeBulkExec has the stage options. In the Job, do I need a source component for the S3 bucket at all, or can I give the bucket and S3 information directly in tSnowflakeBulkExec?
The tSnowflakeBulkExec component is looking for an input. When I use a dummy input and configure the S3 details in tSnowflakeBulkExec, it throws an error and the files are not loaded into Snowflake.
Regards,
Gopi
Hi @Gopik ,
Sorry for the delayed response, I totally missed this. Please see my job design and the configuration of the tSnowflakeBulkExec component. It works fine without any input.
I used a tSnowflakeRow component (tDBRow_1 in the image) just to execute a COMMIT statement after the bulk exec, since tSnowflakeConnection does not provide an auto-commit option.
Please make sure the folder name you provide is the folder in the S3 bucket where you placed the file to be loaded.
In the Advanced settings, change the Copy Command option to Manual if you want to specify a delimiter other than the default, or to modify other properties such as GZIP compression.
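With the manual Copy Command option, the statement you supply might look like the sketch below, followed by the explicit COMMIT mentioned above — again, table name, bucket path, and credentials are hypothetical placeholders:

```sql
-- Hypothetical manual COPY with a non-default delimiter and gzip-compressed files.
COPY INTO my_db.my_schema.target_table
FROM 's3://my-bucket/my-folder/'
CREDENTIALS = (AWS_KEY_ID = '<aws_key_id>' AWS_SECRET_KEY = '<aws_secret_key>')
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' COMPRESSION = GZIP);

-- Explicit commit, since the connection does not auto-commit.
COMMIT;
```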
Please try configuring your component as above and let me know if you run into any issues.
Regards,
Pratheek Manjunath.