Hello,
I am trying to perform a bulk load from a Snowflake external stage into a Snowflake table.
The external stage points to files in an S3 bucket.
Attaching the Talend flow with the component properties.
Note: "S3_BUCK_STAGE" is the external stage that was created.
While running the job through Talend Studio, I get a "Column not found" error. A screenshot is attached.
More information about code:
1) The S3 file structure and the Snowflake table structure, including the column names, are the same.
2) The CSV file is staged from S3 in Snowflake.
3) All the column data types in the table are String.
Could you please help me with this issue?
Please let me know if you need any more details.
Thank you
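For context, here is a minimal sketch of the setup described above, written as Snowflake DDL. Only the stage name S3_BUCK_STAGE comes from the post; the table name, column names, S3 URL, and file-format values are assumptions for illustration.

-- Assumed target table: all columns are strings, named exactly like the CSV columns
CREATE OR REPLACE TABLE WORLD_DATA (
    COUNTRY_CODE VARCHAR,
    COUNTRY_NAME VARCHAR,
    LOAD_DATE    VARCHAR
);

-- External stage pointing at the S3 bucket (credentials / storage integration omitted)
CREATE OR REPLACE STAGE S3_BUCK_STAGE
    URL = 's3://my-bucket/some/path/'
    FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' COMPRESSION = GZIP);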
Hi,
Could you please double-check whether all the column names have been given in capital letters?
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved
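To illustrate why the case of the column names matters: Snowflake folds unquoted identifiers to upper case, while quoted identifiers stay case-sensitive, so a quoted lower-case name in generated SQL will not match an upper-case table column and can surface as a "column not found" style error. A small example with hypothetical names:

-- Unquoted identifiers are folded to upper case, so this resolves to COUNTRY_NAME
CREATE OR REPLACE TABLE DEMO_T (COUNTRY_NAME VARCHAR);
SELECT country_name FROM DEMO_T;     -- works

-- A quoted lower-case identifier is a different, case-sensitive name
SELECT "country_name" FROM DEMO_T;   -- fails with an invalid identifier error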
Hello,
Yes, the column names are in capital letters in both the file and the table.
The order of the columns is also the same.
Hi,
Could you please try to load with the default COPY command options and see whether it works? The issue could be that the separator symbol in the file is not being recognized.
Could you also try to load some dummy data using the tDBOutput component, just to make sure that the target column details in the schema are correct?
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved
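As an aside on the separator point above: the FILE_FORMAT clause of the COPY command is where the delimiter, quoting, and header handling are declared, and mismatches there are a common cause of load errors. A hedged example of a COPY with the options spelled out explicitly (the table name, stage name, and option values are assumptions):

COPY INTO WORLD_DATA
FROM @S3_BUCK_STAGE
FILE_FORMAT = (TYPE = CSV
               FIELD_DELIMITER = ','
               FIELD_OPTIONALLY_ENCLOSED_BY = '"'
               SKIP_HEADER = 1
               COMPRESSION = GZIP)
ON_ERROR = 'CONTINUE';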
Hello,
Thank you for your inputs.
I tried a one-to-one load (50 rows) from the S3 file to the Snowflake table and it worked fine, without any errors.
Regarding the bulk load using Talend, I observed the COPY command executed in the Snowflake history as:
"copy into "WORLD_DATE_20190808_005100_716" from '@~/@s3_buck_stage/' ON_ERROR='continue' FILE_FORMAT=(type=csv field_delimiter=',' compression=gzip)"
My requirement is to load data from an external stage created in Snowflake (which points to an AWS S3 bucket) into a Snowflake table.
But the Storage option in the tDBBulkExec component has only the "Internal" option.
How can we bulk load data from the external stage (where the AWS S3 files are staged) into the Snowflake table?
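One thing worth noting about the captured statement: '@~/...' refers to the current user's internal stage, not the named external stage, which may explain why the files in S3_BUCK_STAGE are never read. A hand-written COPY against the external stage would look roughly like this (the table name is taken from the history entry above; the file-format values are assumptions):

COPY INTO "WORLD_DATE_20190808_005100_716"
FROM @S3_BUCK_STAGE
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' COMPRESSION = GZIP)
ON_ERROR = 'CONTINUE';

If the bulk component only exposes internal storage, one possible workaround is to execute a statement like this through a component that runs arbitrary SQL (for example tDBRow), at the cost of bypassing the component's built-in option handling.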
Hi,
Could you please try the option below?
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved
Hello,
Let me try it and get back to you with the queries Snowflake fires in the background.
Thanks
Regards,
Rohit
Hi Rohit,
Once you get the details, please update the forum so that it will help others in our Talend community.
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved
Hi Nikhil,
Definitely.
Regards,
Rohit