Hi All,
I am using Talend bulk execution to load data from S3 to Snowflake. The Talend tFileArchive component compresses the file to gzip format (file.csv.gz) and uploads it to an S3 bucket. The COPY INTO statement that gets executed through the Talend bulk component looks like the one below. It does not throw any error, but it does not load data either. If I load the CSV file without compression, it works fine.
File: file.csv.gz
COPY INTO table
FROM 's3://bucket/'
CREDENTIALS=(aws_key_id='' aws_secret_key='')
FILE_FORMAT=(type=csv compression=gzip field_delimiter=',' skip_header=1 field_optionally_enclosed_by='"' empty_field_as_null=true)
FORCE=true
Can someone point out where the issue is? Even if I execute the command above through the Snowflake UI, it says it ran successfully but loads nothing, and the file does have data.
If I put the plain CSV file in the same S3 bucket location, it picks it up and loads it without an issue. It also picks up the csv.gz file but loads no data; the load history shows row_parsed = 0.
Thank you
After researching, we found that the gzip files are empty. I am using the tFileArchive component to gzip the file.
The options are:
Source File: "file.csv"
Destination File: "file.csv.gz"
Archive Method: gzip
Compression Level: normal
Can someone tell me why the file is being gzipped blank?
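One quick way to confirm whether the archive really contains only the header is to open it directly and count the data rows. A minimal Python sketch, assuming the file names from the tFileArchive options above (the sample CSV content here is made up for illustration; it reproduces the "header-only archive" symptom):

```python
import gzip

# Hypothetical file names matching the tFileArchive settings above.
src = "file.csv"
dst = "file.csv.gz"

# Write a sample CSV so the check is self-contained.
with open(src, "w") as f:
    f.write("id,name\n1,alice\n2,bob\n")

# Simulate the reported symptom: an archive containing only the header row.
with gzip.open(dst, "wt") as gz:
    gz.write("id,name\n")

# Diagnostic: count rows after the header inside the archive.
with gzip.open(dst, "rt") as gz:
    rows = gz.read().splitlines()
data_rows = len(rows) - 1 if rows else 0
print(data_rows)  # 0 here, which matches row_parsed = 0 with skip_header=1
```

If this prints 0 for the real file, the problem is in the step that produces the archive, not in the COPY INTO statement.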
Hello,
Is there any error message printed on the console? We would appreciate it a lot if you could post screenshots of your job design here.
Best regards
Sabrina
Sorry for the late reply. There was an issue with the job logic: it was creating the gzip file with only the header row, which is why no data was loaded. Thank you for getting back to me.
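For anyone hitting the same issue, the fix boils down to making sure the whole source file, not just the header, goes into the archive. A minimal Python sketch of a correct gzip step (the file names and sample content are assumptions for illustration; in the real job this is what the tFileArchive step should produce):

```python
import gzip
import shutil

# Hypothetical file names matching the component options above.
src = "file.csv"
dst = "file.csv.gz"

# Sample CSV with a header plus data rows (stand-in for the real export).
with open(src, "w") as f:
    f.write("id,name\n1,alice\n2,bob\n")

# Compress the whole file, header and data rows alike.
with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

# Verify the archive round-trips with all rows intact.
with gzip.open(dst, "rt") as gz:
    assert gz.read() == "id,name\n1,alice\n2,bob\n"
```

An archive built this way loads normally with `compression=gzip` and `skip_header=1` in the COPY INTO file format.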