I created a replication task to replicate data from a source Oracle DB to Snowflake as part of my initial load. For some tables, the load failed after loading a portion of the table's data, with the error below:
Handling End of table 'schemaname'.'Tablename' loading failed by subtask 8 thread 1
Failed to put file '/opt/attunity/replicate/data/tasks/orcl_conn/cloud/94/LOAD00000005.csv', size 1559884194
Failed to send file /opt/attunity/replicate/data/tasks/orcl_conn/cloud/94/LOAD00000005.csv to Snowflake stage
RetCode: SQL_ERROR SqlState: HY000 NativeError: 40 Message: [Snowflake][Snowflake] (40)
Error encountered when executing file transfer: Failed to upload file /opt/attunity/replicate/data/tasks/orcl_conn/cloud/94/LOAD00000005.csv.
Failed (retcode -1) to execute statement: 'PUT 'file:///opt/attunity/replicate/data/tasks/orcl_conn/cloud/94/LOAD00000005.csv' @"snowflakedb"."PUBLIC"."ATTREP_IS_ABC_3c11ce5e_b751_4e72_833d_e64dae8e0389"/94/ AUTO_COMPRESS = TRUE SOURCE_COMPRESSION = NONE ;'
I'm trying to understand why this failed and how to resolve it.
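As a quick sanity check, the byte count in the log line above works out to roughly 1.45 GiB for a single staged CSV, which is a fairly large file for one PUT. (Replicate's Snowflake target endpoint has a max-file-size setting that controls how large these staged files get; check your version's endpoint settings for the exact name.) A minimal sketch of the arithmetic:

```python
# Size of the staged CSV that failed to upload, taken from the log line:
# "Failed to put file '...LOAD00000005.csv', size 1559884194"
size_bytes = 1_559_884_194

# Convert bytes to GiB (1 GiB = 1024**3 bytes)
size_gib = size_bytes / 1024**3
print(f"{size_gib:.2f} GiB")  # roughly 1.45 GiB
```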
You need to be on the latest Snowflake ODBC driver, 2.25.xx. If you still have the problem, it's best to open a support ticket to isolate whether it is a Replicate issue or a Snowflake ODBC issue.
Are you using the parallel load feature? If so, can you try without it? I found a case where the issue was caused by the table partition segment.
Thanks,
Dana
@Dana_Baldwin yes, we've experienced this while using parallel load configured to use both partitions and data ranges. Since the table has 1.7B records, I don't think we can just do a straight load, as that may take days.
@Steve_Nguyen all the documentation says to use 2.24.x. What changed in 2.25.x that resolves the issue?
Correct, the documentation shows 2.24, but we do sometimes see issues with 2.24, so it's best to be on 2.25.xx or newer.
For anyone who finds this, we were able to resolve this issue by changing a couple of parameters.
After making these changes, we were able to load 1.7B records (almost 2TB of data) into Snowflake in 2h30m.
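For context on that result, a quick back-of-the-envelope throughput calculation using the figures from the post (treating "almost 2TB" as 2 TB for the estimate):

```python
records = 1.7e9                    # 1.7B records
data_tb = 2.0                      # "almost 2TB", rounded to 2 TB
duration_s = 2 * 3600 + 30 * 60    # 2h30m in seconds

# Effective load rates implied by the reported numbers
print(f"{records / duration_s:,.0f} records/s")     # ~189,000 records/s
print(f"{data_tb / (duration_s / 3600):.1f} TB/h")  # 0.8 TB/h
```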
Hello @AutomatedUser ,
Thank you so much for sharing your experience! This is very valuable information for all users!
Best Regards,
John.
Good day! Can you open a support case so we can have you enable logging on the endpoint and collect the repsrv.log file with server-side logging enabled, to help determine the issue? Also, if you set up the Snowflake target connection in the ODBC Manager and can connect, that is a good test to confirm the ODBC layer is working. Please let us know, and as per the below, ensure you are using the 64-bit ODBC Snowflake driver.
Note: If this still has an issue please report a new Support case.
Run the ODBC 64-bit Manager as Administrator.
Test the connection with your credentials and, if required, make sure you are on your VPN.
The Snowflake Endpoint should match:
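For reference, on a Linux Replicate server a 64-bit Snowflake DSN typically looks something like the sketch below (the DSN name, driver path, account, and warehouse are placeholders; adjust them to your install):

```ini
; Hypothetical 64-bit Snowflake DSN in odbc.ini -- values are placeholders.
[snowflake_target]
Driver    = /usr/lib64/snowflake/odbc/lib/libSnowflake.so
Server    = <account>.snowflakecomputing.com
Database  = snowflakedb
Schema    = PUBLIC
Warehouse = <warehouse_name>
```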
Thanks!
Bill
@Bill_Steinagle I've already gone through this with Qlik support; the case number was 00062390. There are no issues connecting to Snowflake from the Qlik server, and we are using the 64-bit ODBC Snowflake driver, v2.24.0.