AKushnapalli
Contributor

How to load 500+ Million Records from SQL Server to Snowflake

Hi,

How can I load 500+ million records from SQL Server to Snowflake using Talend?

I'm currently using the tSnowflakeOutputBulkExec component to stage the data to local files, but the job gets stuck writing the files after fetching 14+ million records from SQL Server.

Please see the attached screenshots:

  1. Talend Job Design, including the Basic settings of tDBOutputBulkExec

  2. tDBOutputBulkExec Advanced settings

Any help is greatly appreciated. 

Thank you.

Anil

1 Reply
David_Beaty
Specialist

Hi,

 

I would find some way of splitting the incoming data and loading it in manageable chunks, using some kind of partitioning key (a date column is a good example). This would also let you build some recoverability into the process if, as you've already found, the job hangs part way through.
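As a rough illustration of the idea (not Talend-specific), the sketch below iterates over monthly date ranges and checkpoints each completed chunk so a failed run can resume where it left off. The `load_chunk` callback and the `done` checkpoint set are hypothetical stand-ins for the real pieces: in a Talend job, the equivalent would be a tLoop/tFlowToIterate driving the date range into the tDBInput query's WHERE clause, with the checkpoint kept in a control table or file.

```python
from datetime import date

def month_ranges(start, end):
    """Yield (chunk_start, chunk_end) pairs covering [start, end) month by month."""
    cur = start
    while cur < end:
        # Advance to the first day of the next month.
        nxt = date(cur.year + 1, 1, 1) if cur.month == 12 else date(cur.year, cur.month + 1, 1)
        yield cur, min(nxt, end)
        cur = nxt

def load_in_chunks(start, end, load_chunk, done):
    """Load each monthly chunk, skipping those already recorded in `done`
    (a set persisted between runs, e.g. a checkpoint file or control table)."""
    for lo, hi in month_ranges(start, end):
        if lo in done:
            continue  # already loaded on a previous run: safe to resume past it
        # In the real job this would run something like:
        #   SELECT ... FROM src WHERE dt >= lo AND dt < hi
        # and bulk-load the result into Snowflake.
        load_chunk(lo, hi)
        done.add(lo)  # record the checkpoint only after the chunk succeeds
```

The key design point is that the checkpoint is written only after a chunk loads successfully, so a hang or crash mid-chunk means at worst re-running that one chunk, not the whole 500M-row extract.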

 

Thanks

 

David

 

If you find these answers helpful, don't forget to Like and/or set as the answer