Anonymous
Not applicable

tSnowflakeInput

Hello all

 

I am extracting data from Snowflake; there are 150K rows to extract and load into an Oracle DB, but the job takes nearly three hours to complete. Can anybody suggest an optimized approach to improve the speed?

 

Thanks In Advance 

Manish

2 Replies
Anonymous
Not applicable
Author

Hi Manish,

 

Are you facing the slowness while extracting data from Snowflake or while writing the data to Oracle?

 

Could you please write the data to a file instead of the Oracle DB and check the speed? That will show which side of the job is the bottleneck.
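For reference, a quick way to run that test outside Talend is with plain JDBC, reading from Snowflake and writing straight to a local CSV. This is only a minimal sketch: the account URL, credentials, and table name below are placeholders, and it does no CSV quoting since it is just a timing test.

// Sketch: time the raw Snowflake read by writing rows to a local CSV,
// bypassing Oracle entirely. Requires the Snowflake JDBC driver on the
// classpath; URL, credentials, and query are placeholders.
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SnowflakeReadTest {
    public static void main(String[] args) throws Exception {
        long start = System.currentTimeMillis();
        try (Connection con = DriverManager.getConnection(
                 "jdbc:snowflake://<account>.snowflakecomputing.com/",
                 "<user>", "<password>");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM <your_table>");
             FileWriter out = new FileWriter("extract_test.csv")) {
            int cols = rs.getMetaData().getColumnCount();
            long rows = 0;
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) line.append(',');
                    line.append(rs.getString(i)); // no quoting: timing test only
                }
                out.write(line.append('\n').toString());
                rows++;
            }
            System.out.println(rows + " rows in "
                + (System.currentTimeMillis() - start) + " ms");
        }
    }
}

If the file finishes in a few minutes, the extraction side is fine and the Oracle write is where to look.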

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving kudos for sharing their time on your query. If your query is answered, please mark the topic as resolved.

iamabhishek
Creator III

The native Snowflake components don't provide many options for tuning read and write behavior. You could experiment with the generic JDBC components and measure the performance.
tJDBCInput - this gives you the option to set the cursor size. The tJDBCRow component can be used to run specific SQL statements against a JDBC Snowflake connection.
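To illustrate what that setting controls, here is a minimal plain-JDBC sketch. The cursor size likely corresponds to the JDBC fetch size, i.e. how many rows the driver pulls per round trip; the connection details are placeholders, and 10000 is just a value to experiment with.

// Sketch: setting the JDBC fetch size, which the cursor size option in
// tJDBCInput likely maps to. Connection details are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeExample {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:snowflake://<account>.snowflakecomputing.com/",
                 "<user>", "<password>");
             Statement stmt = con.createStatement()) {
            stmt.setFetchSize(10000); // hint: fetch ~10k rows per round trip
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM <your_table>")) {
                while (rs.next()) {
                    // hand each row to the next stage, e.g. an Oracle batch insert
                }
            }
        }
    }
}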
Or, if you are on a newer version of Talend (7.x), you could use the bulk-load feature via tSnowflakeOutputBulk, which writes the data as a file to internal Snowflake storage or to other storage, including S3 and Azure.
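For context, here is a rough sketch of the staging pattern that Snowflake bulk loading is built on, issued over a regular JDBC connection: a local file is uploaded to a stage with PUT and then loaded with COPY INTO. The file path, stage, and table names are placeholders, and this shows the general workflow rather than what tSnowflakeOutputBulk does internally.

// Sketch: Snowflake bulk-load staging pattern over JDBC. A local file
// is uploaded to the table's internal stage with PUT, then COPY INTO
// bulk-loads it. All names and paths are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class BulkStageExample {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:snowflake://<account>.snowflakecomputing.com/",
                 "<user>", "<password>");
             Statement stmt = con.createStatement()) {
            // Upload the local file to the table's internal stage.
            stmt.execute("PUT file:///tmp/data.csv @%<your_table>");
            // Bulk-load the staged file into the table.
            stmt.execute("COPY INTO <your_table> FROM @%<your_table>"
                + " FILE_FORMAT = (TYPE = CSV)");
        }
    }
}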