Hello all,
I am extracting data from Snowflake; there are 150K rows to extract and load into an Oracle DB, but the job takes nearly 3 hours to complete. Can anybody suggest an optimized solution to improve the speed?
Thanks in advance,
Manish
Hi Manish,
Are you facing the issue while extracting data from Snowflake or while writing data to Oracle?
Could you write the data to a file instead of the Oracle DB and check the speed? That would tell you which side of the job is the bottleneck.
Warm Regards,
Nikhil Thampi
The native Snowflake components don't provide many options to tune reading and writing. You could try the generic JDBC components instead and measure the performance.
tJDBCInput gives you the option to set the cursor size, and tJDBCRow can be used to run specific SQL statements against a JDBC Snowflake connection.
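Outside of Talend, the mechanics behind those two knobs look roughly like the plain-JDBC sketch below. The table and column names are placeholders and the connection setup is omitted; `setFetchSize` is the read-side cursor size that tJDBCInput exposes, and `addBatch`/`executeBatch` with manual commits is the write-side batching that usually matters most for Oracle insert speed:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class SnowflakeToOracle {

    // Build a parameterized INSERT for the target table (names are placeholders).
    static String buildInsert(String table, String[] cols) {
        String params = "?" + ", ?".repeat(cols.length - 1);
        return "INSERT INTO " + table + " (" + String.join(", ", cols)
                + ") VALUES (" + params + ")";
    }

    // Stream rows from Snowflake and write them to Oracle in batches.
    static void copy(Connection sf, Connection ora) throws SQLException {
        ora.setAutoCommit(false); // commit once per batch, not once per row
        try (Statement read = sf.createStatement();
             PreparedStatement write = ora.prepareStatement(
                     buildInsert("TARGET_TABLE", new String[] {"ID", "NAME"}))) {
            read.setFetchSize(10_000); // rows pulled per round trip from Snowflake
            try (ResultSet rs = read.executeQuery(
                    "SELECT ID, NAME FROM SOURCE_TABLE")) {
                int pending = 0;
                while (rs.next()) {
                    write.setLong(1, rs.getLong(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();
                    if (++pending % 5_000 == 0) { // flush every 5K rows
                        write.executeBatch();
                        ora.commit();
                    }
                }
                write.executeBatch(); // flush the final partial batch
                ora.commit();
            }
        }
    }
}
```

Without batching, the Oracle driver does one network round trip per row, which is the classic cause of multi-hour loads for what should be a minutes-long 150K-row transfer.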
Or, if you are on a higher version of Talend (7.x), you could use the bulk-load feature via tSnowflakeOutputBulk, which writes a data file to internal Snowflake storage or to external storage such as S3 or Azure.
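For reference, the server-side command this file-based approach relies on is Snowflake's COPY INTO. A hand-written sketch of unloading the source table to stage files (stage, table, and path names are placeholders) looks like:

```sql
-- Unload the source table to compressed CSV files in an internal stage
COPY INTO @my_stage/export/
FROM source_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
MAX_FILE_SIZE = 104857600;  -- split into ~100 MB files

-- Download the files from the stage (run from SnowSQL)
GET @my_stage/export/ file:///tmp/export/;
```

The downloaded files can then be loaded into Oracle with a bulk tool such as SQL*Loader, which is far faster than row-by-row inserts.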