rommy
Contributor III

How to share S3 and Snowflake connections between parent and child jobs?

Hi All,

I know we can register and share a DB connection between parent and child jobs, but I want to share S3 and Snowflake connections in the same way, and I am not seeing options to register and share them. Is there another way to do this?

Thank you !

6 Replies
JGatti1595849770
Contributor

Check out the detailed guide here: Talend Documentation

Hope this helps!

rommy
Contributor III
Author

Thank you for your response. I referred to this guide earlier and it worked for other RDBMS engines. I am looking for guidance on S3 and Snowflake, since I do not see a "Register and share DB Connection" option there.

Has anyone done this in the past?

Thank you!

JohnRMK
Creator II

Hello,

 

This is normal for Snowflake!

It reproduces the same pattern as worksheets in the Snowflake UI: there is a session ID for each request group, and it cannot be shared.

However, you can still use the JDBC connection instead of the native connector; in that case it is possible to register and share the connection.
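For illustration, here is a minimal sketch of what that JDBC route looks like in plain Java, using the Snowflake JDBC driver (the account, credentials, warehouse, and database names below are all placeholders); in a Talend job the equivalent connection would live in a tJDBCConnection component that child jobs reuse:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SnowflakeJdbcSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "MY_USER");          // placeholder
        props.put("password", "MY_PASSWORD");  // placeholder
        props.put("warehouse", "MY_WH");       // placeholder
        props.put("db", "MY_DB");              // placeholder
        props.put("schema", "PUBLIC");

        // Standard Snowflake JDBC URL; the driver registers itself with DriverManager
        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com/";

        // A single Connection object is what gets shared; the native
        // components open session-scoped connections that cannot be.
        try (Connection conn = DriverManager.getConnection(url, props)) {
            conn.createStatement().execute("SELECT CURRENT_TIMESTAMP");
        }
    }
}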

 

rommy
Contributor III
Author

Thank you for your response @Belaid Momo​. If I use the native Snowflake connection, does it make sense to have the open- and close-connection components as part of pre- and post-jobs, per your logic?

What about S3? S3 has a close component as well.

JohnRMK
Creator II

Hello,

 

Yes, it is quite logical to have a pre-job (open the connections, and don't forget to resume the data warehouse) and a post-job to close the connection and/or commit, and above all to suspend your warehouse so that it does not keep consuming resources.

 

In child jobs, you can do the same thing without touching the warehouse.
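As a sketch of that pre-job / post-job pattern (the warehouse name MY_WH is a placeholder; the ALTER WAREHOUSE statements are standard Snowflake warehouse commands):

import java.sql.Connection;
import java.sql.Statement;

public class WarehouseLifecycleSketch {

    // Pre-job: resume the warehouse so the main job has compute available
    static void preJob(Connection conn) throws Exception {
        conn.setAutoCommit(false);  // so the post-job commit has an effect
        try (Statement st = conn.createStatement()) {
            st.execute("ALTER WAREHOUSE MY_WH RESUME IF SUSPENDED");
        }
    }

    // Post-job: commit, suspend the warehouse to stop consuming credits,
    // then close the connection. Child jobs reuse conn and skip both steps.
    static void postJob(Connection conn) throws Exception {
        conn.commit();
        try (Statement st = conn.createStatement()) {
            st.execute("ALTER WAREHOUSE MY_WH SUSPEND");
        }
        conn.close();
    }
}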

 

For S3, I admit that I have not really worked with it. But note that there is a well-documented Java SDK that you can integrate as Java routines if the components turn out to be insufficient.
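For example, a minimal upload sketch with the AWS SDK for Java v2 (the bucket, key, and file path are hypothetical; credentials are resolved by the SDK's default provider chain):

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import java.nio.file.Paths;

public class S3UploadSketch {
    public static void main(String[] args) {
        // Credentials come from the default chain (env vars, profile, role)
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            PutObjectRequest req = PutObjectRequest.builder()
                    .bucket("my-bucket")        // hypothetical bucket
                    .key("exports/data.csv")    // hypothetical key
                    .build();
            s3.putObject(req, RequestBody.fromFile(Paths.get("/tmp/data.csv")));
        }
    }
}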

 

You can even use SnowSQL or scripts to insert or upload files, or run Python from a PowerShell script and call it with tSystem.
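For instance, what a tSystem call to SnowSQL boils down to, sketched here with ProcessBuilder (the connection name my_conn is a placeholder defined in ~/.snowsql/config, and @MY_STAGE is a hypothetical Snowflake stage):

public class SnowSqlCallSketch {
    public static void main(String[] args) throws Exception {
        // Same effect as a tSystem component running SnowSQL from the shell;
        // PUT uploads a local file to a Snowflake stage.
        ProcessBuilder pb = new ProcessBuilder(
                "snowsql", "-c", "my_conn",
                "-q", "PUT file:///tmp/data.csv @MY_STAGE");
        pb.inheritIO();  // stream SnowSQL output to the job console
        int exit = pb.start().waitFor();
        if (exit != 0) {
            throw new RuntimeException("snowsql exited with code " + exit);
        }
    }
}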

rommy
Contributor III
Author

Thank you again @Belaid Momo​. I have many jobs calling each other, and all of them have Snowflake connections. I guess I will have to use open and close components in each job; I won't be able to share the connection like with other RDBMSs.