Anonymous
Not applicable

How to load data from Oracle DB tables into Amazon S3 using Talend?

Hi Experts,
I am new to the Talend ETL tool.
I have a requirement to load data from Oracle DB tables into Amazon S3 using the Talend DI tool.
I have seen that files can be loaded into Amazon S3, but how do I load data from Oracle DB tables into Amazon S3 with the Talend ETL tool?
Can anyone advise on the above?
Appreciate your help!

Thanks & Regards,
Sudheer Reddy
20 Replies
Anonymous
Not applicable
Author

Hi,
For your requirement, the workflow should be: tOracleInput --> tFileOutputDelimited --> OnComponentOk --> tS3Put.
In tS3Put, set the file path to the variable ((String)globalMap.get("tFileOutputDelimited_1_FILE_NAME")); it points to the file that holds the Oracle DB table data and will be uploaded to the S3 server.
See my screenshot.
Best regards
Sabrina
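For readers who want to see what that flow amounts to outside Studio, here is a minimal plain-Java sketch of the same three steps: query Oracle over JDBC, write a delimited file, upload it to S3. The JDBC URL, credentials, query, bucket, and key are placeholder assumptions, and it uses the AWS SDK for Java v1 (com.amazonaws:aws-java-sdk-s3); Talend's generated code is more elaborate, so treat this only as an illustration of the data flow.

```java
import java.io.File;
import java.io.PrintWriter;
import java.sql.*;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class OracleToS3 {
    public static void main(String[] args) throws Exception {
        // Step 1 (tOracleInput): read rows over JDBC (placeholder URL/credentials)
        File out = new File("out.csv");
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM my_table");
             PrintWriter w = new PrintWriter(out, "UTF-8")) {
            // Step 2 (tFileOutputDelimited): dump the result set as delimited rows
            while (rs.next()) {
                w.println(rs.getString("id") + ";" + rs.getString("name"));
            }
        }
        // Step 3 (tS3Put): upload the file; bucket and key are placeholders,
        // credentials come from the SDK's default provider chain
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        s3.putObject("my-bucket", "exports/out.csv", out);
    }
}
```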
Anonymous
Not applicable
Author

Hi,
Thanks for the reply.
Can you please let me know what ((String)globalMap.get("tFileOutputDelimited_1_FILE_NAME")) means?
And do I use this in the File name option instead of the full path of the file?

Thanks & Regards,
Sudheer Reddy
Anonymous
Not applicable
Author

Hi,
((String)globalMap.get("tFileOutputDelimited_1_FILE_NAME")) is an existing global variable that retains the file name/path from the tFileOutputDelimited component (in this case, something like D:/Talend-Studio-r104014-V5.3.1/workspace/out.csv).
Please see the related documentation: TalendHelpCenter:How to centralize contexts and variables.
Best regards
Sabrina
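Conceptually, globalMap in a generated Talend job is just a shared java.util.Map<String, Object> that components write to and read from. A rough sketch of the idea (the actual generated code differs in detail):

```java
import java.util.HashMap;
import java.util.Map;

public class GlobalMapSketch {
    public static void main(String[] args) {
        // Shared map that every component in the job can see
        Map<String, Object> globalMap = new HashMap<>();

        // tFileOutputDelimited_1 publishes the path of the file it wrote
        globalMap.put("tFileOutputDelimited_1_FILE_NAME",
                "D:/Talend-Studio-r104014-V5.3.1/workspace/out.csv");

        // tS3Put reads it back; the map stores Objects, which is why the
        // expression carries the explicit (String) cast
        String path = (String) globalMap.get("tFileOutputDelimited_1_FILE_NAME");
        System.out.println("Uploading " + path);
    }
}
```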
Anonymous
Not applicable
Author

Hi xdshi,
I'm new to Talend and I would like to query a big Oracle table by partition name, using a variable.
How can I do that?
Thanks in advance.
Anonymous
Not applicable
Author

Hi dabdougadry,
Please open a new thread for your topic.
It would be far more efficient to open a new thread, because people with the same issue could then find it more easily.
Best regards
Sabrina
Anonymous
Not applicable
Author

Hi,

I want to fetch data from a MySQL table and insert it into a Vertica database, with the following conditions:
* the job should be scheduled at a certain interval
* the same data must not be loaded twice (no duplication)
* how do I keep track of the last loaded data?
Anonymous
Not applicable
Author

Hi,
the job should be scheduled at certain interval

The scheduler tool is only available in the Talend Enterprise subscription products. With Talend Open Studio for Data Integration, you can export the job script and then schedule it with a third-party scheduler tool, such as the crontab command on Linux or the Task Scheduler on Windows.
the duplication of data should not happen (loading same data twice)

Check the Key box of the corresponding columns in the schema of tVerticaOutput, and select the 'Insert or update' option in the 'Action on data' list; the job will then insert new records when the key does not exist in the target table, and otherwise update the existing record.
how to keep track of last loaded data

Please have a look at the component TalendHelpCenter:tMysqlLastInsertId. A plain-JDBC sketch of this incremental pattern is shown after this post.
Best regards
Sabrina
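Here is the promised sketch of the incremental-load pattern in plain JDBC: keep a high-water mark, fetch only newer rows, then advance the mark. The load_state table, column names, and connection strings are all hypothetical, and in a real Talend job the same logic would live in components and context variables.

```java
import java.sql.*;

public class IncrementalLoadSketch {
    public static void main(String[] args) throws Exception {
        try (Connection mysql = DriverManager.getConnection(
                 "jdbc:mysql://mysqlhost:3306/src", "user", "password");
             Connection vertica = DriverManager.getConnection(
                 "jdbc:vertica://verticahost:5433/dwh", "user", "password")) {

            // 1. Read the high-water mark saved by the previous run
            //    (assumes a row for job 'orders' already exists in load_state)
            long lastId = 0;
            try (Statement st = vertica.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT last_id FROM load_state WHERE job = 'orders'")) {
                if (rs.next()) lastId = rs.getLong(1);
            }

            // 2. Fetch only rows newer than the mark, so the same data
            //    is never loaded twice across scheduled runs
            long maxId = lastId;
            try (PreparedStatement sel = mysql.prepareStatement(
                     "SELECT id, amount FROM orders WHERE id > ?");
                 PreparedStatement ins = vertica.prepareStatement(
                     "INSERT INTO orders (id, amount) VALUES (?, ?)")) {
                sel.setLong(1, lastId);
                try (ResultSet rs = sel.executeQuery()) {
                    while (rs.next()) {
                        ins.setLong(1, rs.getLong("id"));
                        ins.setBigDecimal(2, rs.getBigDecimal("amount"));
                        ins.addBatch();
                        maxId = Math.max(maxId, rs.getLong("id"));
                    }
                }
                ins.executeBatch();
            }

            // 3. Persist the new high-water mark for the next scheduled run
            try (PreparedStatement upd = vertica.prepareStatement(
                     "UPDATE load_state SET last_id = ? WHERE job = 'orders'")) {
                upd.setLong(1, maxId);
                upd.executeUpdate();
            }
        }
    }
}
```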
Anonymous
Not applicable
Author

Hi,
Regarding the above topic, is DB to DB faster, or does DB to file to DB work better?
Anonymous
Not applicable
Author

Hi,
Regarding the above topic, is DB to DB faster, or does DB to file to DB work better?

Usually, it depends on your situation.
The following aspects can affect job performance:
1. The volume of data: reading a large data set degrades performance.
2. The structure of the data: if there are many columns on t<DB>Input, transferring the data during job execution consumes a lot of memory and time.
3. The database connection: the job usually runs better if the database is installed locally; if the database is on another machine, even over VPN, you may run into congestion and latency issues.

Best regards
Sabrina
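One common knob for point 1 is the JDBC fetch size, which streams a large result set in chunks instead of buffering it all at once. A minimal sketch, assuming an Oracle JDBC driver and a placeholder connection (Talend exposes a similar cursor/fetch-size setting on the input components, though the exact option name varies by component):

```java
import java.sql.*;

public class FetchSizeSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             Statement st = con.createStatement()) {
            // Stream rows in chunks of 1000 instead of the driver default,
            // so a large table does not have to fit in memory at once
            st.setFetchSize(1000);
            try (ResultSet rs = st.executeQuery("SELECT * FROM big_table")) {
                long rows = 0;
                while (rs.next()) rows++;
                System.out.println("Read " + rows + " rows");
            }
        }
    }
}
```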