Anonymous

Copy data from Oracle to ParAccel databases

hi all,
I am trying to see if Talend can help me move data from Oracle to ParAccel (Postgres-based).
Say I have 1000 tables.
As far as I can see, I can create a job to move each table. It would be something like:
tOracleInput, tFileOutputDelimited and tParAccelBulkExec
Works great! HOWEVER...
setting this up for 1000 tables, even if I only need to change the table name each time, looks ugly.
Question for you Talend gurus: is there a way to take a list of tables and, for each table, perform the above process (extract, save to file, bulk load)?
Thanks for your help.
Let me know if you need more info (system, etc.) to answer this question.
Thanks!
14 Replies
Anonymous

It has been available since TOS 3.0.x with Java code generation.
You could also run a select on the system tables and iterate over the result:
tOracleInput --(row)--> tFlowToIterate --(iterate)--> ...
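For illustration, the per-table loop that tFlowToIterate drives can be sketched in plain Java (this is not the code Talend generates; the table names, the `user_tables` query mentioned in the comment, and the file-naming scheme are all assumptions):

```java
import java.util.Arrays;
import java.util.List;

public class TableIterationSketch {
    // In the real job, this list would come from a query such as
    // "SELECT table_name FROM user_tables" run by tOracleInput.
    static List<String> listTables() {
        return Arrays.asList("CUSTOMERS", "ORDERS", "PRODUCTS");
    }

    // The extract statement each iteration would run against Oracle.
    static String extractSql(String table) {
        return "SELECT * FROM " + table;
    }

    // The per-table delimited file path (naming scheme is an assumption).
    static String extractFile(String table) {
        return "/tmp/extract/" + table.toLowerCase() + ".csv";
    }

    public static void main(String[] args) {
        for (String table : listTables()) {
            System.out.println(extractSql(table) + " -> " + extractFile(table));
        }
    }
}
```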
But from my point of view this will not help you much. There are a few solutions (more or less without Talend Open Studio):
- export and load the data for each table (as Christophe described)
- use a select statement that returns insert statements (built by string concatenation) and execute them against the destination database
- use a remote connection and work with "INSERT INTO ... SELECT ... FROM ..."
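The second option (a select that emits insert statements via string concatenation) could be sketched like this in plain Java; the table and column names are made up, and real code would need type-aware quoting and escaping instead of wrapping every value in single quotes:

```java
public class InsertGenerator {
    // Build an Oracle query that itself returns INSERT statements for the
    // destination database, by concatenating column values into literals.
    // Naive: every column is quoted as a string.
    static String generatorQuery(String table, String... columns) {
        StringBuilder sb =
            new StringBuilder("SELECT 'INSERT INTO " + table + " VALUES (' || ");
        for (int i = 0; i < columns.length; i++) {
            if (i > 0) sb.append(" || ',' || ");
            // '''' is a single-quote literal in Oracle SQL
            sb.append("'''' || ").append(columns[i]).append(" || ''''");
        }
        sb.append(" || ');' FROM ").append(table);
        return sb.toString();
    }

    public static void main(String[] args) {
        // Produces a query whose result rows look like:
        //   INSERT INTO CUSTOMERS VALUES ('<id>','<name>');
        System.out.println(generatorQuery("CUSTOMERS", "ID", "NAME"));
    }
}
```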
Bye
Volker
Anonymous

Volker,
What about building two jobs, using ELT + SQLPattern and BulkExec?
- First job: iterate the extraction over every Oracle table.
To do that: tOracleTableList ---iterate---> tELT component (with table = the CURRENT_TABLE global variable returned by tOracleTableList); then write a SQLPattern that extracts the data, much like a fast export done with SQL*Loader. It generates a CSV/delimited file for each table.

- Second job: iterate over the delimited bulk files and call a tParAccelBulkExec component (with PARALLEL mode activated to increase performance).
To do that: tFileList ---iterate---> tParAccelBulkExec (with the global variable from tFileList as the bulk file path property).
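To illustrate the per-file step of the second job: since ParAccel is Postgres-derived, each bulk load presumably boils down to a COPY-style command per file. A rough Java sketch, where both the table-from-filename rule and the COPY syntax are assumptions (the actual command tParAccelBulkExec issues may differ):

```java
import java.io.File;

public class BulkLoadSketch {
    // Derive the target table name from the delimited file name
    // (mirrors using tFileList's current-file global variable).
    static String tableFor(File f) {
        String name = f.getName();
        int dot = name.lastIndexOf('.');
        return (dot > 0 ? name.substring(0, dot) : name).toUpperCase();
    }

    // Build a Postgres-style COPY statement for one file (assumed syntax).
    static String copyCommand(File f) {
        return "COPY " + tableFor(f) + " FROM '" + f.getAbsolutePath()
                + "' WITH DELIMITER ';'";
    }

    public static void main(String[] args) {
        System.out.println(copyCommand(new File("/tmp/extract/customers.csv")));
    }
}
```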

Another way is:
- First job: tOracleTableList ---iterate---> tSystem, tSSH, or tStoredProcedure
- Second job: tFileList ---iterate---> tSystem, tSSH, or tStoredProcedure
I agree that this is less integrated with Talend.
But to make more of Talend, you can add tWarn or tFlowMeter for monitoring, or define failover or rollback conditions for your business rules or workflow constraints.
Best regards.
Anonymous

Hi Christophe,
yes, that's why I love Talend: you can do much more than simple data transformation. And yes, you are able to add some "basic infrastructure" components. My point was that this is not "real" Talend-style dynamic metadata, for example (as asked for in the initial post).
Last week I thought about writing a podcatcher job to feed my mp3 player. That would be a good example of how flexible you can be with Talend 😉 (If I get some time and it works, I'll post about it ...)
Bye
Volker
Anonymous

.....
then write a SQLPattern to extract and load data like a fastExport provided by SQLLoader. It generates a CSV/Delimited file for each tables.
.......

Could you please post an example of the SQL template/pattern?
Thank you very much!
Anonymous

hi everybody,
could you post the solution? I have the same problem and it could help me.
Sorry for my English, I am still learning.