Hi,
We have a requirement to export and import Oracle data using Talend. Talend should create a dump file (or any other file format) that can then be imported back into the Oracle database using Talend only.
Can we achieve this in Talend?
If yes, please provide the steps to implement it.
Regards,
Vivek
You can do what you want and more with Talend DI!
For your case, you can use tOracleInput to read the table content, then tFileOutputDelimited to export that content to a CSV file.
You can then use tFileInputDelimited to read the CSV file and tOracleOutput to import the content into the Oracle tables.
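If it helps to see what that export step amounts to under the hood, here is a minimal JDBC sketch of reading an Oracle table and writing it out as a delimited file. The connection details, table name, and file path are assumptions for illustration only; in Talend the tOracleInput/tFileOutputDelimited components generate equivalent code for you.

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class ExportToCsv {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details and table name -- replace with your own.
        // The Oracle JDBC driver (ojdbc) must be on the classpath.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
        try (Connection conn = DriverManager.getConnection(url, "scott", "tiger");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM EMPLOYEES");
             PrintWriter out = new PrintWriter("/tmp/employees.csv")) {

            ResultSetMetaData meta = rs.getMetaData();
            int cols = meta.getColumnCount();

            // Write each row as one delimited line (match the separator you set
            // in tFileOutputDelimited).
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) line.append(';');
                    Object value = rs.getObject(i);
                    line.append(value == null ? "" : value.toString());
                }
                out.println(line);
            }
        }
    }
}
```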
There are also options to bulkify the operations.
Finally, you can also use tSystem to call external tools such as SQL*Loader.
So, many choices; the best one depends on your constraints.
Hi Vivek,
Yes, you can do this in many ways.
There is a component called tOracleOutputBulkExec; you can connect tOracleInput --> tOracleOutputBulkExec.
First, it fetches the data from the Oracle input table and writes it into a CSV file (the file path is specified in the tOracleOutputBulkExec component).
Then, it automatically loads the data from that CSV file into the target table defined in the tOracleOutputBulkExec component.
You can use a cursor in the source component to fetch the data faster from the source table.
You can also use the load options in the advanced settings of the tOracleOutputBulkExec component to load faster.
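As the rest of this thread confirms, this kind of bulk load is ultimately an SQL*Loader (sqlldr) call. Purely for illustration (the connect string, file paths, and chosen options are assumptions), this is roughly what such an invocation looks like when driven from Java, which is also what the tSystem route amounts to:

```java
import java.util.List;

public class RunSqlLoader {
    public static void main(String[] args) throws Exception {
        // Assumed credentials and control/log file paths -- adjust for your environment.
        // direct=true enables SQL*Loader's direct-path load, which is usually the fastest option.
        List<String> command = List.of(
                "sqlldr",
                "userid=scott/tiger@ORCLPDB1",
                "control=/tmp/employees.ctl",
                "log=/tmp/employees.log",
                "direct=true");

        Process process = new ProcessBuilder(command)
                .inheritIO()   // show sqlldr output in this job's console
                .start();

        int exitCode = process.waitFor();
        // sqlldr returns 0 on success; non-zero exit codes indicate warnings or errors.
        if (exitCode != 0) {
            throw new RuntimeException("sqlldr exited with code " + exitCode);
        }
    }
}
```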
Thanks,
Joe
Don't forget to give kudos / accept the solution if this resolves your issue.
Hi,
Does tOracleOutputBulkExec use sqlldr to load data? I got some sqlldr errors, and it is asking me to map a control file as well. Is there any other component that can load up to 1 million records quickly without using sqlldr?
Our requirement is to take a backup of existing tables quickly and load the data back into the tables whenever required, but without sqlldr or imp/exp, using Talend only.
Hi Vivek,
If you are getting an sqlldr error, install the Oracle client to resolve the issue.
As for control files: if you already have a control file, you can add it in the advanced settings of the tOracleOutputBulkExec component (after checking 'use existing control file'). If you don't have a control file, it will be created automatically.
Or, if you want another way,
you can connect SourceComponent --> tFileInputDelimited --> tOracleOutput.
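For completeness, here is a minimal sketch of what that alternative path does: read the delimited file and insert the rows with a batched PreparedStatement, which is essentially what tFileInputDelimited --> tOracleOutput generates for you. The connection details, file path, target table, and columns are hypothetical.

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class ImportFromCsv {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, file path, and target table -- replace with your own.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
        // Note: for very large files you would stream line by line instead of reading all at once.
        List<String> lines = Files.readAllLines(Paths.get("/tmp/employees.csv"));

        try (Connection conn = DriverManager.getConnection(url, "scott", "tiger");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO EMPLOYEES_BACKUP (EMP_ID, EMP_NAME, SALARY) VALUES (?, ?, ?)")) {

            conn.setAutoCommit(false);   // commit per batch rather than per row
            int count = 0;

            for (String line : lines) {
                String[] fields = line.split(";", -1);   // semicolon separator, keep empty fields
                ps.setLong(1, Long.parseLong(fields[0]));
                ps.setString(2, fields[1]);
                ps.setBigDecimal(3, new java.math.BigDecimal(fields[2]));
                ps.addBatch();

                // Send rows to the database in batches of 10,000 to limit memory use and round trips.
                if (++count % 10_000 == 0) {
                    ps.executeBatch();
                    conn.commit();
                }
            }
            ps.executeBatch();   // flush the final partial batch
            conn.commit();
        }
    }
}
```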
Feel free to reply if you need more details.
Don't forget to give kudos and accept the solution if this is really helpful.
Thanks,
Joe
Hi Joe,
Thanks.
The tOracleInput option looks more suitable for us. We have the following queries:
1. What is an appropriate cursor size in the tOracleInput component? We currently use a cursor size of 10000.
2. Our job design is tOracleInput -> tFileOutputDelimited -> tMap -> tOracleOutput. We need to improve the performance of these components when reading data and loading it into the database. What can we do to improve performance?
Regards,
Vivek
Hi Vivek,
In tOracleInput, the cursor is the fetch size (the number of records read at once).
The default value is 64 or 100.
Setting the cursor to 10,000 tells the component to fetch 10,000 records at a time.
The right size will vary with your database design, the memory needed to hold the data, parallel job execution, etc. Please check with your DBA for more details.
My suggestion is to use 15000, which is the cursor size I always use in my jobs (nearly 50 to 60 jobs running in parallel against a high-end database setup).
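Since the cursor maps to the JDBC fetch size, here is what the same setting looks like in plain JDBC, just to make the trade-off concrete (connection details and query are assumptions): a larger fetch size means fewer network round trips at the cost of more memory per fetch.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details and query -- the point is the fetch size setting.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1";
        try (Connection conn = DriverManager.getConnection(url, "scott", "tiger");
             Statement stmt = conn.createStatement()) {

            // Equivalent of tOracleInput's cursor option: pull 10,000 rows per round trip
            // instead of the driver default, trading memory for fewer network calls.
            stmt.setFetchSize(10_000);

            try (ResultSet rs = stmt.executeQuery("SELECT * FROM EMPLOYEES")) {
                long rows = 0;
                while (rs.next()) {
                    rows++;   // process the row here
                }
                System.out.println("Read " + rows + " rows");
            }
        }
    }
}
```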
Don't forget to give kudos when a reply is helpful, and click 'Accept as Solution' when you are happy with it.
Thanks,
Joe
Hi ,
I was doing the same thing and am running into an issue: the data from the source table is written into the CSV file, but once all the source records have been loaded into the CSV, the job throws a null exception. The job fails because the next record is null after all records from the source table have been pushed into the CSV, and no data gets loaded into the target (Oracle).
How do I fix this issue?
@joearun4u @vivek_u