AAdvikC
Contributor

Bulk Load - From Oracle to AWS RDS

Hi Team,

Good Day!!

As part of a migration project, we are trying to load 120,000,000 rows of data from Oracle to AWS RDS (MySQL).

We have tried the two approaches below:

Option 1: tOracleInput -----------------> tAmazonMysqlOutput

With a cursor size of 1000, only about 150 records were loading per second, so it is taking a long time to load the data.

Option 2: tOracleInput -----------------> tMysqlOutputBulk_1 -----> tMySQLBulkExec (this flow was also very slow)

Could you please suggest the best design approach?

We want to load the data in as little time as possible.
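For context on what the bulk components do under the hood: the usual fast path into a MySQL-compatible target is to write flat files and then load them with LOAD DATA LOCAL INFILE, which is also what the tMysqlOutputBulk / tMysqlBulkExec pair wraps. A minimal sketch of that idea in Python; the table name, file path, and chunk size are illustrative assumptions, not values from this post:

```python
# Sketch of the flat-file bulk-load pattern (not the Talend job itself).
# Table name, path, and chunk size below are hypothetical placeholders.
import csv
import io

CHUNK_ROWS = 1_000_000  # rows per flat file; tune to your memory/disk

def rows_to_csv(rows):
    """Serialize an iterable of row tuples to CSV text suitable for LOAD DATA."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerows(rows)
    return buf.getvalue()

def load_data_sql(table, path):
    """Build the LOAD DATA statement MySQL uses for bulk ingest of one file."""
    return (
        f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'"
    )

# Example: serialize two rows, and show the statement for one chunk file.
print(rows_to_csv([(1, "a"), (2, "b")]))  # 1,a\n2,b\n
print(load_data_sql("target_table", "/tmp/chunk_000.csv"))
```

The point of the pattern is that one LOAD DATA per million-row file avoids the per-row round trips that make row-by-row inserts crawl.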

Thanks

AChoudhry

1 Reply
Anonymous
Not applicable

Hello,

For bulk inserts, please use the Talend bulk components if it is your own server, or check the AWS recommendations for bulk inserts.

Generally speaking, the following aspects can affect job performance:

1. The volume of data: reading a very large data set degrades performance. In your case, 120,000,000 rows is a big data set.

2. The structure of the data: if tOracleInput has many columns, the job will consume a lot of memory and time transferring the data during execution.

3. The database connection: a job generally runs faster when the database is local. If the database is on another machine, even over a VPN, you may run into congestion and latency issues.
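On point 1, the standard way to keep memory bounded on a huge result set is to fetch it in chunks rather than all at once. A sketch of that control flow; the cursor here is a stand-in object (with a real driver such as python-oracledb you would tune `Cursor.arraysize` and use `fetchmany` the same way):

```python
# Chunked reading of a large result set, so the full 120M rows are never
# held in memory at once. FakeCursor is a stand-in for a real DB cursor.
def fetch_in_chunks(cursor, chunk_size=10_000):
    """Yield lists of rows until the cursor is exhausted."""
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            return
        yield rows

class FakeCursor:
    """In-memory stand-in that mimics DB-API fetchmany()."""
    def __init__(self, rows):
        self._rows = list(rows)
    def fetchmany(self, n):
        batch, self._rows = self._rows[:n], self._rows[n:]
        return batch

# 25 rows in chunks of at most 10 -> chunk sizes 10, 10, 5.
total = sum(len(chunk) for chunk in fetch_in_chunks(FakeCursor(range(25)), 10))
print(total)  # 25
```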

What is your rate of records per second with each option?
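The rate matters because it fixes the total runtime. A back-of-the-envelope check using the ~150 rows/second observed in Option 1:

```python
# How long does 120,000,000 rows take at the rate observed in Option 1?
TOTAL_ROWS = 120_000_000
RATE_ROWS_PER_SEC = 150  # observed rate from Option 1

seconds = TOTAL_ROWS / RATE_ROWS_PER_SEC
days = seconds / 86_400  # seconds per day
print(f"{days:.1f} days")  # 9.3 days
```

So at 150 rows/second the load would take over nine days, which is why a bulk path with a rate in the tens of thousands of rows per second is needed here.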

Best regards

Sabrina