k526
Contributor

How to speed up the process of loading data from one database to another?

Hi,

 

I am trying to load 1 lakh (100,000) records from one database to another.

Insert or update was taking more than one day to load, so I tried delete and insert instead.

I set commit to 1000 and the batch size to 1000 as well, but it takes the same amount of time.

Can anyone help me? Is there any other way to speed up the process?

1 Solution

Accepted Solutions
TRF
Champion II

Try to use tRedshiftOutputBulkExec instead of tRedshiftOutput.


7 Replies
TRF
Champion II

Hi,
Which databases?
Which component?
Did you try to bulkify the job?
Can you share the design?
k526
Contributor
Author

Redshift database.
My job is:
tRedshiftInput -----> tMap -----> tRedshiftOutput

TRF
Champion II

Try to use tRedshiftOutputBulkExec instead of tRedshiftOutput.
k526
Contributor
Author

I have no idea about tRedshiftOutputBulkExec. In the tRedshiftOutputBulkExec settings there are S3 settings and a local data file path. Where do I specify insert or update operations in the tRedshiftOutputBulkExec component? Can you please explain how to change these settings? Also, can we connect tRedshiftInput ---> tRedshiftOutputBulkExec directly, without a tMap?

Where can we give the table name?

cterenzi
Specialist

The documentation has a scenario for bulk loading data to Redshift:

https://help.talend.com/#/reader/KxVIhxtXBBFymmkkWJ~O4Q/F6II4ZlKWu1xI1NZhaWEBg?section=tredshiftbulk...

k526
Contributor
Contributor
Author

The tRedshiftOutputBulkExec component takes much less time to insert bulk records, but it only inserts the data. I need it to insert or update the data. Is there any option to insert or update bulk data in Redshift?
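For reference, the usual way to get insert-or-update behaviour together with a bulk load in Redshift is to bulk load into a staging table and then merge into the target with plain SQL. Below is a minimal sketch of that merge step, assuming hypothetical table names (staging_table, target_table), a key column id, and a placeholder cluster endpoint; it shows the SQL pattern the job needs to reproduce, not a specific Talend configuration.

import psycopg2

# Placeholder connection details -- replace with your cluster's values.
conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="mydb",
    user="myuser",
    password="mypassword",
)

# staging_table is assumed to already hold the freshly bulk-loaded rows
# (for example, loaded there by tRedshiftOutputBulkExec).
with conn, conn.cursor() as cur:
    # Remove target rows that are about to be replaced...
    cur.execute("""
        DELETE FROM target_table
        USING staging_table
        WHERE target_table.id = staging_table.id;
    """)
    # ...then insert everything from the staging table.
    cur.execute("INSERT INTO target_table SELECT * FROM staging_table;")
# Leaving the 'with conn' block commits both statements as one transaction.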
Anonymous
Not applicable

Hi,

Let me tell you the exact answer. There are two methods to load bulk data into Redshift.

 

First Method:

Your source is Redshift and your target is Redshift, so use the tRedshiftRow component twice: one for insert and one for update. Don't use the regular Redshift input/output components, because loading into Redshift through them is very slow.
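A minimal sketch of the two queries such tRedshiftRow components could run, written here as Python string constants; the table names (staging_table, target_table), key column (id), and data columns (col1, col2) are illustrative placeholders, not taken from the thread.

# Query for the first tRedshiftRow component: update rows that already exist.
UPDATE_QUERY = """
    UPDATE target_table
    SET    col1 = s.col1,
           col2 = s.col2
    FROM   staging_table s
    WHERE  target_table.id = s.id;
"""

# Query for the second tRedshiftRow component: insert rows that do not exist yet.
INSERT_QUERY = """
    INSERT INTO target_table (id, col1, col2)
    SELECT s.id, s.col1, s.col2
    FROM   staging_table s
    LEFT JOIN target_table t ON t.id = s.id
    WHERE  t.id IS NULL;
"""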

 

Second Method:

You can load bulk data into Redshift using the S3 components; for that, an Amazon S3 license is mandatory.
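A rough sketch of that S3-based path, with a placeholder bucket, file, IAM role, and cluster endpoint (none of these come from the thread): upload the extracted file to S3, then have Redshift ingest it in parallel with COPY.

import boto3
import psycopg2

# Placeholders -- replace with real values.
BUCKET = "my-bucket"
KEY = "exports/records.csv.gz"
IAM_ROLE = "arn:aws:iam::123456789012:role/my-redshift-copy-role"

# 1. Upload the extracted file to S3 (roughly what the S3/bulk components do).
boto3.client("s3").upload_file("/tmp/records.csv.gz", BUCKET, KEY)

# 2. Ask Redshift to load it with COPY, which runs in parallel across the cluster.
conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="mydb",
    user="myuser",
    password="mypassword",
)
with conn, conn.cursor() as cur:
    cur.execute(f"""
        COPY staging_table
        FROM 's3://{BUCKET}/{KEY}'
        IAM_ROLE '{IAM_ROLE}'
        CSV GZIP;
    """)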