Anonymous
Not applicable

Talend & Redshift AWS

Hi All,
I'm currently using Talend to transfer data from an Oracle database to my Redshift cloud database.
What I've noticed is that it is really slow, averaging about 240 rows/second with only three fields.
Is there a trick to improve the transfer?
Labels (3)
9 Replies
Anonymous
Not applicable
Author

Hi n.kasdali,
Performance issues are usually caused by the DB connection or the job design. Can you upload some screenshots of the job design to the forum?
Best regards
Sabrina
Anonymous
Not applicable
Author

Hi, I've got 6 Gb/s upload bandwidth on my network, and 3.6 Gb/s over the AWS network.
My job is quite simple:
tOracleInput ----> tMap ----> tRedshiftOutput.
Anonymous
Not applicable
Author

Hi,
Have you selected the "Extend Insert" check box to carry out a bulk insert of a defined set of lines? See TalendHelpCenter:tRedshiftOutput.
In addition, tMap is a cache component that consumes a lot of memory. For a large data set, try storing the data on disk instead of in memory in tMap. Also, allocate more memory to execute the job.
Please have a look at the KB article TalendHelpCenter:OutOfMemory.
Best regards
Sabrina
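For context on what "Extend Insert" buys you: instead of one network round trip per row, a batch of rows is packed into a single multi-row INSERT. A minimal sketch of the statement such a batch effectively becomes (the class, table, and column names here are invented for illustration; the real component uses parameterized statements, so the naive quoting below is only to keep the sketch self-contained):

```java
import java.util.List;

public class ExtendInsertSketch {

    // Build one multi-row INSERT for a whole batch -- roughly the shape
    // of statement a bulk "extended insert" sends, versus one statement
    // (and one round trip) per row.
    static String buildExtendedInsert(String table, String[] columns, List<String[]> rows) {
        StringBuilder sql = new StringBuilder("INSERT INTO ").append(table)
                .append(" (").append(String.join(", ", columns)).append(") VALUES ");
        for (int i = 0; i < rows.size(); i++) {
            if (i > 0) sql.append(", ");
            String[] row = rows.get(i);
            sql.append("(");
            for (int j = 0; j < row.length; j++) {
                if (j > 0) sql.append(", ");
                // naive quoting for the sketch; real code should use
                // PreparedStatement parameters
                sql.append("'").append(row[j].replace("'", "''")).append("'");
            }
            sql.append(")");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        List<String[]> batch = List.of(
                new String[] {"1", "Mozilla/5.0"},
                new String[] {"2", "curl/7.43"});
        // One statement carries the whole batch instead of two round trips
        System.out.println(buildExtendedInsert("events", new String[] {"id", "user_agent"}, batch));
    }
}
```

Raising the number of rows per insert cuts round trips proportionally. "Allocate more memory" means raising -Xmx in the job's JVM arguments (Run tab > Advanced settings).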
Anonymous
Not applicable
Author

At first, I started by doing a CSV extraction.
However, I have a field "USER_AGENT" which contains a wide variety of characters, which makes it difficult to pick a safe delimiter.
That is why I use a tMap.
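If you ever revisit the CSV route: the usual fix for a messy field like USER_AGENT is not to hunt for an unused delimiter but to quote the field RFC 4180 style, doubling any embedded quotes. A minimal sketch (class and method names are mine, not Talend's):

```java
public class CsvQuote {

    // Quote one field per RFC 4180: wrap it in double quotes when it
    // contains the delimiter, a double quote, or a line break, and
    // double any embedded quotes.
    static String quote(String field, char delimiter) {
        boolean needsQuoting = field.indexOf(delimiter) >= 0
                || field.indexOf('"') >= 0
                || field.indexOf('\n') >= 0
                || field.indexOf('\r') >= 0;
        return needsQuoting ? "\"" + field.replace("\"", "\"\"") + "\"" : field;
    }

    public static void main(String[] args) {
        // In a semicolon-delimited file, the USER_AGENT stays one field
        String ua = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36";
        System.out.println(quote(ua, ';'));
    }
}
```

In Talend itself, the CSV options / text enclosure settings on tFileOutputDelimited do this for you, and Redshift's COPY command accepts such files with its CSV format option, so an extract-then-COPY load (e.g. via tRedshiftBulkExec, if your version provides it) typically outperforms row-by-row inserts by a wide margin.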
Anonymous
Not applicable
Author

Hi,
Does the "Extend Insert" option in the Advanced settings of tRedshiftOutput work for your scenario?
Best regards
Sabrina
Anonymous
Not applicable
Author

I've got the same options: commit every 10,000 rows and Extend Insert set to 100 lines.
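For a sense of scale with those settings (the million-row transfer size below is mine, just for the arithmetic): Extend Insert at 100 lines means one INSERT statement per 100 rows, plus a commit every 10,000 rows:

```java
public class BatchMath {

    // Number of statements (or commits) needed to cover `rows` rows at
    // `perBatch` rows apiece, rounding up for a partial last batch.
    static long batches(long rows, long perBatch) {
        return (rows + perBatch - 1) / perBatch;
    }

    public static void main(String[] args) {
        long rows = 1_000_000L; // hypothetical transfer size
        System.out.println(batches(rows, 100));    // INSERT statements: 10000
        System.out.println(batches(rows, 10_000)); // commits: 100
    }
}
```

At the observed 240 rows/second, the same million rows would take roughly 70 minutes, which is why reducing per-row round trips is where the time goes.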
Anonymous
Not applicable
Author

Hi,
Have the suggestions above (the "Extend Insert" check box and storing tMap data on disk; see TalendHelpCenter:tRedshiftOutput and the KB article TalendHelpCenter:OutOfMemory) improved your job's performance?
Best regards
Sabrina
Anonymous
Not applicable
Author

I'll try.
Anonymous
Not applicable
Author

Hi,
Feel free to let me know whether it works for you.
Best regards
Sabrina