Benkku
Partner - Contributor III

Copying huge tables with Qlik Replicate

Hi!

For a customer demo I have to copy a huge table from a MySQL RDS instance in AWS into Timescale Cloud (using the PostgreSQL target endpoint). To do this I have created a Qlik Replicate task which also performs some data transformations and data filtering.

I have copied the first 1 million entries successfully in about 17 seconds.

The source table contains about 8,400 million rows, currently taking up about 1,800 GB. Thanks to the more efficient target repository and to the filtering, I get a pretty high compression rate, so I should have enough space in the target, but the whole pipeline might take the entire weekend to run.
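Extrapolating from the figures above (1 million rows in about 17 seconds, roughly 8,400 million rows in total), a back-of-the-envelope estimate of the full-load time can be sketched like this, assuming a constant throughput, which a real initial load rarely sustains:

```python
# Rough full-load time estimate from the numbers in this post.
rows_total = 8_400_000_000      # source table size in rows
rows_per_batch = 1_000_000      # rows copied in the timed test
seconds_per_batch = 17          # measured time for that batch

total_seconds = rows_total / rows_per_batch * seconds_per_batch
print(f"~{total_seconds / 3600:.1f} hours")  # roughly 39.7 hours at a constant rate
```

So even in the optimistic case a single-threaded load runs well over a day, which is why tuning matters here.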

Any hints on which settings in Qlik Replicate I should change in order to run the pipeline smoothly?

Thanks,

Bernardo Di Chiara


Accepted Solutions
Prabodh
Creator II

You can try enabling parallel load by creating segmentation. Check this documentation page for details. I have successfully improved the load from an Oracle endpoint by 5x by enabling parallel load.

However, your source, MySQL on RDS, is not explicitly mentioned in the supported sources.
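Parallel load with segmentation works by splitting the table into ranges of a key column so several streams can load at once; Replicate asks you to enter the boundary values between segments. A minimal sketch of how one might pick evenly spaced boundaries from a numeric key (the function name and the 8-segment choice are illustrative assumptions, not Replicate API; the min/max would come from a `SELECT MIN(id), MAX(id)` on the source):

```python
# Hypothetical helper: evenly spaced boundary values for a numeric key,
# for entering into a parallel-load "data ranges" style configuration.
def segment_boundaries(min_id: int, max_id: int, segments: int) -> list[int]:
    step = (max_id - min_id) // segments
    # One boundary between each pair of adjacent segments, so
    # `segments` ranges need `segments - 1` boundary values.
    return [min_id + step * i for i in range(1, segments)]

print(segment_boundaries(1, 8_400_000_000, 8))
```

Boundaries based on an indexed, roughly uniformly distributed key keep the segments balanced; a skewed key would leave some streams doing most of the work.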


2 Replies

Benkku
Partner - Contributor III
Author

Thanks Prabodh!