Vinicius_Luiz
Contributor III

Improving the use of a PostgreSQL-based database as a target

Hi everyone!!


When using PostgreSQL as a target, we are facing some challenges related to CDC performance.

When the number of transactions reaches thousands, the CDC takes longer than expected compared to replications carried out in other DBMS (Oracle, for example).

In other words, transactions that take around 10 seconds to be applied using Oracle as a target, take around 2 minutes to be applied using PostgreSQL as a target.

Do you know of any PostgreSQL configuration settings that would help it handle a large volume of updates?

 

Qlik Replicate 

6 Replies
boaz_newman
Employee

Hello

The first thing to do is to check whether the database is optimized: it may need its tables and indexes rebuilt, and the database statistics should be updated regularly.
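For reference, the maintenance Boaz describes might look like this on the PostgreSQL target. This is only a sketch; the table name `orders` is a placeholder for your own tables:

```sql
-- Find the tables with the most dead tuples, and when they were last vacuumed/analyzed
SELECT relname, n_dead_tup, last_vacuum, last_autovacuum, last_analyze, last_autoanalyze
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC
LIMIT 10;

-- Rebuild a table's indexes without blocking writes (PostgreSQL 12+)
REINDEX TABLE CONCURRENTLY orders;

-- Reclaim dead-tuple space and refresh planner statistics in one pass
VACUUM (ANALYZE) orders;
```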

 

Regards

Boaz

SushilKumar
Support

Hello team,

To add to Boaz's suggestion: a target database is no different from a source database in this respect. Qlik Replicate works the same way as any other application that connects to a database, applying the changes fetched from the source via DML.

It follows the same principle: it submits queries just as any application does, and the server generates an execution plan to run the DML changes.

To make those changes apply faster, follow the tuning recommendations in the database's own documentation.

Hope this helps.

Regards,

Sushil Kumar 

Vinicius_Luiz
Contributor III
Author

Hi Boaz.

Thanks for the contribution!

We always create tables with indexes and primary keys matching the source database. Furthermore, we analyzed the PostgreSQL database and found that table statistics are being properly computed by PostgreSQL's internal processes.

I was researching database settings to optimize transaction performance and found parameters such as maintenance_work_mem, checkpoint_timeout, and max_connections. Have you had any experience adjusting these?
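For what it's worth, most of these can be changed with ALTER SYSTEM. The values below are illustrative starting points only, not recommendations tested against your workload:

```sql
-- Illustrative values only; tune to your hardware and change volume
ALTER SYSTEM SET maintenance_work_mem = '1GB';        -- speeds up VACUUM and index builds
ALTER SYSTEM SET checkpoint_timeout = '15min';        -- fewer, larger checkpoints
ALTER SYSTEM SET max_wal_size = '4GB';                -- avoid forced checkpoints under heavy writes
ALTER SYSTEM SET checkpoint_completion_target = 0.9;  -- spread checkpoint I/O over the interval

-- Trades crash durability for commit latency; weigh this carefully for a replica target
ALTER SYSTEM SET synchronous_commit = off;

-- Note: max_connections requires a full server restart to change
SELECT pg_reload_conf();  -- the settings above take effect on reload, no restart needed
```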

boaz_newman
Employee

Hi 

Sorry, I have no experience optimizing a Postgres db.

There are two significant points that you mentioned:

1. "When the number of transactions reaches thousands..." - why are there so many concurrent transactions on the target db?

2. "transactions that take around 10 seconds to be applied using Oracle as a target, take around 2 minutes to be applied using PostgreSQL as a target" - this indicates that there is a big difference between the Oracle target and the Postgres one. There could be many reasons for that:

Differences in resources (server, CPU, memory, etc.)

Differences in database configuration and optimization

Differences between Oracle and Postgres themselves

Anyway, as my colleague mentioned, Replicate sends executable SQL statements to the target (in bulk apply mode, by default); the rest is up to the target.
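If it helps, one way to see what those statements actually cost on the Postgres side is the pg_stat_statements extension. It requires shared_preload_libraries = 'pg_stat_statements' (and a server restart); the column names below are for PostgreSQL 13+, where older versions use total_time instead:

```sql
-- Once per database: CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
SELECT query, calls, mean_exec_time, total_exec_time, rows
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
```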

 

Regards

boaz

Vinicius_Luiz
Contributor III
Author

Hello

1. Sorry, I was referring to the number of operations performed. Some of our tables have a fairly high volume of updates.

2. Yes, we are considering upgrading the server (CPU, memory, etc.). However, before doing that, we are researching PostgreSQL configuration settings that could improve performance.
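For update-heavy tables specifically, two per-table storage settings are often worth experimenting with. Again a sketch, with `orders` as a placeholder table name:

```sql
-- Leave 20% free space per heap page so updates can be HOT (same-page)
-- and avoid extra index writes
ALTER TABLE orders SET (fillfactor = 80);

-- Vacuum this hot table more aggressively than the global default
ALTER TABLE orders SET (autovacuum_vacuum_scale_factor = 0.02);

-- Rewrite existing pages to honor the new fillfactor (takes an exclusive lock;
-- the pg_repack extension can do an online rewrite instead)
VACUUM FULL orders;
```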

Dana_Baldwin
Support

Hi @Vinicius_Luiz 

In conjunction with checking the target performance you might consider engaging our Professional Services team to ensure your Replicate task is tuned properly for your use case. You can contact your account manager to initiate this process.

Thanks,

Dana