teal
Contributor

Optimizing tReplicate with lookups in big tables

I have a job that updates and deletes records in multiple tables based on the same input data.

I'm comparing the rm table in the v24 MSSQL database with the one in v26: if a record was deleted in v24, it also has to be deleted in v26.

[attached screenshot: 0695b00000fJ5vTAAS.png]

This is what tMap_3 does: it matches records on bl_id, fl_id and rm_id as keys.

But before deleting a record from the rm table, the records in other tables that use these three keys as foreign keys have to be deleted or updated first.

So in "rmpct" the records that have these three foreign keys are deleted, while in the other tables such as activity_log for example, the values of these three keys have to be emptied. So I use a tMap with a lookup to find the corresponding primary key and then update the values of these records.

[attached screenshot: 0695b00000fJ612AAC.png]
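To make the intended cleanup easier to follow, here is a minimal sketch in plain JDBC of what the handling of one deleted rm record amounts to. The table and key names come from the description above; the connection string, credentials and example key values are placeholders, and the real job resolves the rows through the tMap lookup rather than a direct WHERE clause.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RoomCleanupSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection to the v26 MSSQL database.
        try (Connection v26 = DriverManager.getConnection(
                "jdbc:sqlserver://host;databaseName=v26", "user", "password")) {

            // Keys of one rm record that tMap_3 reported as deleted in v24.
            String blId = "HQ";
            String flId = "01";
            String rmId = "101";

            // 1) Delete the dependent rows in rmpct.
            try (PreparedStatement del = v26.prepareStatement(
                    "DELETE FROM rmpct WHERE bl_id = ? AND fl_id = ? AND rm_id = ?")) {
                del.setString(1, blId);
                del.setString(2, flId);
                del.setString(3, rmId);
                del.executeUpdate();
            }

            // 2) Empty the foreign keys in activity_log (and the other referencing tables).
            try (PreparedStatement upd = v26.prepareStatement(
                    "UPDATE activity_log SET bl_id = NULL, fl_id = NULL, rm_id = NULL"
                  + " WHERE bl_id = ? AND fl_id = ? AND rm_id = ?")) {
                upd.setString(1, blId);
                upd.setString(2, flId);
                upd.setString(3, rmId);
                upd.executeUpdate();
            }

            // 3) Finally remove the rm record itself.
            try (PreparedStatement delRm = v26.prepareStatement(
                    "DELETE FROM rm WHERE bl_id = ? AND fl_id = ? AND rm_id = ?")) {
                delRm.setString(1, blId);
                delRm.setString(2, flId);
                delRm.setString(3, rmId);
                delRm.executeUpdate();
            }
        }
    }
}
```

Expressed this way, the whole cleanup is three set-based statements per key, which can be a useful baseline when checking where the 30 minutes actually go.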

To apply this to the different tables, I use a tReplicate, but the problem is that it now takes a really long time (about 30 minutes).

Any idea on how I could optimize this?

Thanks!

1 Reply
Anonymous
Not applicable

Hi

All tMap outputs are executed in parallel, which consumes more memory.

You can optimize the job design in the following ways:

  1. Store the lookup data on disk instead of in memory if the volume of lookup data is large (see the image below).
  2. Allocate more memory to the job execution (a sample of JVM memory settings follows below).
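
For point 2, the memory available to a Talend job is set through its JVM arguments: in Studio they can be entered in the Run view under Advanced settings ("Use specific JVM arguments"), and for an exported job they appear in the generated .sh/.bat launcher. A possible adjustment (the values below are only placeholders; size them to the machine's RAM and the lookup volumes) would be:

```
-Xms1024M
-Xmx4096M
```

Keep -Xmx below the physical memory actually available, otherwise the job starts swapping and gets slower rather than faster.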

 

[attached screenshot: 0695b00000fJBSXAA4.png]

Regards

Shong