Qlik Community

QlikView App Dev

Discussion Board for collaboration related to QlikView App Development.

RaduM
Contributor II

QlikView Performance

I have a QVD file that I load with some data. I need to redistribute some of the values in this loaded table based on a combination of dimensions.

What I have done is the following:

1. Load the QVD table (around 3 million rows).

2. Create mapping tables with the combinations of dimensions needed for the calculation (around 30 rows).

3. Run a FOR EACH loop over the combinations; inside each iteration:

- a preceding load with the mapping and calculations

- a load from the resident table

- a drop of the resident table
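A minimal sketch of the pattern described above, with hypothetical table, field, and file names throughout (Data.qvd, Factors.qvd, Dim1, Dim2, Value, Factor), might look like:

```
// 1. Load the full QVD (hypothetical names)
Data:
LOAD Dim1, Dim2, Value
FROM Data.qvd (qvd);

// 2. Mapping of dimension combinations to a redistribution factor
FactorMap:
MAPPING LOAD Dim1 & '|' & Dim2 AS ComboKey, Factor
FROM Factors.qvd (qvd);

// 3. Loop over each combination (example values)
FOR EACH vCombo IN 'A|X', 'B|Y'
  Result:
  // Preceding load applies the calculation on top of the mapped factor
  LOAD *, Value * Factor AS NewValue;
  LOAD *, ApplyMap('FactorMap', Dim1 & '|' & Dim2, 1) AS Factor
  RESIDENT Data
  WHERE Dim1 & '|' & Dim2 = '$(vCombo)';
NEXT vCombo

DROP TABLE Data;
```

Note that each iteration of this loop scans the full resident table, which is where the cost can accumulate.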

 

The calculations are correct; the problem is that the reload takes too much time (40 minutes).

I have 30 steps in the FOR EACH loop, but all the data comes from a resident table whose QVD loads in 2 seconds.

After that I read from the resident table with a WHERE clause to limit the number of rows.

Please let me know how I can debug this performance issue. Is there a better way to do this?

 

 

1 Solution

Accepted Solutions
RaduM
Contributor II
Author

I just found a solution and I think it will help others as well.

Creating a temporary table with only the data you need (loaded from the resident table with a WHERE clause to limit the number of rows) dramatically cuts the processing time.

My resident table had around 3.5 million rows, and all of those rows were processed inside the FOR EACH loop; each iteration took about 1.5 minutes.

Creating a separate table with only the data I needed for the calculation produced a table of only 3,000 rows, and now each iteration takes 2 seconds.

So be careful with the amount of data that you process: is all of that data really needed?
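The change described above could be sketched like this (hypothetical names; the filter condition stands in for whatever WHERE clause selects only the rows relevant to the calculation):

```
// Before: every iteration scanned all 3.5 million rows in Data.
// After: filter once into a small temporary table, then loop over that.

TempData:
LOAD Dim1, Dim2, Value
RESIDENT Data
WHERE Relevant = 1;   // hypothetical filter; ~3,000 rows remain

FOR EACH vCombo IN 'A|X', 'B|Y'   // example combinations
  Result:
  LOAD Dim1, Dim2, Value * Factor AS NewValue;
  LOAD *, ApplyMap('FactorMap', Dim1 & '|' & Dim2, 1) AS Factor
  RESIDENT TempData
  WHERE Dim1 & '|' & Dim2 = '$(vCombo)';
NEXT vCombo

DROP TABLE TempData;
```

The key point is that the expensive full-table scan happens once, outside the loop, instead of 30 times inside it.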

 

 


2 Replies

MarcoWedel

There might be a solution without any loops as well that improves the script performance.
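One common loop-free approach, sketched here under the assumption that the 30 dimension combinations can be expressed as a mapping table, is to apply the mapping in a single resident load:

```
// Single pass over the data: ApplyMap replaces the 30-iteration loop.
// All names are hypothetical.
FactorMap:
MAPPING LOAD Dim1 & '|' & Dim2 AS ComboKey, Factor
FROM Factors.qvd (qvd);

Result:
LOAD Dim1,
     Dim2,
     Value,
     Value * ApplyMap('FactorMap', Dim1 & '|' & Dim2, 1) AS RedistributedValue
RESIDENT Data;

DROP TABLE Data;
```

Because the whole table is processed in one load statement, the engine scans the 3 million rows once instead of once per combination.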