Not applicable

Does using the min/max functions affect performance/memory?

We have a large qvw document (approx. 4 GB) that does a TYLY comparison on a straight table (30 metric columns), and we are noticing that memory spikes up considerably. It is currently using this logic:

TY = sum({$<ad_yr_num={$(=Max(ad_yr_num))}>} sale_amt)

TYLY change = sum({$<ad_yr_num={$(=Max(ad_yr_num))}>} sale_amt)
              - sum({$<ad_yr_num={$(#=min(ad_yr_num))}>} sale_amt)
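
For reference, the $(=Max(ad_yr_num)) piece is a dollar-sign expansion: QlikView evaluates the inner expression once per chart and substitutes the literal result into the set modifier before the rows are aggregated. A rough illustration, assuming the data contains only the years 2009 and 2010 and nothing is selected:

// The expansion is resolved first, turning
//   sum({$<ad_yr_num={$(=Max(ad_yr_num))}>} sale_amt)
// into
//   sum({$<ad_yr_num={2010}>} sale_amt)
// so the year filter is a literal value, not a per-row function call.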

Thinking the min/max functions were the cause, we changed the logic to bring only two years of data (2009, 2010) into the qvw and did the calculation as follows, but memory did not really improve.

TY = sum(sale_amt)

TYLY change = sum(sale_amt) - sum ( {$<ad_yr_num = {$(#=(ad_yr_num) - 1)}>} sale_amt)
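
For completeness, restricting the qvw to two years would normally be done in the load script rather than in the chart. A minimal sketch, assuming a hypothetical Sales.qvd source with the same field names (adjust to your actual source):

Sales:
LOAD ad_yr_num,
     sale_amt
     // plus whichever other metric columns the charts actually need
FROM Sales.qvd (qvd)
WHERE ad_yr_num >= 2009 and ad_yr_num <= 2010;   // keep only 2009 and 2010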

2 additional questions:

1) I am not able to understand how, by just doing sum(sale_amt) for TY, I get the correct value for year 2010 without needing to use a max function. No selection needs to be made.

2) Does min/max really not impact performance (the way it does in some other applications)? Is there some other way to improve memory?

Thanks in advance for any feedback.

2 Replies
pover
Luminary Alumni

Something is suspicious about sum(sale_amt) giving you 2010 data without selecting 2010. The expression (ad_yr_num)-1 shouldn't work either unless you have exactly one year selected. If you look deeper into this strange result, you might find something wrong with the data model.
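
To illustrate that point: roughly speaking, a bare field reference like ad_yr_num inside a dollar expansion only returns a value when the field has exactly one possible value; otherwise it evaluates to null and the set modifier matches nothing. A rough sketch, assuming only 2009 and 2010 exist in the model:

// With 2010 selected (one possible value), the expansion works:
//   sum({$<ad_yr_num={$(#=(ad_yr_num) - 1)}>} sale_amt)
//   -> sum({$<ad_yr_num={2009}>} sale_amt)
// With no year selected, ad_yr_num has two possible values, the
// expansion produces nothing useful, and the measure gives empty or wrong results.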

Also, to make your application more efficient, delete all unnecessary columns and consolidate your data model into one table as much as possible.
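
As one way to act on that, smaller dimension tables can often be folded into the fact table at load time, for example with a mapping load. A minimal sketch, assuming a hypothetical ProductGroups table keyed on product_id:

ProductGroupMap:
MAPPING LOAD product_id,
             product_group
FROM ProductGroups.qvd (qvd);

Sales:
LOAD ad_yr_num,
     sale_amt,
     ApplyMap('ProductGroupMap', product_id, 'Unknown') AS product_group
FROM Sales.qvd (qvd);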

I've never had problems with a slow max or min, especially with a column like year, where the number of distinct values is so small.

Regards.

Not applicable
Author

Thank you for your response. Yes, there was a hidden list box with 2010 selected, which explains why I was getting the right results for 2010.

It is one big fact table with some 80 million rows. On one sheet I have 2 charts: one bar chart with only 2 measures, and one straight table with 2 dimensions and 32 measures that does all the TY and TYLY sums by different columns. It takes about 12 seconds to get the results back on the table, and memory jumps by almost 500 MB for each selection click, which seems like too much memory consumption for just one click. All unnecessary columns have already been removed from the data model as well.

Appreciate any additional feedback. Thanks.