Hello everyone,
I'm working on Qlik and I am facing a major performance problem.
I built some heavy calculations, and the application now takes several seconds (sometimes minutes, or it hangs completely) to refresh after every click!
I have spent several days on this and cannot find a solution, so I really need your help!
I have 3 main tables to calculate the consumption of articles for each finished product in each workshop (by month):
- 1st table: Consumption (Q = quantity) by article, workshop, site, month and year (for example in m3)
- 2nd table: Production (v = volume of processed tons) of product by workshop, month and year (for example in tons)
- 3rd table: Theoretical consumption (e_th) by product AND article (for example in m3/ton). It's a kind of theoretical bill of materials, updated only once a year. But it's not the true consumption by product...
- and another table, similar to the 1st one, with the fixed consumption (F = baseline) by article, workshop, site, month and year
So the true consumption by article and by product (q) is not known (we want q = v x e), and I need to calculate e like this:
e = e_th x (Q - F) / (Qth - F)
with Qth = SUMPRODUCT(v x e_th), i.e. the sum over all products of v x e_th
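To make the formula concrete, here is a small worked example with made-up numbers (two products P1 and P2 consuming one article A in the same workshop/month — all values are hypothetical, just to show the arithmetic):

```
v(P1) = 10 t,  e_th(P1, A) = 2 m3/t
v(P2) =  5 t,  e_th(P2, A) = 1 m3/t

Qth = 10 x 2 + 5 x 1 = 25 m3        (theoretical total for article A)

Measured: Q = 30 m3, F = 5 m3

Scaling factor = (Q - F) / (Qth - F) = (30 - 5) / (25 - 5) = 1.25

e(P1, A) = 2 x 1.25 = 2.5  m3/t
e(P2, A) = 1 x 1.25 = 1.25 m3/t
```

So each product's theoretical rate is scaled by the same factor, derived from the measured consumption of the whole workshop.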
I implemented this calculation. It works, but the response time is very high...
```
Sum(
    Aggr($(e_th), [SITE NAME], WORKSHOP, MONTH, ARTICLE, PRODUCT)
    *
    Aggr($(Q) - $(F), [SITE NAME], WORKSHOP, MONTH, ARTICLE, PRODUCT)
    /
    Aggr(
        Sum(TOTAL <[SITE NAME], WORKSHOP, MONTH, ARTICLE>
            Aggr($(e_th) * Sum({$<YEAR = {$(Year_N)}>} $(v)),
                 [SITE NAME], WORKSHOP, MONTH, ARTICLE, PRODUCT))
        -
        Sum({$<YEAR = {$(Year_N)}>} $(F)),
        [SITE NAME], WORKSHOP, MONTH, ARTICLE, PRODUCT)
)
```
Do you think it's possible to do this calculation directly in the load script, so the table above is precalculated?
If you have any ideas, please answer... it would be very helpful!
Thanks a lot
Would you be able to share a sample to look at the issue you are facing?
Hello Sunny T,
Thank you for your quick answer.
Unfortunately I can't share a sample...
Do you have any idea of how I can improve it?
This is my current data model:
Try using variables and check whether that helps.
It's no surprise that your expression is slow, given the heavy use of Aggr() functions. I think you need at least some adjustments to the data model, and maybe even some kind of pre-calculation. What does your data model look like?
- Marcus
You used a link-table data model, which from a performance point of view is not the best choice - especially with so many Aggr() functions you will build several huge virtual tables (which takes time) - see: Logical Inference and Aggregations. Therefore I suggest changing the data model into a star schema, or using only a single fact table. Here you will find more input: Concatenate vs Link Table (instead of, or in addition to, that you could use joins or mapping to create the fact table).
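To illustrate the pre-calculation idea: the Qth denominator only depends on the yearly BOM and the production volumes, so it can be computed once in the load script instead of in a nested Aggr() at chart time. The sketch below assumes hypothetical table and field names (BOM.qvd, Production.qvd, e_th, v, etc.) - adjust them to your own model:

```
// Hypothetical names throughout - adapt to your data model.
// 1) Yearly theoretical BOM: e_th by PRODUCT and ARTICLE.
BOM:
LOAD PRODUCT, ARTICLE, e_th
FROM BOM.qvd (qvd);

// 2) Monthly production volumes, expanded to one row per consumed
//    article by joining the BOM onto them.
TmpFact:
LOAD [SITE NAME], WORKSHOP, PRODUCT, MONTH, YEAR, v
FROM Production.qvd (qvd);

Left Join (TmpFact)
LOAD PRODUCT, ARTICLE, e_th
RESIDENT BOM;

// 3) Precalculate Qth = Sum(v * e_th) per site/workshop/month/article.
Qth:
LOAD [SITE NAME], WORKSHOP, MONTH, YEAR, ARTICLE,
     Sum(v * e_th) AS Qth
RESIDENT TmpFact
GROUP BY [SITE NAME], WORKSHOP, MONTH, YEAR, ARTICLE;

DROP TABLE TmpFact;
```

With Qth available as a plain field in the model, the chart expression no longer needs the inner TOTAL/Aggr() construction for the denominator, which is where most of the cost of nested Aggr() comes from.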
- Marcus