Not applicable

More efficient way of calculating this?

Hi all,

I have a .qvw which contains no synthetic keys or weird joins, but it gives me an out-of-memory message when I select more than 8 weeks.

I know it is this expression:

=Sum(Achieved) /
(Sum(Target) * Avg([Department Weighting]) * Avg([Faculty Weighting]) * Avg([Course Weighting]) *
Avg([Domicile Weighting]) * Avg([Year of Study Weighting]))

that is causing the problem, but I don't know of another way to write it.

If anyone knows how I could go about changing this expression to prevent the memory error, I'd appreciate any tips or help.

Many thanks!

1 Solution

Accepted Solutions
Jason_Michaelides
Partner - Master II

Try bringing all of those small tables into your main Student table either with LEFT JOINS or ApplyMap().  Bring Engagement_Type into the Engagement table.

This will reduce the joins significantly and should improve things.  Remember that because of QlikView's Level 1 storage it is the DISTINCT values of a field that make the difference, not the number of records (at least not until you run into hundreds of millions), so a very denormalised structure is best.

Hope this helps,

Jason

View solution in original post

8 Replies
Not applicable
Author

Break this calculation down into steps and use variables where possible.

It's the number of iterations this calculation needs to make through the entire data set that's causing the latency and the error message to appear.
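
A minimal sketch of what this could look like (the variable name vWeightProduct is illustrative only, not from the original post). Hold the weighting part of the expression in a variable, defined in the load script or the Variable Overview:

SET vWeightProduct = Avg([Department Weighting]) * Avg([Faculty Weighting]) * Avg([Course Weighting]) * Avg([Domicile Weighting]) * Avg([Year of Study Weighting]);

Then use dollar-sign expansion in the chart expression:

=Sum(Achieved) / (Sum(Target) * $(vWeightProduct))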

Jason_Michaelides
Partner - Master II

Hmmm - the expression itself doesn't look like it should be problematic. How many records is 8 weeks? Maybe there's an issue with your data model - are the different fields you're averaging separated by multiple joins?

Can you post a sample to look at?

jonathandienst
Partner - Champion III
Partner - Champion III

Hi

I suspect that your model structure may be the problem. How many degrees of separation do you have between the tables that contain the fields that you are aggregating? Are any of these fields key fields?

Have you tried leaving out some of the aggregates to determine if the problem is localised to one particular sum/avg?

Jonathan

Logic will get you from a to b. Imagination will take you everywhere. - A Einstein
Not applicable
Author

This is what the table structure looks like.

I tried commenting out parts of the expression to isolate the problematic Avg(field). It seems to be

*Avg([Year of Study Weighting])

that causes the final memory error.

Jason_Michaelides
Partner - Master II

Try bringing all of those small tables into your main Student table either with LEFT JOINS or ApplyMap().  Bring Engagement_Type into the Engagement table.

This will reduce the joins significantly and should improve things.  Remember that because of QlikView's Level 1 storage it is the DISTINCT values of a field that make the difference, not the number of records (at least not until you run into hundreds of millions), so a very denormalised structure is best.
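
A rough sketch of what this could look like in the script (the table names, field names and QVD sources below are illustrative assumptions, not taken from your actual model):

// Map a small weighting table into the Student table with ApplyMap()
DeptWeightMap:
MAPPING LOAD Department, [Department Weighting]
FROM DepartmentWeighting.qvd (qvd);

Student:
LOAD *,
     ApplyMap('DeptWeightMap', Department, Null()) AS [Department Weighting]
FROM Student.qvd (qvd);

// Or LEFT JOIN a weighting table straight onto Student
LEFT JOIN (Student)
LOAD Faculty, [Faculty Weighting]
FROM FacultyWeighting.qvd (qvd);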

Hope this helps,

Jason

Not applicable
Author

Thanks, I'll give that a shot.

Not applicable
Author

I agree with Jason. Using mappings should significantly improve your calculation speeds - I would also map your Engagement_Type and Week_Weighting fields.
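
A small sketch of the idea for the Week_Weighting field (the Week key and QVD names are assumptions):

WeekWeightMap:
MAPPING LOAD Week, Week_Weighting
FROM WeekWeighting.qvd (qvd);

Engagement:
LOAD *,
     ApplyMap('WeekWeightMap', Week, Null()) AS Week_Weighting
FROM Engagement.qvd (qvd);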

Not applicable
Author

Thanks for all of your help!

I created left joins with all of those smaller weighting tables.

The entire app runs much more smoothly now.