matthewjbryant
Creator II

Decreasing the RAM footprint of fields

I've been using the Document Analyzer that Marcus_Sommer helpfully pointed me towards.

After dropping a lot of fields I didn't need, I've managed to make some epic savings on the RAM footprint of my QVWs, but I still have some number fields that are taking up over 1 MB each. I've made sure they are set to a numeric format through the Document Settings, but this doesn't seem to help.

Is there anything else I can do?


7 Replies
Peter_Cammaert
Partner - Champion III

Major savings can be made by reducing field value cardinality. Do you have many DateTime fields? Can you split them into parts, or drop the time part entirely? The fewer distinct values a field holds, the smaller its memory footprint.
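For example, a rough sketch of splitting a timestamp in the load script; the field and file names here are purely illustrative:

LOAD
    Date(Floor(OrderTimestamp)) AS OrderDate,  // integer date serial: far fewer distinct values
    Time(Frac(OrderTimestamp))  AS OrderTime,  // keep the time part separately only if you need it
    Amount
FROM Transactions.qvd (qvd);

If nobody analyses by time of day, drop the OrderTime line altogether.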

Another item to consider is the amount of historical data in your document. If you load 10 years of data but most users only go back a couple of years, limit the table load to, say, the last three years.
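Something along these lines, again with illustrative names:

LOAD *
FROM Transactions.qvd (qvd)
WHERE TransactionDate >= AddYears(Today(), -3);  // keep only the last three years

Note that a WHERE clause like this makes the QVD load non-optimized; if reload time matters, a WHERE Exists() construction can keep the load optimized.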

Best,

Peter

petter
Partner - Champion III

Dual fields take a lot of space. Using AutoNumber() can also get you space savings; if you create keys from concatenated strings, you should really consider this approach.
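A minimal sketch, with made-up key parts:

LOAD
    AutoNumber(CompanyID & '|' & EmployeeID) AS %EmployeeKey,  // sequential integer replaces the long composite string
    Amount
FROM Transactions.qvd (qvd);

Bear in mind that AutoNumber() values are only stable within a single reload, so use it for keys that live inside one document, not for keys shared across separately generated QVDs.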

matthewjbryant
Creator II
Author

Thanks Peter. I have no need for DateTime fields, so all Date fields have no time element. The fields causing me the most trouble are transaction monetary value fields. I'm loading data from 2012 onwards, but we're talking about a decent-sized manufacturing company, with detail down to the individual employee transaction. That's 2 million+ rows, most of which have distinct values.

Is there anything else I can do, or is this something I need to accept?

petter
Partner - Champion III

Hi Matthew,

What kind of precision do you need on the monetary value? Maximum, minimum and number of decimals?

matthewjbryant
Creator II
Author

I guess this is a problem, because I can't put a limit on the max or min. I could limit it to 2 decimal places. Would that make much difference?

petter
Partner - Champion III

Reducing unnecessary precision can lead to more efficient RAM usage.

So what is the maximum monetary transaction value you need to cater for when getting the data into QlikView, i.e. in your load script? Never mind the calculation and analysis parts afterwards; aggregated sums and so forth can use any precision.
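For instance, a sketch that caps a transaction amount at two decimal places; the field names are assumptions:

LOAD
    TransactionID,
    Round(Amount, 0.01) AS Amount  // cap precision at two decimals
FROM Transactions.qvd (qvd);

Rounding in the script reduces the number of distinct values QlikView has to store in its symbol table, which is where the RAM saving comes from.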

matthewjbryant
Creator II
Author

I'll look at the data and have a think.

Thanks for the help.