Not applicable

Setting Memory Limits to avoid memory starvation

Hi All,

Is there a way to set memory limits on QlikView servers? We're currently running at 99-100% utilization and would like to cap it at roughly 90-95% to maintain performance. Otherwise the OS is starved of memory, which severely affects some of our other running applications and services (e.g. RDP).

We only have QlikView applications running on the server. Sometimes multiple documents consume the memory, and sometimes it's a single QlikView application. Our developers connect remotely to the development server; working on individual machines is not possible because of the huge size (4-5 GB) of some of the documents and the amount of calculation within them.

Some measures we have taken internally to mitigate this:

1. Optimize the script code.

2. Follow general coding standards.

3. Open documents without data, or via the AccessPoint, when we only need to see the script.

4. Run the application in debug mode with sample data rather than reloading constantly.

5. Announce heavy dashboard reloads to the team and ask them not to load other heavy dashboards at the same time.

6. Use incremental reloads.
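
A minimal sketch of point 6, assuming a simple insert-only source table keyed on OrderID (all table, field and connection names here are illustrative, not from the original post):

```
// Illustrative incremental reload: fetch only rows changed since the
// last run, then merge with the previously stored QVD and write it back.

LET vLastReload = '2015-01-01 00:00:00';  // in practice, read this from a variable QVD

Orders:
SQL SELECT OrderID, OrderDate, Amount
FROM Orders
WHERE ModifiedDate >= '$(vLastReload)';

// Append historical rows that were not re-fetched above
Concatenate (Orders)
LOAD OrderID, OrderDate, Amount
FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

STORE Orders INTO Orders.qvd (qvd);
```

The QVD load with `WHERE NOT Exists(...)` stays an optimized load, so only the delta hits the database.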

Any help would really be appreciated.

Regards,

Janaki Venkitasubramanian

6 Replies
Peter_Cammaert
Partner - Champion III

The memory usage of QVS & Co can be controlled with QMC settings called "Working Set". There are Low and High level settings. The High level setting determines the maximum percentage of total server RAM that QlikView will ever use (and then never release).
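
For reference, a sketch of where these settings live in the QMC (path and default values from memory; verify on your QlikView Server version):

```
QMC > System > Setup > QlikView Servers > [your server] > Performance
    Working Set Limits:
        Low  (%) = 70   -- above this, QVS starts purging cached results
        High (%) = 90   -- ceiling on the share of physical RAM QVS will allocate
```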

I was going to say "lower the High level setting to 85% so as to leave sufficient room for your OS", but that is the wrong approach. If your server is currently running at 99-100% RAM utilization, it means your QV documents genuinely need that amount of RAM. Lowering the High level setting will most probably hurt document performance in a big way, and that's not something your users will appreciate.

So the sensible advice is, as always: add more RAM; your QlikView documents need it. That is also cheaper than a big project to refactor all your documents, although the latter should certainly be part of your development workflow.

You don't do development work on your production server, do you?

Best,

Peter

Not applicable
Author

Hi Peter,

Thanks once again for your reply.

Currently we have two servers:

1. Development/QA

2. Production

My next question: if the Working Set limits won't work as needed and we have to add more RAM instead, what happens when we add more developers and more dashboards? At the end of the day, we can't just keep adding RAM indefinitely.

What model would you recommend in such a situation?

Regards,

Janaki

Peter_Cammaert
Partner - Champion III

Indeed, there is a practical limit of about 256GB in most current servers.

However, there are other, more pragmatic techniques to manage a development server under stress. For example:

  • In the Dev/QA QVS, unload documents that haven't been in use for some time (say 15 minutes or less). The QA portal isn't in use all the time, is it?
  • Use a special QDS data layer with a limited amount of data, just for development and testing. Let developers load the full set only when doing final checks and performance tests (in QA?). Do plan the latter.
  • Schedule development tracks so that the memory-critical phases do not coincide. Not everybody is reloading every document at the same time, are they?
  • Institute a rule for your developers that every Cartesian product on the development machine earns them an obligation to buy a round at the end of the week.

Peter

Not applicable
Author

Hi Peter,

  • In the Dev/QA QVS, unload documents that haven't been in use for some time. - No, the AccessPoint is not in use all the time.
  • Use a special QDS data layer with a limited amount of data, just for development and testing. - Currently we try to load sample data and use debug mode to test things.
  • Schedule development tracks so that the memory-critical phases do not coincide. - No, not everybody is reloading at the same time. But since our documents are really big, just keeping a few of them open by different developers creates issues, and if the server reaches its High limit it disconnects everyone.
  • Institute a rule that every Cartesian product earns an obligation to buy a round at the end of the week. - Wish that could be the case.

Thank you for your comments.

Regards,

Janaki

Colin-Albert

Is the data in your model optimised? This blog shows how changing your data can affect memory usage.

The Importance Of Being Distinct

If you are storing timestamps when only dates are needed, this can cause a massive increase in RAM usage. Splitting the timestamp into separate date and time fields will reduce memory usage.
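
The timestamp split is usually done in the load script with Floor()/Frac(), along these lines (table and field names illustrative):

```
// Splitting a timestamp into a date field and a time field sharply cuts
// the number of distinct values QlikView must store in its symbol tables.
Events:
LOAD
    EventID,
    Date(Floor(EventTimestamp)) AS EventDate,
    Time(Frac(EventTimestamp))  AS EventTime   // drop this line if only dates are needed
FROM Events.qvd (qvd);
```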

Similar techniques can be applied to text data, e.g. splitting a telephone number into three fields for country, area and local number will store the data more efficiently.
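
As a sketch of the phone-number split, assuming the numbers use a consistent delimited format such as "44-161-4960123" (the format and all names here are assumptions for illustration):

```
// Splitting a delimited phone number into country/area/local parts;
// each part has far fewer distinct values than the full number.
Contacts:
LOAD
    ContactID,
    SubField(Phone, '-', 1) AS PhoneCountry,
    SubField(Phone, '-', 2) AS PhoneArea,
    SubField(Phone, '-', 3) AS PhoneLocal
FROM Contacts.qvd (qvd);
```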

Can you reduce the number of distinct values in your data?

Not applicable
Author

I think you should also be able to reduce memory needs by splitting those huge documents into smaller documents with different levels of detail, e.g. Management - high level (aggregated data), Operator - low level (a lot more detail). That way management, for example, will open smaller files that use less RAM and CPU.
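
One way to sketch the management-level copy is an aggregating load in its own document's script (all names illustrative):

```
// High-level document: keep only monthly aggregates, not line-level detail.
MgmtSales:
LOAD
    MonthName(OrderDate) AS OrderMonth,
    Region,
    Sum(Amount)          AS TotalAmount
FROM SalesDetail.qvd (qvd)
GROUP BY MonthName(OrderDate), Region;
```

The Operator document would load the full detail QVD instead; only users who truly need line-level data pay the RAM cost.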

Also, sometimes I was providing more information than was needed because the business requirements had changed. Establishing the exact minimum level of detail required, and reducing the data to that level, should also lead to better use of RAM and CPU.