shane_spencer
Specialist

High Page File Usage

I've been given the unenviable task of supporting a QlikView environment that I didn't build, and I have very little background in QlikView.

After only a few weeks the users are complaining of performance problems.

I've had a look at the Performance Metrics, and when (Memory) Committed Bytes In Use (%) gets over 90% we start to see the Page File being increasingly used, which is an obvious problem.
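
As a rough way to keep an eye on those counters outside the QMC, a small script like the Python sketch below could poll \Memory\% Committed Bytes In Use and \Paging File(_Total)\% Usage through the built-in Windows typeperf utility and flag when commit usage crosses 90%; the threshold and polling interval here are just placeholders, nothing QlikView-specific.

```python
# Sketch only: poll the two Windows performance counters mentioned above via
# typeperf and warn when commit usage crosses a threshold. The threshold and
# interval are arbitrary placeholders.
import csv
import subprocess
import time

COUNTERS = [
    r"\Memory\% Committed Bytes In Use",
    r"\Paging File(_Total)\% Usage",
]
THRESHOLD = 90.0   # % commit usage at which paging was observed to ramp up
INTERVAL = 60      # seconds between samples

def sample_counters():
    # typeperf -sc 1 prints a CSV header row plus one data row for the counters.
    out = subprocess.run(
        ["typeperf", "-sc", "1", *COUNTERS],
        capture_output=True, text=True, check=True,
    ).stdout
    # Keep only rows that have one cell per counter plus the timestamp column.
    rows = [r for r in csv.reader(out.splitlines()) if len(r) == len(COUNTERS) + 1]
    header, data = rows[0], rows[1]
    return dict(zip(header[1:], (float(v) for v in data[1:])))

if __name__ == "__main__":
    while True:
        committed, pagefile = sample_counters().values()
        print(f"committed={committed:.1f}%  pagefile={pagefile:.1f}%")
        if committed > THRESHOLD:
            print("WARNING: commit usage above threshold - expect paging")
        time.sleep(INTERVAL)
```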

I found this rather good document, DS-Technical-Brief-QlikView-Server-Memory-Management-and-CPU-Utilization-EN.pdf, which discusses the Working Set, Paging, etc.

Our Working Set Min is 90% and our Max is 97%.

Our QVS Servers are a clustered pair with 192 GB of RAM.
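
For context (and assuming the 192 GB is the physical RAM of each node), here is the quick arithmetic on what those working set percentages mean in absolute terms, since the limits described in the brief are applied as percentages of physical RAM:

```python
# Quick arithmetic only: translate the working set percentages from the post
# into absolute figures, assuming 192 GB of physical RAM per node.
RAM_GB = 192
LOW_PCT, HIGH_PCT = 90, 97   # Working Set Min / Max as configured

low_gb = RAM_GB * LOW_PCT / 100    # 172.8 GB - level QVS tries to stay below
high_gb = RAM_GB * HIGH_PCT / 100  # 186.2 GB - upper limit for QVS memory
print(f"Working Set Low:  {low_gb:.1f} GB")
print(f"Working Set High: {high_gb:.1f} GB")
print(f"Headroom left for the OS and everything else: {RAM_GB - high_gb:.1f} GB")
```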

These servers are pretty large already so there's a limit to how much more memory I can add, plus I don't want to simply throw hardware at the problem without understanding it.

My question is: if I reduce the Working Set Min (and Max), will QVS start managing its use of memory (clearing cache etc. as described in the above PDF) and prevent Paging, or, as I fear, will the servers simply start Paging at a lower overall memory usage?

21 Replies
arish_delon
Creator

Hi Shane,

You are on the right path.

In my world I have the following challenges: "user behavior" and poor design of the apps.

We have flexible tables (poor design: no restriction on how much data you can load and/or export).

Analysts love their Excel, and they want to dump data from QV apps to Excel.

Initially it was 100,000 records, and it grew to 1 million records and more. Everyone wastes their time creating their own reports and then validating each other's reports to find out whether the data is accurate. What a waste of time and resources.


We are educating our consumers to use the online dashboards and reports to do their analysis on the fly instead of downloading to Excel.

Memory usage and recycling used memory have been a challenge.

We have made the following changes.

  • Document Timeout is set to 30 minutes (screenshot: Doc_timeout.JPG)

  • Performance tuning (screenshot: Performance.JPG)

  • Stop and start QVS once a day (screenshot: Scheduler.JPG); a scheduling sketch follows this list

  • Dump cached memory every 3 hours, starting from midnight (screenshot: Settings_ini.JPG)
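
For the daily stop/start, the kind of script a Windows scheduled task could run is sketched below; the service name "QlikviewServer" is an assumption, so confirm the real name on your node (for example with sc query) before scheduling anything like this.

```python
# Sketch only: restart the QlikView Server Windows service once a day via a
# scheduled task. The service name below is an assumption - verify it first.
import datetime
import subprocess
import time

SERVICE = "QlikviewServer"   # assumed service name; check sc query / services.msc

def run(*args):
    print(datetime.datetime.now().isoformat(), " ".join(args), flush=True)
    subprocess.run(args, check=True)

if __name__ == "__main__":
    run("net", "stop", SERVICE)    # net stop blocks until the service reports stopped
    time.sleep(30)                 # give QVS a moment to release its working set
    run("net", "start", SERVICE)
```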

These steps have led to better memory management. We have also performed a hardware upgrade, moving from 256 GB to a 512 GB cluster model with Publisher on a different server. We still suffer business disruption because of user behavior. We have done as much as possible from the server admin position.

We are looking at audit tables to tell us who the individuals are that are still dumping large sets of data to txt or Excel.
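
As a starting point for that, a rough sketch like the Python below could tally export-style events per user from the QVS audit logs; the log path, file pattern, and keyword list are assumptions that would need adjusting to the actual audit log layout in your environment.

```python
# Sketch only: count export-style audit events per user. Path, file pattern
# and keywords are assumptions - adjust to your own audit log layout.
import csv
import glob
from collections import Counter

LOG_GLOB = r"C:\ProgramData\QlikTech\QlikViewServer\AUDIT_*.log"   # assumed location
EXPORT_KEYWORDS = ("export", "send to excel")                      # assumed wording

exports_per_user = Counter()

for path in glob.glob(LOG_GLOB):
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            # Normalise column names so this survives minor header differences.
            fields = {k.lower(): v for k, v in row.items()
                      if isinstance(k, str) and isinstance(v, str)}
            user = fields.get("user", "unknown")
            text = " ".join(fields.values()).lower()
            if any(kw in text for kw in EXPORT_KEYWORDS):
                exports_per_user[user] += 1

for user, count in exports_per_user.most_common(20):
    print(f"{count:6d}  {user}")
```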

Then we will have a conversation with these individuals: why are they not following the best-practices model, and do they have valid business requirements that the dashboards do not meet?

shane_spencer
Specialist
Author

Since this topic was closed we've moved to a new environment with 2x 768 GB QVS servers. We still do regular restarts of QVS, but we've stopped doing the Clear Cache as it seemed to lead to instability of QVS if users were on the system. Most importantly, though, after butting heads with the business for months we got someone in from Qlik to review the Documents, who basically said the same thing we'd been saying about poor design, and now they've redesigned the Documents at long last.