Not really sure where to start here, so I'd like some advice please. Over the course of the day the qvs.exe service's memory usage gradually climbs - I just restarted the service an hour ago because usage had reached 12GB; it starts at about 400MB.
Can anyone give me any pointers on where to look to diagnose what's happening, and how to fix it? Or is this just normal? Perhaps it's a particular report or a particular user that's eating up the memory?
As Giuseppe Novello correctly said, this is normal behaviour. QVS needs a certain amount of memory for your document(s), and the rest, up to the Working Set Low limit, is used for cached results. When memory reaches that level, QVS flushes the least relevant results from the cache as memory is needed for new results (or documents).
This is only an issue if you're experiencing performance problems. Things you need to look at are:
How much memory does the server have?
How much memory is required to hold all your Documents in Memory?
(temporarily set all documents to pre-loaded, then restart QVS and see what memory utilisation climbs to)
How much memory is needed for other processes? - typically allow 5-10GB
Does having Working Set Low at 70% allow enough memory for other things? (or conversely does it leave too much unused)
When you've set a suitable Working Set Low how much does this allow for cached results?
What happens when you reach Working Set Low? Do you see Performance degradation? Instability?
Does the paging file start being used? (this is bad)
How long does it take to fill up on a typical / busy day?
(nb If it is an issue you can schedule QVS restarts or cache flushes)
Do you have (frequent) reloads? If so, what do they do to memory? Do they cause a conflict if QVS memory is already at Working Set Low?
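To make the sizing questions above concrete, here's a rough back-of-envelope sketch in Python. All the figures (server RAM, document footprint, other-process allowance) are placeholders you'd swap for your own measurements; 70% is the Working Set Low setting discussed above:

```python
# Rough memory-budget sketch for a QVS host.
# All figures below are illustrative placeholders, not measurements.

def qvs_memory_budget(total_ram_gb, working_set_low_pct, doc_footprint_gb):
    """Return (cache headroom for QVS, memory left for OS/other processes)."""
    # QVS will grow up to the Working Set Low ceiling, then start flushing cache.
    qvs_ceiling_gb = total_ram_gb * working_set_low_pct / 100
    cache_headroom_gb = qvs_ceiling_gb - doc_footprint_gb
    os_slack_gb = total_ram_gb - qvs_ceiling_gb
    return cache_headroom_gb, os_slack_gb

headroom, os_slack = qvs_memory_budget(
    total_ram_gb=64,          # placeholder server size
    working_set_low_pct=70,   # Working Set Low at 70%, as discussed above
    doc_footprint_gb=20,      # measured with all documents pre-loaded
)
print(f"Cache headroom: {headroom:.1f}GB")
print(f"Left for OS/other processes: {os_slack:.1f}GB")
# If os_slack falls below the 5-10GB allowance from the checklist,
# Working Set Low is probably set too high.
```

If the slack left over is below the 5-10GB other-process allowance, lowering Working Set Low is the usual fix; if it's far above, you're leaving cache memory unused.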
I've written a QlikView Performance Monitoring document which details which Perfmon metrics to collect, and it includes some applications that can be used to analyse the data - but you need some QlikView developer skills to tweak the code to your own purposes.
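As a minimal sketch of what analysing that Perfmon data can look like: the CSV below is illustrative sample data I've made up (the counter names and values are assumptions, not from the document above), shaped like a typical Perfmon/relog CSV export - a timestamp column followed by one column per counter:

```python
import csv
import io

# Illustrative Perfmon-style CSV export (sample data, not real measurements).
SAMPLE = io.StringIO(
    '"(PDH-CSV 4.0)","\\\\SERVER\\Process(QVS)\\Working Set","\\\\SERVER\\Memory\\Available MBytes"\n'
    '"01/01/2024 09:00:00","419430400","8192"\n'      # ~400MB just after restart
    '"01/01/2024 17:00:00","12884901888","2048"\n'    # ~12GB by end of day
)

rows = list(csv.reader(SAMPLE))
header, samples = rows[0], rows[1:]

# Find the QVS working-set counter column by name, then track its peak.
ws_col = next(i for i, name in enumerate(header) if "Working Set" in name)
peak_gb = max(float(row[ws_col]) for row in samples) / 1024**3
print(f"Peak QVS working set: {peak_gb:.0f}GB")
```

Tracking the working-set counter over a full day tells you how quickly you approach Working Set Low, which feeds directly into the questions in the checklist above.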
n.b. We've got a 192GB server where the base memory required is 60-70GB and a single document is 10-12GB when loaded into memory (4x its size on disk), and we reach Working Set Low in less than half a day at peak periods. We often have problems when we reach Working Set Low, but in a smaller environment such as yours I would be less concerned about that happening.
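Plugging the figures from that environment into the same kind of back-of-envelope arithmetic (the 4x disk-to-RAM ratio is just the rule of thumb observed there, not a fixed constant - compressed QVW files expand very differently depending on the data model):

```python
# Back-of-envelope check using the figures quoted above: 192GB server,
# 60-70GB base load, ~4x on-disk-to-in-memory expansion, 70% Working Set Low.
TOTAL_RAM_GB = 192
BASE_LOAD_GB = 65        # midpoint of the 60-70GB base quoted above
DISK_TO_RAM_RATIO = 4    # observed rule of thumb, not a guaranteed constant

def in_memory_estimate_gb(size_on_disk_gb):
    """Estimate a document's loaded footprint from its QVW size on disk."""
    return size_on_disk_gb * DISK_TO_RAM_RATIO

doc_gb = in_memory_estimate_gb(2.75)   # ~2.75GB on disk -> ~11GB loaded
ceiling_gb = TOTAL_RAM_GB * 0.70       # Working Set Low at the 70% default
print(f"Estimated document footprint: {doc_gb:.0f}GB")
print(f"Cache room before flushing starts: {ceiling_gb - BASE_LOAD_GB - doc_gb:.1f}GB")
```

The useful habit here is estimating loaded size from disk size before publishing a new document, so you know in advance whether it pushes the server toward Working Set Low.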
Thank you both, that's really useful information. I was convinced there was some sort of memory leak. I have uploaded quite a few large documents recently and admittedly the data models haven't been perfect so that's probably why it seems worse now.
I'll have a look at the server based on your advice.