Not applicable

tip: memory limits in large environments

My client has several large machines (128+ GB of memory, 16+ processor cores) for developer use. There can be tens of instances of the qv.exe developer tool running at the same time without problems.

That is, until somebody puts a loop in the script, loads several million rows, and consumes all available virtual memory. In that case programs (and Windows services) randomly fail with variations of "out of memory" and "not enough resources". In rare cases the memory shortage is so bad that Windows corrupts the disk and registry.

Here is a very good remedy, inspired by the Unix "ulimit": kill all processes consuming more than X amount of memory. Of course it is hard to pick the right limit. With current amounts of data, well-behaved documents use up 30 GB of memory, while bad scripts consume in excess of 100 GB and crash. The sanity limit must lie somewhere in between.

powershell -Command "& { Get-Process qv | Where-Object { $_.PeakVirtualMemorySize64 -gt 50000000000 } | Stop-Process }"

PowerShell is not particularly fast, but other command-line tools (e.g. from Sysinternals) do not display memory usage correctly for 64-bit processes.
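
If it helps to see which sessions get killed, the same idea can be written as a small script file with simple logging. This is only a sketch; the file name Kill-RunawayQv.ps1, the log path and the 50 GB threshold are assumptions, not part of the original setup:

# Kill-RunawayQv.ps1 (hypothetical name) - kill runaway qv.exe instances and log what was killed
$limitBytes = 50000000000                      # roughly 50 GB, adjust to the environment
$logFile    = 'C:\Scripts\Kill-RunawayQv.log'  # assumed log location

Get-Process qv -ErrorAction SilentlyContinue |
    Where-Object { $_.PeakVirtualMemorySize64 -gt $limitBytes } |
    ForEach-Object {
        $line = '{0} killed PID {1}, peak virtual memory {2:N0} bytes' -f (Get-Date), $_.Id, $_.PeakVirtualMemorySize64
        Add-Content -Path $logFile -Value $line
        Stop-Process -Id $_.Id -Force
    }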

The script above is scheduled to run every 10 minutes and has improved the availability of the servers a lot. Of course, here and there a developer starts yelling, but so far there has always been something wrong with those QVW files. My favourite so far: exporting 50 million rows (with calculated columns) to Excel.
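
For reference, a 10-minute schedule like this can be created with the built-in schtasks tool. A minimal sketch, assuming the one-liner has been saved to a script file; the task name and path are hypothetical:

schtasks /Create /TN "Kill-RunawayQv" /SC MINUTE /MO 10 /RU SYSTEM /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Kill-RunawayQv.ps1"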

-Alex



2 Replies
Not applicable
Author

Hi

What is your opinion about using the Windows System Resource Manager feature on Windows Server 2008 R2 Enterprise?

Not applicable
Author

Hi,

Yes, it seems to do the same thing. Many thanks.

-Alex