When I started working on Performance Management of a large number of Windows Servers I was a little surprised to find that there is no "off the shelf" solution out there to process and view the Windows Perfmon (Performance Monitor) logs. In our organisation we had a few in-house solutions, mainly based on Excel, that allowed you to view the logs for a server over a short period of time (i.e. a day), but I needed a more robust and flexible solution. So, as a stranger to QlikView, I started to put together a solution, with a little help from the community here when I got stuck, and have decided to share it in the hope that anyone else with similar requirements can benefit. Some knowledge of QlikView, or the willingness to learn a bit, is needed.
As a prerequisite I try to ensure that all the Perfmon Logs I collect are of a uniform nature:
Have a naming convention: servername_YYYYMMDD
Logs are saved as 'csv' files
Collect metrics at 5 minute intervals*
Run from midnight (00:00:01) to midnight (23:55:01)
*After some consideration I decided that 5 minute intervals were about the right frequency.
n.b. The flexibility of QlikView allows me to collect different Metrics (aka Counters) depending on the server, though I did have a basic building block (see appendix).
A) In the archive 'Perfmon Processing.rar' there are 4 types of QVW that I use for processing the Perfmon 'csv' logs:
The four qvw's in the archive essentially do the same thing: process the raw data (the Perfmon csv's) and save it as a QVD. Each runs just after midnight and appends the previous day's logs to the current month's "part processed" QVD. It then further processes the data to split out the Server Name, the Main Metric (i.e. Processor, Memory, etc.) and the Sub Metric (i.e. Queue Length, etc.) from the column headings, filters out any junk, and saves the result to a QVD in a standard format that can be used for MI. The main difference between the four examples is where the Perfmon log files are located in relation to the defined (root) directory.
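To illustrate the splitting step: a Perfmon csv column heading looks like '\\SERVERNAME\Processor(_Total)\% Processor Time'. A minimal load-script sketch of pulling one apart is below; the table and field names are my own illustrations, not the ones in the qvw's.

```
// Illustrative sketch only - table and field names are assumptions.
// A typical Perfmon heading: \\MYSERVER\Processor(_Total)\% Processor Time
Counters:
LOAD
    SubField(Heading, '\', 3)                        as ServerName,  // MYSERVER
    SubField(SubField(Heading, '\', 4), '(', 1)      as MainMetric,  // Processor
    TextBetween(SubField(Heading, '\', 4), '(', ')') as Instance,    // _Total
    SubField(Heading, '\', 5)                        as SubMetric    // % Processor Time
RESIDENT RawHeadings;
```

Splitting on '\' gives two empty elements first (from the leading '\\'), which is why the server name is element 3.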
1) Perfmon_processing.qvw
This will search for all (csv) files in the defined (root) directory only
2) Perfmon_processing (with directories).qvw
This will search all the directories in the defined (root) directory for csv files
3) Perfmon_processing (with sub-directories).qvw
This uses a sub-routine to search all directories AND sub-directories in the defined (root) directory for csv files
4) Perfmon_processing (with apply-map).qvw
This is the same as 1), except that it has been modified because DNS names were used when collecting the metrics instead of server names, so I used a mapping load (ApplyMap) to replace them. The date stamp on those Perfmon log files was also in a different format, so I changed the script to accommodate that.
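For reference, the sub-routine approach in 3) can be sketched along the lines of the classic recursive "DoDir" pattern, and a mapping load like the one in 4) slots into the same load statement. Everything here (variable names, the mapping file, the file-name parsing) is illustrative rather than a copy of the actual scripts; it leans on the servername_YYYYMMDD naming convention above.

```
// Illustrative sketch - names, paths and parsing are assumptions.
DNS2Server:
MAPPING LOAD DNSName, ServerName
FROM [DNS_Lookup.csv] (txt, utf8, embedded labels, delimiter is ',');

SUB DoDir (Dir)
    FOR EACH File IN FileList(Dir & '\*.csv')
        PerfmonRaw:
        LOAD *,
             // file names follow servername_YYYYMMDD, so split on '_'
             ApplyMap('DNS2Server', SubField(FileBaseName(), '_', 1)) as ServerName,
             Date(Date#(SubField(FileBaseName(), '_', 2), 'YYYYMMDD')) as LogDate
        FROM [$(File)] (txt, utf8, embedded labels, delimiter is ',');
    NEXT File
    FOR EACH SubDir IN DirList(Dir & '\*')
        CALL DoDir(SubDir)    // recurse into each sub-directory
    NEXT SubDir
END SUB

CALL DoDir('$(vRootDir)')
```

Dropping the DirList loop gives the behaviour of 1); keeping it one level deep, without the recursive CALL, gives 2).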
B) The 'GFA Utilisation Reporting Summary.xls' holds data such as Service, Class and Role on the Servers I am collecting Perfmon logs from and is used by the 'Perfmon_Utilisation_Reporting.qvw'.
C) I use the 'AVGHOUR_Perfmon_processing.qvw' to average older data over the hour. I get it to run on the 2nd of the month and process all the QVD's made by my 'Perfmon_processing' qvw's that are from 3 months previous.
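The hourly averaging can be sketched as a two-pass resident load (again, the table and field names are my assumptions, not the actual script):

```
// Illustrative sketch - table/field names and QVD naming are assumptions.
Tmp:
LOAD *, Timestamp(Floor(LogTime, 1/24)) as LogHour   // floor each timestamp to the hour
RESIDENT Detail;

AvgHour:
LOAD ServerName, MainMetric, SubMetric, LogHour,
     Avg(Value) as Value
RESIDENT Tmp
GROUP BY ServerName, MainMetric, SubMetric, LogHour;

DROP TABLE Tmp;

// 'YYYYMM' here is a placeholder for the month being condensed
STORE AvgHour INTO [AVGHOUR_Perfmon_YYYYMM.qvd] (qvd);
```

With 5-minute samples this replaces twelve rows per counter with one, which is why the older months stay manageable.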
D) The 'Perfmon_Utilisation_Reporting.qvw' is the front end application for displaying the processed QVD's.
It loads the processed QVD's for the current month and the previous 2 full months, plus the Average Hour QVD's for all older months. It also pulls information from the 'GFA Utilisation Reporting Summary.xls', which it links to the performance data by Server Name.
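The month-window loading can be sketched with a simple loop over QVD names; the variable names, QVD naming and sheet name are my assumptions, not necessarily what the qvw uses:

```
// Illustrative sketch - variable names and QVD/sheet naming are assumptions.
FOR i = 0 TO 2                       // current month plus the two previous
    LET vMonth = Date(AddMonths(Today(), -$(i)), 'YYYYMM');
    Perfmon:
    LOAD * FROM [Perfmon_$(vMonth).qvd] (qvd);
NEXT i

// Server attributes joined onto the performance data by Server Name:
LEFT JOIN (Perfmon)
LOAD ServerName, Service, Class, Role
FROM [GFA Utilisation Reporting Summary.xls] (biff, embedded labels, table is [Sheet1$]);
```

The AVGHOUR QVD's for older months would be appended with a similar loop running further back.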
n.b. The 'Main' sheet is the front end; the others are for my use and normally hidden from other users. I have left them in here in case the work-in-progress is of some use to others; there are even some charts I was playing with for data extracted from IIS logs.
E) The archive 'Sample Logs.rar' contains some sample Perfmon logs for different types of server (i.e. Web Server, SQL Server, etc.).
I used the below Perfmon Counters as the base building block for any server: