Qlik Community

QlikView Documents


Processing Windows Perfmon Logs using QlikView

shane_spencer
Valued Contributor


When I started working on performance management for a large number of Windows Servers, I was a little surprised to find that there is no "off the shelf" solution out there to process and view the Windows Perfmon (Performance Monitor) logs. In our organisation we had a few in-house solutions, mainly Excel based, that allowed you to view the logs for a server over a short period (i.e. a day), but I needed a more robust and flexible solution. So, as a stranger to QlikView, I started to put together a solution (with a little help from the community here when I got stuck) and have decided to share it in the hope that anyone else with similar requirements can benefit. Some knowledge of QlikView, or the willingness to learn a bit, is needed.

As a prerequisite I try to ensure that all the Perfmon Logs I collect are of a uniform nature:

  •      Follow a naming convention: servername_YYYYMMDD
  •      Are saved as 'csv' files
  •      Collect metrics at 5 minute intervals*
  •      Run from midnight (00:00:01) to midnight (23:55:01)

*After some consideration I decided that 5 minute intervals were about the right frequency.

n.b. The flexibility of QlikView allows me to collect different Metrics (aka Counters) depending on the server, though I do use a basic building block of Counters (see appendix).

Attachments:

A) The archive 'Perfmon Processing.rar' contains the 4 types of QVW that I use for processing the Perfmon 'csv' logs:

The four QVWs in the archive essentially do the same thing: they process the raw data (the Perfmon csv's) and save it as a QVD. Each runs just after midnight and appends the previous day's logs to the current month's "part processed" QVD. It then further processes the data to split the Server Name, the Main Metric (i.e. Processor, Memory, etc.) and the Sub Metrics (i.e. Queue Length, etc.) out of the column headings, filters out any junk, and saves the result to a QVD in a standard format that can be used for MI. The main difference between the 4 examples is where the Perfmon log files are located in relation to the defined (root) directory.
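As a rough sketch of that processing step, the unpivot-and-split logic looks something like the QlikView script below. The variable and field names are illustrative, not the exact ones in the attached QVWs, and the timestamp format varies between collectors:

```
// Unpivot the Perfmon csv: column 1 is the timestamp, every other column
// heading is a counter such as \\SERVER01\Processor(_Total)\% Processor Time
CrossTable(Counter, Value, 1)
PerfRaw:
LOAD * FROM [$(vRootDir)\*.csv]
(txt, codepage is 1252, embedded labels, delimiter is ',', msq);

// The timestamp column keeps its PDH header; the exact name depends on the
// collector's time zone, so adjust this RENAME to match your own logs
RENAME FIELD [(PDH-CSV 4.0) (GMT Standard Time)(0)] TO LogTime;

// Split the counter heading on '\'. The two leading backslashes produce two
// empty fields, so the server name is field 3, main metric 4, sub metric 5
Perf:
LOAD SubField(Counter, '\', 3) as ServerName,
     SubField(Counter, '\', 4) as MainMetric,
     SubField(Counter, '\', 5) as SubMetric,
     Timestamp#(LogTime, 'MM/DD/YYYY hh:mm:ss.fff') as LogTime,
     Num(Value) as Value
RESIDENT PerfRaw
WHERE Len(Trim(Value)) > 0;   // drop empty / junk samples

DROP TABLE PerfRaw;
STORE Perf INTO [$(vRootDir)\Perfmon_processed.qvd] (qvd);
```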

1) Perfmon_processing.qvw

This will search for csv files in the defined (root) directory only.
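In script terms this is just a wildcard in the file mask, along these lines (vRootDir is an assumed variable name):

```
// Every csv directly in the root directory; no recursion into sub-directories
Perf:
LOAD * FROM [$(vRootDir)\*.csv]
(txt, codepage is 1252, embedded labels, delimiter is ',', msq);
```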

    

2) Perfmon_processing (with directories).qvw

This will search all the directories in the defined (root) directory for csv files
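A one-level directory scan can be written with a DirList loop, roughly as follows (again, vRootDir is assumed):

```
// Visit each first-level directory under the root, but go no deeper
FOR EACH vDir IN DirList('$(vRootDir)\*')
  Perf:
  LOAD * FROM [$(vDir)\*.csv]
  (txt, codepage is 1252, embedded labels, delimiter is ',', msq);
NEXT vDir
```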

    

3) Perfmon_processing (with sub-directories).qvw

This uses a sub-routine to search all directories AND sub-directories in the defined (root) directory for csv files
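That sub-routine is the classic recursive FileList/DirList pattern, which in outline looks like this (not necessarily the exact script in the attached QVW):

```
SUB DoDir(Root)
  // Load the csv files at this level...
  FOR EACH vFile IN FileList(Root & '\*.csv')
    Perf:
    LOAD * FROM [$(vFile)]
    (txt, codepage is 1252, embedded labels, delimiter is ',', msq);
  NEXT vFile
  // ...then descend into every sub-directory and repeat
  FOR EACH vDir IN DirList(Root & '\*')
    CALL DoDir(vDir)
  NEXT vDir
END SUB

CALL DoDir('$(vRootDir)')
```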

4) Perfmon_processing (with apply-map).qvw

This is the same as 1) except that it has been modified because DNS names were used in collecting the metrics instead of server names, so I used a mapping load to replace them. The date stamp on those Perfmon log files was also in a different format, so I changed the script to accommodate that.
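A mapping load plus ApplyMap is enough for the DNS-to-server-name swap; the spreadsheet, table and field names below are illustrative:

```
// Two-column lookup: DNS name as collected -> real server name
DnsToServer:
MAPPING LOAD DnsName, ServerName
FROM [ServerNames.xls] (biff, embedded labels, table is [Sheet1$]);

// Same header split as before, but run the server field through the map;
// names with no match in the map fall through unchanged
Perf:
LOAD ApplyMap('DnsToServer', SubField(Counter, '\', 3)) as ServerName,
     SubField(Counter, '\', 4) as MainMetric,
     SubField(Counter, '\', 5) as SubMetric,
     Value
RESIDENT PerfRaw;
```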

B) The 'GFA Utilisation Reporting Summary.xls' holds data such as Service, Class and Role for the Servers I am collecting Perfmon logs from, and is used by the 'Perfmon_Utilisation_Reporting.qvw'.

C) I use the 'AVGHOUR_Perfmon_processing.qvw' to average old data over the hour. It runs on the 2nd of the month and processes all the QVDs made by my 'Perfmon_processing' QVWs that are from 3 months previous.
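The averaging itself amounts to a GROUP BY with the timestamp floored to the hour, something like the sketch below (field and file names assumed, following the earlier sketches):

```
// Collapse 5-minute samples into one averaged row per hour
AvgHour:
LOAD ServerName,
     MainMetric,
     SubMetric,
     Timestamp(Floor(LogTime, 1/24)) as LogTime,   // 1/24 of a day = 1 hour
     Avg(Value) as Value
FROM [Perfmon_$(vMonth).qvd] (qvd)
GROUP BY ServerName, MainMetric, SubMetric, Floor(LogTime, 1/24);

STORE AvgHour INTO [AVGHOUR_Perfmon_$(vMonth).qvd] (qvd);
```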

D) The 'Perfmon_Utilisation_Reporting.qvw' is the front end application for displaying the processed QVD's.

It loads Processed QVDs for the current month and the previous 2 full months, plus Average Hour QVDs for all older months. It also pulls info from the 'GFA Utilisation Reporting Summary.xls', which it links to the performance data by Server Name.
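Assuming the monthly QVDs follow a name like Perfmon_YYYYMM.qvd (an illustrative convention, not necessarily the one in the attachments), the rolling three-month load can be driven by a small loop:

```
// Detailed QVDs for the current month and the previous 2 full months;
// tables with identical structure auto-concatenate into one Perf table
FOR i = 0 TO 2
  LET vMonth = Date(AddMonths(MonthStart(Today()), -$(i)), 'YYYYMM');
  IF FileSize('Perfmon_$(vMonth).qvd') > 0 THEN   // skip months not yet built
    Perf:
    LOAD * FROM [Perfmon_$(vMonth).qvd] (qvd);
  END IF
NEXT i
```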

n.b. The 'Main' sheet is the front end; the others are for my own use and normally hidden from other users. I have left them in here in case the work in progress is of some use to others; there are even some charts I was playing with for data extracted from IIS logs.

(Attached screenshot: show-sheet.JPG)

E) The archive 'Sample Logs.rar' contains some sample Perfmon logs for different types of server (e.g. Web Server, SQL Server).

Appendix:

I used the Perfmon Counters below as the base building block for any server:

\Memory\% Committed Bytes In Use
\Memory\Available MBytes
\Memory\Page Faults/sec
\Memory\Pages/sec
\Memory\Pool Nonpaged Bytes
\Network Interface(*)\Bytes Total/sec
\Network Interface(*)\Current Bandwidth
\PhysicalDisk(_Total)\% Disk Time
\PhysicalDisk(_Total)\Avg. Disk Bytes/Transfer
\PhysicalDisk(_Total)\Avg. Disk Queue Length
\Processor(_Total)\% Interrupt Time
\Processor(_Total)\% Processor Time
\Processor(_Total)\Interrupts/sec
\System\Context Switches/sec
\System\Processor Queue Length

Comments
Luminary

Good stuff. Thanks for sharing.

jeremy_fourman
Contributor

We made use of this as a starting point for performance analysis of a distributed installation of Publisher servers. Thanks for sharing.

shane_spencer
Valued Contributor

I'm glad you found it useful!

kalyandg
Contributor II

Really nice stuff.

carbal1952
Contributor II

Very good material. Useful & required all the time by users. Thanks a lot. CB.

shane_spencer
Valued Contributor

I've created an in-depth document on How to set up Windows Performance Monitor For QlikView Servers to complement this post.

Not applicable

solid gold.

thank u dude

ysj
Contributor

Could you please help me?

Version history: Revision 1 of 1. Last update: ‎08-24-2012 06:09 AM.