I have successfully built an application that reads and parses Microsoft IIS log files. My question is how to optimize performance when loading new data multiple times a day.
The current application reads the new log transactions that do not yet exist in the stored log, concatenates them onto the existing data, and saves a new version of the log file. However, reloading the current log file, concatenating the new transactions, and saving the result takes a long time, because we have many servers and a large volume of log transactions.
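To make the process concrete, this is roughly what our reload does today, written as QlikView's standard QVD incremental-load pattern. All names here (TransactionID, LogTimestamp, IISLog.qvd, vLastExec, the log path) are illustrative, not our real ones:

```
// Assumed names: TransactionID (unique key), LogTimestamp (event time),
// IISLog.qvd (stored history), vLastExec (timestamp of the last reload).

// 1. Parse only transactions newer than the previous reload.
IISLog:
LOAD TransactionID,
     LogTimestamp
     // ...remaining parsed fields...
FROM [Logs\*.log] (txt)
WHERE LogTimestamp > '$(vLastExec)';

// 2. Append the stored history without re-parsing the raw files.
//    NOT Exists() guards against loading the same key twice
//    (note: it also turns this into a non-optimized QVD load).
Concatenate (IISLog)
LOAD *
FROM [IISLog.qvd] (qvd)
WHERE NOT Exists(TransactionID);

// 3. Persist the combined table for the next run.
STORE IISLog INTO [IISLog.qvd] (qvd);
LET vLastExec = Now();
```

The expensive part for us is steps 2 and 3, since the full history is read and rewritten on every refresh.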
I have just started reading about dynamic updates, but I am not sure that this is a good direction to pursue.
Does anyone have any practical experience with refreshing a log file dashboard multiple times during the day? Our VP has even mentioned an hourly refresh.
A second question involves scripting a rebuild of the entire log file from scratch, in the event of "corruption" of some sort. Is it better to put this in a completely separate QVW, or to combine it into a single script controlled by parameters?
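For the single-script option, I am picturing something like the sketch below, where a variable set by the reload task chooses between a full rebuild and the normal incremental run, so the parsing logic is not duplicated. The vFullRebuild variable and all other names are hypothetical:

```
// vFullRebuild = 1  => re-parse every raw log file from scratch;
// anything else     => normal incremental run against the stored QVD.
IF '$(vFullRebuild)' = '1' THEN

    IISLog:
    LOAD *
    FROM [Logs\*.log] (txt);          // full history from the raw files

ELSE

    IISLog:
    LOAD *
    FROM [Logs\*.log] (txt)
    WHERE LogTimestamp > '$(vLastExec)';

    Concatenate (IISLog)
    LOAD * FROM [IISLog.qvd] (qvd);   // previously stored history

END IF

STORE IISLog INTO [IISLog.qvd] (qvd);
```

Is this a reasonable direction, or do people prefer keeping the rebuild QVW separate so a bad parameter can't accidentally wipe the history?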