I'm pretty new to using QlikView and I don't know the proper terminology for everything, so please correct me if I'm talking nonsense. Also please excuse me if I'm just not making sense in general. It's finals week and I'm running on two hours of sleep and half a Red Bull.
So I'm the intern here at work, and I was told that we need to start keeping metrics on an automated process. I can't change the existing code to add logging, but I can harvest data from the files in the output folders for these processes. I know I can generate a .csv easily with a Perl script, but I'd like to keep everything, from data generation to display, in one place.
So I've got three main questions:
1. Can a load script read file metadata such as creation date and time? Basically, can it generate fields like "File name", "Date", etc.?
2. Can a load script run through each file in a folder and pull information based on XML tags? I would also need fields based on "Author", "Department", etc.
3. Can I set a button to re-run the load script?
3a. Does Actions -> External -> Reload do what I think it does?
If all that is possible, how slow would it be? I need this to be relatively quick. Though if I can't do the above, or if it's too slow, then I suppose I could have the sheet open a URL to a Perl script, or launch an external application to generate a file, which I could then load. I'm just not sure which would be more efficient.
Thanks for any help.
1. Have a look at the file functions in the help, like FileName(), FileTime().
2. You can run through all files in a folder using a FOR EACH vFile in FILELIST(...) ... NEXT loop. There is a more complete example in the help file; search for FOR EACH. See also the sketch at the end of this reply.
3. Yes.
Not sure if this will be fast enough. How fast is fast enough?
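Here is a minimal sketch of points 1 and 2 combined; the folder path, the XML tag names (Author, Department) and the XmlSimple table name are assumptions you will need to adapt to your files:

FOR EACH vFile in FileList('C:\Output\*.xml')

    Files:
    LOAD
        FileName() as [File name],    // file functions, point 1
        FileTime() as [Date],
        Author,                       // XML tags, point 2 (names assumed)
        Department
    FROM ['$(vFile)'] (XmlSimple, Table is [record]);

NEXT vFile

Since the field names are identical on every pass, the rows from each file are automatically concatenated into one Files table.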
These processes fire off anywhere from five per minute down to one every 30 minutes. The person needing the metrics asked for "as close to real-time monitoring as possible." I can write the Perl script to only update a data file with changes since the last time it was run. Would that be possible with QlikView, or would it be running through all 20000 files every time I refreshed the data?
You can run external scripts using the EXECUTE statement, if that was your question; see the example below.
You can also exclude files from the list to read in.
Or is your question about something else?
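For example, something like this (the script and file paths are assumptions; EXECUTE also requires "Can Execute External Programs" to be ticked on the Settings tab of the script editor):

// Run the external Perl script first
EXECUTE cmd.exe /C perl C:\Scripts\harvest.pl;

// ...then read whatever it produced
Metrics:
LOAD * FROM [C:\Scripts\metrics.csv]
(txt, utf8, embedded labels, delimiter is ',');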
Perl is a scripting language separate from QlikView, and it's very good at reading and parsing text files. Unless QlikView has Perl capability built in, I would have to execute it as a separate process.
I would need to exclude the files from the list based on their creation time and the last time the data was updated. I've got five folders with about 45000 .xml files that I would need to read. If QlikView has to go through all of them every time I want to update my data, then I'll have to find a different way to process that.
I know the Perl scripting language.
But I'm still unsure what you want to achieve.
If you can write a Perl script that creates a file with either the data or the file names to read in, that would be great. You can then read this file from QV and proceed with your data processing.
As said, you can execute external scripts from within the QV script using the EXECUTE statement, if this is needed. Alternatively, the exclusion by file time can be done in the QV script itself; see the sketch below.
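A rough sketch of the in-script variant, assuming the harvested rows are kept in a QVD between reloads (all paths and tag names are assumptions):

// Timestamp of the previous harvest; 0 if the QVD does not exist yet
LET vLastReload = Alt(Num(FileTime('C:\QV\Harvested.qvd')), 0);

// Start from the rows harvested so far, if any
IF $(vLastReload) > 0 THEN
    Harvested:
    LOAD * FROM [C:\QV\Harvested.qvd] (qvd);
END IF

// Only read files that are newer than the previous harvest
FOR EACH vFile in FileList('C:\Output\*.xml')
    IF Num(FileTime('$(vFile)')) > $(vLastReload) THEN
        Harvested:
        LOAD
            FileName() as [File name],
            FileTime() as [Date],
            Author,            // tag names are assumptions
            Department
        FROM ['$(vFile)'] (XmlSimple, Table is [record]);
    END IF
NEXT vFile

STORE Harvested INTO [C:\QV\Harvested.qvd] (qvd);

The first run reads everything once; after that each reload only opens the files that appeared since the last one. Note that the FOR EACH still checks every file's timestamp, so with 45000 files the directory scan itself is the remaining cost.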
Perl is a bit of an old-fashioned way to track changes to XML files. Not because it is old-fashioned in itself (it isn't), but because it lacks tight integration with OSes like Windows, especially the recent ones.
To do what you want to do, and do it on large sets of files and directories, use PowerShell and the .NET FileSystemWatcher to monitor NTFS for changes to specific files. Your PowerShell script can store the auditing results in a file or a DB (better for concurrent access). Let QlikView then just read a list of file names, last modification dates, authors and other attributes.
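On the QlikView side the load then shrinks to reading that audit file; the log path and column names below are assumptions about what the watcher writes:

// The PowerShell watcher is assumed to append one row per new file
Audit:
LOAD
    FileName,
    Created,
    Author
FROM [C:\Monitor\audit.csv]
(txt, utf8, embedded labels, delimiter is ',');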
Peter
Sorry if I wasn't clear. I'm not monitoring changes to files; I'm monitoring for additional files. These processes are passed an input file, then write changes to a different system based on the contents of that file. Then they deposit the input file into an output folder. I need to harvest data from that output folder. From the metadata I need the creation date and filename. From within the input file I will need various parts of the data identified by that file, such as badge number, part number, etc.
If QlikView can handle all that neatly and quickly, then great. If it can't, then I'll need a workaround. Will PowerShell work for what I've described? I've never used it before. I was just planning on using Perl because it would be easy to set up a regex/grep/whatever to get everything I need.
What you're monitoring in reality doesn't really matter.
The point I'm trying to make is that your monitoring process will become much more efficient if something wakes up your tool whenever a new output file arrives (a change to the directory). That's what you can use the FileSystemWatcher for.
The alternative solution would be to run a (Perl/QlikView) script at regular intervals that goes looking for changes to the contents of one or more directories, and it may find none, or a lot.
Maybe you could do this in QlikView script ("near real-time" is a killer here), but it won't be too good-looking if you ask me...
Peter