I'm currently working in a messy, immature, but also large environment with a wide range of projects, many tools, and no consistent approach. So there are plenty of opportunities to add value, which is great. The other good thing is that at least some of the projects are trying to use the QDF, which is nice to see, particularly as we're using Qlik Sense to do some heavy-duty data transformation.

In that context, one of my focuses is to bring some visibility to what is happening in the ETL process, as there's a lot of confusion among ops staff, testers, and business analysts about what is going on. In particular, I'm building a few things into the process to give testers visibility so they can spend less time simply validating the data. As you are no doubt aware, Qlik Sense lacks much of the functionality of traditional ETL tools, so one of the things I've done is extend the QDF with some basic ETL logging that tracks how many records have been loaded into each QVD at any point in time. This basic kind of logging was part of the discipline I learned as an ETL developer using other products many aeons ago. I then combine the logging with data profiling info and source-to-target mapping details in an app that gives testers and business analysts an end-to-end view of the ETL process.
So, my question is, are others doing anything like this? Any lessons learned or comments to share?
I'm attaching the custom subs I've written to enable the logging, in case they're useful and you can improve upon them. There are two subs: one that simply creates a log entry, and one that writes the log out to a file. I include them in a Custom folder, like so: "99.Shared_Folders\3.Include\4.Sub\Custom".
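For anyone who can't open the attachment, here is a simplified sketch of the shape of the two subs. The attached versions capture more detail; the field names and the assumption that the subs take the table name and QVD path as parameters are just how I've illustrated it here:

```qlik
Sub ETLLogEntry(vTableName, vQVDPath)
    // Append one row to an in-memory log table. Because each call
    // loads exactly the same set of fields, Qlik auto-concatenates
    // the rows into the single ETL_Log table.
    ETL_Log:
    Load
        '$(vTableName)'            as LogTableName,
        '$(vQVDPath)'              as LogQVDPath,
        NoOfRows('$(vTableName)')  as LogRecordCount,
        Timestamp(Now())           as LogTimestamp
    AutoGenerate 1;
End Sub

Sub WriteETLLog(vLogPath)
    // Persist the accumulated log and drop it from the data model.
    // NoOfRows() returns null for a missing table, so the If is
    // skipped when no entries were logged.
    If NoOfRows('ETL_Log') > 0 Then
        Store ETL_Log Into [$(vLogPath)] (qvd);
        Drop Table ETL_Log;
    End If
End Sub
```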
The way I use it is to call ETLLogEntry whenever I store a QVD, and then WriteETLLog once at the end of each script. This makes it pretty simple to use, and it imposes very little on the developer.
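To make that concrete, the calling pattern looks roughly like this (paths and table names are illustrative, and I'm assuming here that the subs take the table name and target path as parameters):

```qlik
// After each Store of a QVD:
Store Customers Into [lib://QVD/2.Transform/Customers.qvd] (qvd);
Call ETLLogEntry('Customers', 'lib://QVD/2.Transform/Customers.qvd');

// Once, at the end of the script:
Call WriteETLLog('lib://QVD/Log/ETL_Log.qvd');
```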
Let me know your thoughts!