I have inherited someone else's application: large amounts of data and a less-than-optimal data model. Any tips for more quickly spotting issues that would speed up a load that currently takes 25+ minutes? For example, while I know their definitions, would looking at INFORMATION DENSITY and/or SUBSET RATIO reveal good opportunities for improvement? What else do you look at when troubleshooting? Rewriting the app/model is not an option at this point.
So what guiding principles does the community use in situations such as this? Thanks in advance for any and all constructive comments, suggestions, links, etc.
You can start with a log analyzer (thanks to rwunderlich) to identify the statements where the script is spending the most time.
Hello,
Did you check/validate the links between the tables? If you want to run some tests on the keys, maybe you'll find the QvM - Key Check useful. I created it to easily validate my data model links.
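As a rough illustration of the kind of key check meant here (the table and field names Orders, OrderLines, and OrderID are hypothetical), you can compare key cardinality on both sides of a link to spot mismatches:

```
// Count distinct key values on each side of the Orders/OrderLines link.
// A large gap between the two counts can point to orphaned keys.
KeyStats:
LOAD Count(DISTINCT OrderID) as OrdersKeys
RESIDENT Orders;

JOIN (KeyStats)
LOAD Count(DISTINCT OrderID) as OrderLinesKeys
RESIDENT OrderLines;
```

Both loads produce a single row, so the JOIN simply combines them into one summary row you can inspect in a table box.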
Have fun Qlik'n, Koen
Check whether all QVDs are loaded QVD-optimized. If they are not, make them optimized where possible by removing WHERE clauses from the QVD load and pushing the filtering to a resident load instead. And always load the biggest tables first.
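A minimal sketch of that pattern (the file path, table, and field names are hypothetical). The first load has no transformations or WHERE clause, so it can stay QVD-optimized; the filter is applied afterwards on a resident load:

```
// Optimized QVD load: no expressions, renames, or WHERE clause,
// so the QVD can be streamed straight into memory.
Sales_tmp:
LOAD *
FROM Sales.qvd (qvd);

// Apply the filter on a resident load, then drop the temp table.
Sales:
NoConcatenate
LOAD *
RESIDENT Sales_tmp
WHERE SalesYear >= 2016;

DROP TABLE Sales_tmp;
```

One exception worth knowing: a simple `WHERE EXISTS(Field)` on a single field generally keeps the QVD load optimized, so that form does not need to be pushed to a resident load.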
Thanks to all. Some great resources to dig into and educate myself more. An amazing community!
I was able to, painstakingly, find two LOADs of the same 19-million-row table that was never used in the application. Removing those alone cut the load time from 25+ minutes to 10 minutes.
Great to hear that you find it useful. Please mark it as the correct answer or helpful and close this discussion. It will help others.
-Siva