I was wondering if someone could help me with an ongoing issue that we are experiencing with our QlikView application.
We currently have an application with over 1 billion rows of data, and the data is growing exponentially every quarter. For example, Q2 2014 alone contained as many transactions as were made in all of 2010.
We have been noticing performance issues, and the application has become more and more unstable as the quarters pass.
As the business has become accustomed to a certain window of available data, I don't believe reducing the data is an option.
I believe the key to solving this issue is to change our data model, but I've already tried to optimize it as much as possible (removed unwanted columns, built fact tables, etc.). Could anyone tell me whether there are further ways to optimize this data model so that application performance remains stable for end users?
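For context, the kinds of script-side optimizations I've already applied look roughly like this (a sketch with hypothetical field and file names, not our actual script):

```qlikview
// Load only the fields the front end actually uses, via a preceding load.
// The upper LOAD omits TransactionDate itself and replaces it with
// low-cardinality derived fields, which shrinks the symbol tables.
Facts:
LOAD TransactionID,
     CustomerID,
     Amount,
     Year(TransactionDate)  AS TransYear,
     Month(TransactionDate) AS TransMonth;
LOAD *
FROM [Facts.qvd] (qvd);   // note: the transformation above means this
                          // is no longer an "optimized" QVD load
```

That trade-off (losing the optimized QVD load in exchange for dropping a wide, high-cardinality timestamp field) was worth it in our case, but I'd welcome suggestions for anything beyond this.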
Attached is the application's data model overview. Any help with this would be greatly appreciated.
No, it wouldn't be a virtual memory issue. When it does happen, all we get are error messages
such as "Server aborted trying to recover by restart".
The "Working Set..." messages sometimes precede the restart. It's not really about virtual memory per se; it's that the available RAM can get filled by multiple documents, user sessions, and cache. Those messages indicate that QVS is having to manage the RAM.
"The one issue I would have is with Document Chaining: wouldn't there be an extra license cost for every user who uses this application and accesses the chained document?"
The chained document does require a license, but that's only an issue if you are using Document or Usage CALs.
Expressions that reference fields from two disconnected tables generally cause a temporary table of the cartesian product to be built. In addition, data islands frequently have a negative impact on cache. See
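To get a feel for the scale of that temporary table (a back-of-the-envelope sketch in Python, with made-up table sizes rather than anything from your model):

```python
# Why expressions spanning two disconnected tables are costly:
# QlikView must evaluate the expression over the cartesian product,
# i.e. every row of one table paired with every row of the other.
fact_rows = 1_000_000    # hypothetical fact table size
island_rows = 10_000     # hypothetical data-island table size

cartesian_rows = fact_rows * island_rows
print(cartesian_rows)    # 10,000,000,000 row combinations from two modest tables
```

So even tables far smaller than a billion-row fact table can produce an enormous temporary structure the moment an expression links them without a key.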