Not applicable

QVS 11 performance and reliability

We're creating QlikView reports for a client that handles a large volume of phone calls.  They need both aggregate and detailed information about their callers and how the calls were handled.  We're having performance problems (timeouts, rendering delays, session disconnects, etc.), and we're getting mixed messages about whether QlikView is intended to be used the way we're using it.

At the detailed level, a table box displays individual call records, with a calculation condition that hides the table when the selection exceeds a 1,000-row limit.  A QlikView Extension Object on the same sheet as the table box is used to select and play recordings of individual calls, using data from the fact table to fetch and play audio from an external server.
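For context, the calculation condition is just an expression on the table box's properties; it would look something like the following (the field name CallID is an assumption, the post doesn't show the actual expression):

```
// Calculation condition on the table box (field name is hypothetical).
// The table only renders when the current selection yields 1000 or fewer calls.
Count(DISTINCT CallID) <= 1000
```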

My question is whether QlikView should be able to do this type of detailed reporting with reasonable performance and reliability, if properly sized and configured.  Or are we trying to do something it's not really intended for, either functionally or because the data size is simply too big for the product to handle on any hardware?

More details:

  • The data is in a star schema loaded from a datamart
  • The fact table contains about 50 million rows, 50 columns wide, no synthetic keys
  • Data is refreshed with hourly incremental updates, preload=On, application qvw file is ~6GB
  • Software: QV v11.00/SR2/64b (build 11440), w2k8/SR2 enterprise
  • Hardware: QVS=32 cores, 320G ram; QVP=24 cores, 128G ram
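For reference, a minimal sketch of what an hourly incremental reload typically looks like in QlikView script; all table, field, and variable names here are assumptions, not from the post:

```
// Hypothetical incremental load: fetch only rows changed since the last run,
// then append the unchanged history from the prior QVD.
IncrementalCalls:
SQL SELECT * FROM dm.call_facts
WHERE modified_time >= '$(vLastReloadTime)';

CONCATENATE (IncrementalCalls)
LOAD * FROM CallFacts.qvd (qvd)
WHERE NOT Exists(CallID);          // keep only rows not already reloaded

STORE IncrementalCalls INTO CallFacts.qvd (qvd);
```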

Some of the timeout problems occur consistently with only a single user on the system.  The system generally performs well at smaller data sizes, up through several million rows, but not at the 50M-row size.  When the problem occurs, server memory/CPU utilization is stable, with no significant changes.  Users also intermittently experience session disconnects after being idle 10-15 minutes (max inactive session time = 30 minutes).

Our QV servers are hosted on dedicated VMs (no other VMs per host) due to operational constraints, but we don't have any evidence that the VM environment is causing the problems, and we've seen similar symptoms on non-VM dev servers.

Any insight appreciated!

6 Replies
hallquist_nate
Partner - Creator III

Not sure if you have had any insights yet, but I do have a question.  Are you actually loading and playing the audio file of the phone call recording through QlikView?  That might have something to do with it.  Your server looks beefy enough.

Let me know....

Not applicable
Author

Do you still get timeouts if you disable the extension object?

Not applicable
Author

Nate H: The extension object invokes a lightweight ASP application on the same QVS box, which proxies individual audio files to provide authentication and access to our backend audio server.  The timeouts occur when initially rendering the detail page, when there's no audio playback activity.  Also, compared to overall QV activity, relatively few users will be playing audio at the same time, so the overhead for that should be light.

Michael F: We had the timeout issues earlier on, even without the extension object.  I'm re-confirming against the latest release and will send an update.

Thanks!

Not applicable
Author

Update on this: I retested with the extension object disabled, and the detail table responds well with the 1,000-row calculation condition limit.  So it seems to be an extension object issue.

Not applicable
Author

So it turns out our performance issues were not due to the extension object after all.  They were caused by the underlying report data model implementation, which was significantly impacting memory/CPU/disk utilization, and then exacerbated by the extension object.  The causes of our performance issues included:

  • A number of dimension tables that were needlessly mapped into the large fact table with ApplyMap()
  • Poorly defined variables that took large amounts of memory/time to (re)calculate
  • Mapping functions that didn't work correctly, causing bad associations across the full fact table
  • Several other performance-impacting issues
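To illustrate the first bullet, here's a sketch of the ApplyMap() anti-pattern versus keeping the star schema; all table and field names are hypothetical:

```
// Anti-pattern: denormalizing a dimension into the 50M-row fact table.
// Every mapped column is materialized once per fact row, inflating RAM and load time.
AgentMap:
MAPPING LOAD AgentID, AgentName FROM Agents.qvd (qvd);

Facts:
LOAD *,
     ApplyMap('AgentMap', AgentID, 'Unknown') AS AgentName
FROM CallFacts.qvd (qvd);

// Usually better: load the dimension as its own table and let QlikView
// associate on AgentID, so each value is stored once rather than per fact row.
Agents:
LOAD AgentID, AgentName, Team FROM Agents.qvd (qvd);
```

ApplyMap() itself is fine for a small, single-column lookup; the issue described in this thread is mapping many dimension tables into the fact table wholesale.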

Our fixes are still in progress, but preliminary results show that with these issues corrected, the report and extension object perform well, even on a VM.  Memory/CPU/disk usage are all way down, so that reports are quicker and we'll be able to scale better.  We can now run the extension object against a table box with many more rows, with reasonable performance.

It was relatively quick and easy to get the report functionally developed from scratch, but it took quite a bit of dedicated performance engineering work to identify and undo the bottlenecks.  That's not uncommon with any product when performance is an issue.  But specifically for QlikView, while there's plenty of training and material on functionality, there seems to be a lack of:

  • Tools/monitors/alerts to measure performance and resource utilization: You can measure overall server performance, and you can get information about a static document's resources (e.g. Document Analyzer), but you can't get resource information about specific documents or users in use at a given time.  One poorly written construct in a document can impact the entire server, and it's difficult to determine which document, or which user, is causing the issue.
  • People adequately qualified to do performance engineering: We found many people with broad QlikView experience (including performance work), but few with the right performance engineering expertise to solve our issue.  I don't know if this is just luck, or a lack of QlikView performance-oriented documentation, training, best practices, or certifications.

So, on the plus side, going back to my original question: it does seem that QlikView can support our detailed reporting requirements with reasonable performance.  We're not trying to do anything it wasn't meant for.  It was just more trouble than we expected to get there, but hopefully better tools and expertise will become available over time.

Not applicable
Author

Interesting results.  Thanks for the update.

I do find that this is a common side effect of having such an easy-to-use tool, or any "self-service" BI product for that matter.  Most of the people developing solutions do not have a technical background.  I'd suggest you'd have been unlikely to hit this problem in the first place if someone experienced in BI or data architecture had designed the data model.

Quick and easy solutions often work well, but they tend not to scale well over time if usage, data or requirements increase.

As you concluded, QlikView is very capable of delivering so long as you design a good model.