Not applicable

QlikView web client data loading method and slow responsiveness

Hi all,

I'm trying to understand how the client actually works when working with a QV model that is hosted on a QV server.

I have the following scenario:

- We have a 512 GB ram QV server.

- A QV model with 300 million rows of data.

- 4 tabs with, on average, 3-4 charts and 5-10 filters each

After loading the model onto the server, I try to open it in a web browser (Internet Explorer with the IE plugin):

- when opening it on the server itself via localhost, I get quite good responsiveness

- when opening it in a web client on another computer on the same network, page loads take a long time, I see hundreds of MB of data transferred to the client, and the responsiveness is terrible.

Can somebody explain how the mechanism works? Are there any optimizations that need to be done? Should other browsers be used?

Thanks,

Boris

3 Replies
marcus_sommer

In general, you will see differences between the local machine and externally connected machines. Besides the maximum technically possible transfer rate, proxy servers (possibly cascaded), firewalls and other security measures can slow down the connection. I don't think this is uncommon, and while you should notice such effects, they rarely cause real pain.

How do you measure the hundreds of MB of transferred data? Normally no large amounts of data are transferred: the server does the calculations and only the results are sent to the client. You only get bigger data transfers if you display data in large tables - maybe with millions of rows - and if such tables are really needed, they are better created within the script and not within the GUI.
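As a minimal sketch of the "create it in the script" idea (table and field names here are assumptions, not from the original application), an aggregated table can be built once at reload time so the chart only has to display pre-computed rows:

// Hypothetical names - aggregate once in the load script so the
// GUI object only displays pre-computed rows
Summary:
LOAD
    Category,
    Year(OrderDate)  as OrderYear,
    Sum(Amount)      as TotalAmount
RESIDENT Transactions
GROUP BY Category, Year(OrderDate);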

First of all, I would check the release of the QV server (older versions could have some bugs) and enable "compress network traffic" within the QMC server settings, on the Security tab.

Beyond that, you would get better performance if you optimize your application - removing unneeded fields with the Document Analyzer (Tools | Qlikview Cookbook) and removing/splitting high-cardinality fields (The Importance Of Being Distinct) might be the first and most important steps.
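The splitting technique from "The Importance Of Being Distinct" can be sketched roughly like this (field names are assumptions): a timestamp with millions of distinct values is split into a date part and a time part, each with far fewer distinct values, which shrinks QlikView's symbol tables considerably:

// Hypothetical field names - split one high-cardinality timestamp
// into two low-cardinality fields
Transactions:
LOAD
    OrderID,
    Date(Floor(OrderTimestamp)) as OrderDate,
    Time(Frac(OrderTimestamp))  as OrderTime,
    Amount
FROM Transactions.qvd (qvd);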

- Marcus

Not applicable
Author

You mean that if I have objects such as pivot tables that hold a large number of records, it could affect the transfer?

Also, what does the "compress network traffic" setting do?



marcus_sommer

Yes, large tables can require transferring very high amounts of data. The fact that QlikView can handle and process huge amounts of data very efficiently internally does not mean that displaying millions of rows externally (in tables within the browser, or maybe even exported to Excel) is cheap: it needs a lot of RAM, and the amount of data to transfer becomes quite large.

Most often, applications are divided into three layers: first dashboards, which contain highly consolidated data (and often no tables, only a few charts and text boxes with the most important KPIs); then a report area, where the displayed data is aggregated to levels like categories and sub-categories; and only within the third, detail part is data available on transaction level, and there it is usually restricted to the selected area of interest.
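One common way to implement such a detail part (a standard QlikView technique, not something from this thread; the field name and threshold are assumptions) is a calculation condition on the transaction-level table, so it only renders once the selections have narrowed the data down:

// Calculation Condition on the detail table (chart properties,
// General tab) - hypothetical field name and threshold
Count(DISTINCT TransactionID) < 100000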

No tool can display large amounts of data without a corresponding consumption of RAM, CPU resources and traffic. If you really need that data (to process it within other systems - handling it manually then makes no sense), it's better to create it as tables within the script step.
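A minimal sketch of handing data to other systems from the script step (paths and names are assumptions): the script filters the rows once at reload time and stores them to a file that downstream systems can pick up, instead of exporting a huge GUI table to Excel:

// Hypothetical names/paths - export from the script instead of the GUI
DetailExport:
LOAD OrderID, Customer, Amount
RESIDENT Transactions
WHERE Year(OrderDate) = 2015;

STORE DetailExport INTO [..\Exports\Orders2015.csv] (txt);
DROP TABLE DetailExport;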

To "compress network traffic" I have no idea what it meant excactly but think that a (small) compression on http-protocol layer will be applied.


- Marcus