magnusbergh
Contributor

Selection/calculation runs slow (timeout) using web client, but fine in QV Desktop

We have a QV application which has problems with timeouts during calculation/selection when running the application through QVS/QVWS and the web client. We have an interface where the user can select which fields to display, including calculated aggregated columns, and the columns are displayed in a table object. This selection would result in one row being displayed in the table box (probably using 5000 rows in the underlying table for the aggregated column). When the user selects the calculated column it takes a very long time and we get a calculation error after about 2 min (probably an IE timeout).

I took the qvw file and loaded it in QlikView Desktop, and the same selection/calculation finishes in 5 seconds.

The qvw file is 1.8 GB, has 202 million rows, and takes about 6 GB when loaded.

I notice we get the same problem in our test environment, with different and smaller data (1.1 GB). Our test QVS/QVWS server has 32 GB memory. When executing the calculation I notice how memory usage for QVS increases from about 5 GB until it takes all available memory (and never finishes before the timeout).

When using QV Desktop, memory usage is about 6 GB the whole time (more, of course, when displaying more data in the UI).

What can cause this? I understand that running through QVS/QVWS and the web client is slower, but not like this. Also, the application worked fine some time ago with the same amount of data.

1 Solution

Accepted Solutions
marcus_sommer

In general, sheets/objects/dimensions/expressions that are not active/visible won't be calculated. But there are some exceptions, especially if calculated dimensions are used within dimension groups and/or variables - so the invalid dimensions you mentioned may cause such an impact. I would remove them - you may store them externally or separately within the object comment or similar, noting when, what and why was adjusted, to keep track of it.

- Marcus


9 Replies
magnusbergh
Contributor
Author

Note: running fine in QV Desktop is when I open a local qvw file. If I choose to open the document on the QV server, I get an "Allocated memory exceeded" error when running the calculation.

marcus_sommer

It sounds like there are some Cartesian calculations which take all the RAM - possible causes may be:

  • different releases between the clients and also in regard to the server
  • section access may impact the dataset
  • variable values and/or selections/locking of fields may differ in some way
  • shared files (remove them after a backup)
  • are you really sure it's the same application and that the dataset/data model is identical?
  • probably some more

Besides this, I suggest checking whether the application could be optimized - the data model should be a star schema and heavy calculations should be done within the script, or at least be pre-calculated, so that there aren't aggr() constructs, (nested) if-loops or inter-record functions within the UI.
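
As a minimal sketch of what such a pre-calculation could look like in the script (the table and field names here are only assumptions, not taken from your application):

// Hypothetical detail data - in reality this would be your fact table.
Fact:
LOAD * INLINE [
EnhetID, Värde
101, 10
101, 20
102, 30
];

// Pre-aggregate in the script so the chart only needs to sum a few prepared rows
// instead of aggregating millions of detail records on every screen update.
FactAggregated:
LOAD
    EnhetID,
    Sum(Värde) AS SumVärde
RESIDENT Fact
GROUP BY EnhetID;

The UI expression could then work against SumVärde instead of the raw detail rows.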

- Marcus

magnusbergh
Contributor
Author

But shouldn't I notice problems when using QV Desktop (opening a local file)? If I copy the same qvw file from the server to my local machine and run it there, it runs fast and with no problems. So which problems or operations will be a problem when run on the server but not in Desktop?

The problem is with a chart object (straight table). It has about 40-50 dimensions, including 6 calculated ones. All dimensions are conditional and the user selects which dimensions to display in the table. Data is displayed just fine as long as you don't select any of the calculated dimensions.

When running on QV Server I get problems even if I have selected something which would display as one row in the straight table, aggregating about 5k rows. The expression looks like this:

SUM({$<EnhetID={101}>}Värde * vMultiFromSelection)

vMultiFromSelection is just a constant to do unit conversion. I could probably remove the set selection because in this case only values with that selection are loaded.
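
Since it is just a constant, I guess I could also pull it outside the aggregation and expand it with a dollar-sign expansion, something like this (not verified yet):

SUM({$<EnhetID={101}>} Värde) * $(vMultiFromSelection)

That way the sum itself only touches one field.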

 

marcus_sommer

Any small difference may lead to a change of the dataset and/or of the UI calculations. Those differences are possible through different releases, machines or users - therefore it does matter where and by whom an application is loaded/opened.

At first you could look within the table viewer to see if all tables are there, if all associations (key values) look the same, and also that the tables contain the same number of records. After that you may look at all relevant fields - are the values properly loaded (just use list boxes to scan the appearance and min/max values)? Next would be the variables - do they contain all the expected values?

Somewhere, something is different.
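
If you want to compare the two environments systematically, a small diagnostic block at the end of the load script could write the table names and record counts to the reload log - just a sketch, no names from your application assumed:

// Trace every table name and its record count so the reload logs
// from Desktop and the server can be compared line by line.
FOR i = 0 TO NoOfTables() - 1
    LET vTable = TableName($(i));
    LET vRows = NoOfRows('$(vTable)');
    TRACE Table $(vTable): $(vRows) rows;
NEXT i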

- Marcus

magnusbergh
Contributor
Author

Which kind of differences do you mean? 

Tables look fine (there are some columns which could probably be deleted though) as far as I can tell. I have verified which rows are actually selected by adding a table box to display the rows, and it looks fine. I noticed (when working within Desktop) that selections seem to be a bit faster when removing the set selection from the summary and changing the data format of the summarized field from "mixed" to "numeric". I have not tried whether that makes any difference when running from the server though. I also have another calculated dimension without any set selection, and it has the same problem.
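
In case it matters, I could also force the field to be numeric already in the script, something like this (the table name, field names and format codes are just an example from my head, not verified on the real data):

// Force a clean numeric representation of the summed field during load,
// so the chart does not have to cope with mixed text/number values.
RawData:
LOAD * INLINE [
EnhetID, Värde
101, 1234.56
102, 78.9
];

Fact:
NOCONCATENATE LOAD
    EnhetID,
    Num(Num#(Värde, '0.00', '.', ',')) AS Värde
RESIDENT RawData;

DROP TABLE RawData;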

Are some operations a lot more expensive when running on the server compared to running locally? If so, which?

I also tried to put the same summary in a text box and that worked fine, even when running from the server. Is there something with the chart - straight table that behaves very differently when running on the server?

Thank you for your help.

marcus_sommer

Data might be reduced in the form of fields or records (section access), or not (properly) loaded due to different conditions, access rights or data interpretation. The same goes for variables. One missing link/association (direct or indirect) could cause certain calculations in the UI to run not against a rather small subset of the data but against the entire dataset, or even against a Cartesian product of it.

Therefore my suggestion to first check that the data are right.
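
One way to check this from the script side is to list the fact keys that have no match in the dimension table - a sketch with purely hypothetical table and field names:

// All names here are assumptions - replace them with your own tables and key fields.
Enhet:
LOAD * INLINE [
EnhetID, EnhetNamn
101, Stockholm
102, Göteborg
];

Fact:
LOAD * INLINE [
EnhetID, Värde
101, 10
103, 20
];

// Copy the dimension keys under another name, so Exists() only sees dimension values.
DimKeys:
LOAD EnhetID AS DimEnhetID
RESIDENT Enhet;

// List the fact keys that are missing in the dimension - many rows here point to a broken
// association, which is exactly what can push a UI calculation towards a Cartesian product.
MissingKeys:
LOAD DISTINCT EnhetID AS SaknadEnhetID
RESIDENT Fact
WHERE NOT Exists(DimEnhetID, EnhetID);

DROP TABLE DimKeys;

LET vMissing = NoOfRows('MissingKeys');
TRACE Missing fact keys: $(vMissing);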

You may also try to find the issue from the other side by deactivating/removing n percent (maybe 50% at first) of the dimensions and expressions and checking the performance. If it reacts fast again you could reverse it and check again, then deactivate/remove the next 50%, and so on... If this doesn't work you could try to create a completely new chart within a new application - just to exclude a corruption - it's quite seldom but sometimes it happens (before that you should of course check the used/available releases - they should always be the same, no mismatch between them).

- Marcus

magnusbergh
Contributor
Author

I added a new chart object, hard coded with 2 dimensions + a calculated dimension, and that displays just fine and fast, even when showing all rows (~20-30k rows in the straight table, aggregating 200 million rows in the underlying table).

In the real chart object there are some invalid dimensions and also some groups (including some with invalid dimensions) which are not used for anything. I will see if removing them helps.

When I am using QV Desktop I always log in with a user with admin rights, but when using the web server I log in as an ordinary user. Maybe when you are using an admin user it will disable (or re-evaluate in some way) invalid dimensions, but not when using an ordinary user? We are not using section access for column/row reduction.

marcus_sommer

In general, sheets/objects/dimensions/expressions that are not active/visible won't be calculated. But there are some exceptions, especially if calculated dimensions are used within dimension groups and/or variables - so the invalid dimensions you mentioned may cause such an impact. I would remove them - you may store them externally or separately within the object comment or similar, noting when, what and why was adjusted, to keep track of it.

- Marcus

magnusbergh
Contributor
Author

Removing those invalid dimensions worked (and the groups too - I removed all of them because they were not even used).

What is strange is that those dimensions have been there all the time (probably copy-pasted from another app by the previous developer) and it did work before. Qlik works in mysterious ways 🙂