Qlik Community

New to QlikView

Discussion board where members can get started with QlikView.

Contributor II

table performance issues

Hello,

I've read multiple articles about this topic, but none really delivers what I need.

I'm fairly new to QlikView, so please excuse any naive questions or statements.

I want to build freight reporting in the form of an ad-hoc report. So far this was done in Excel because it was easy for everybody to maintain and work with. QlikView should mainly reduce the effort of formatting the roughly 20 different reports coming in every month and provide a dynamic ad-hoc report.

I'm using multiple applications to:

1. Load each of the 20 reports month by month to create a year-to-date table for each report and save it as a .qvd.

2. Format each table according to one master format and create one master table with the Concatenate function.

3. In the user interface, only one .qvd file is loaded, along with a number of flags I set for the analysis.
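In load-script terms, the three layers look roughly like this (just a sketch; the file and field names are hypothetical, not my actual ones):

```qlik
// Layer 1 (one app per report): append the new month to the YTD QVD
ReportA:
LOAD * FROM [ReportA_NewMonth.xlsx] (ooxml, embedded labels);
// Append what was accumulated so far (skip on the very first run)
Concatenate (ReportA)
LOAD * FROM [ReportA_YTD.qvd] (qvd);
STORE ReportA INTO [ReportA_YTD.qvd] (qvd);

// Layer 2: map every report to the master format and concatenate
MasterTable:
LOAD ... FROM [ReportA_YTD.qvd] (qvd);   // fields renamed to master format
Concatenate (MasterTable)
LOAD ... FROM [ReportB_YTD.qvd] (qvd);
STORE MasterTable INTO [MasterTable.qvd] (qvd);

// Layer 3 (UI app): load only the finished master QVD plus flags
LOAD * FROM [MasterTable.qvd] (qvd);
```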

The master table has about 50 dimensions and 20 expressions, with roughly 180k year-to-date rows.

The purpose of this report is to give the user all the available information and the option to dig into certain dimensions or to highlight information using filters.

In the beginning, when no filters are applied, the table is almost impossible to work with due to loading times.

Now I don't really know what to work on to improve the performance.

Is it the data structure, i.e. putting all the data in one table?

Too many dimensions in one table?

I attached my table structure.

I'd appreciate any input on what to work on.

Thanks in advance,

Lucas

1 Solution

Accepted Solutions
MVP & Luminary

Re: table performance issues

I hope you are not running on an old XP system ... personally I see no sensible reason not to upgrade to a 64-bit OS, and your company could already buy better hardware for a few hundred euros. Otherwise the 2 GB limit will hit you again and again ...

Good luck.

- Marcus


9 Replies
MVP & Luminary

Re: table performance issues

Even with 78 fact fields, an application with about 180k rows is quite small and should respond very quickly, and the data model looks quite suitable to deliver that. Therefore I think the reason will be the kind of expressions you use - probably nested if- and aggr-constructs - or something is wrong with how the loosened-table values are used for the (presumably) dynamic dimensions/expressions.

Please provide more information about what your GUI looks like and which expressions are used.

- Marcus

Contributor II

Re: table performance issues

Hi Marcus,

Thanks for your input. I assumed that the amount of data can't be the problem.

For the dimensions I use the following expression:

=$(=only({$ <_dimensionNo={1}>} _dimension))

with conditional show

=SubStringCount(Concat(_dimension, '|'), '$(=only({$ <_dimensionNo={1}>} _dimension))')

and dimension names

='$(=only({$ <_dimensionNo={1}>} _dimensionLabel))'

For the expressions the same

=$(=only({$ <_expressionNo={1}>} _expression))

=SubStringCount(Concat(_expressionName, '|'), '$(=only({$ <_expressionNo={1}>} _expressionName))') <-- conditional show

='$(=only({$ <_expressionNo={1}>} _expressionName))' <-- dynamic expression names

Yes, the dimensions and expressions are dynamic.
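For context, the _dimension/_expression fields referenced above live in small island tables with no link to the fact table, loaded roughly like this (a sketch; the values are just examples, not my full lists):

```qlik
// Island tables: no field names shared with the fact table,
// so QlikView leaves them unassociated ("loosened" from the model).
DimensionIsland:
LOAD * INLINE [
_dimensionNo, _dimension, _dimensionLabel
1, Carrier, Carrier
2, OriginCountry, Origin Country
];

ExpressionIsland:
LOAD * INLINE [
_expressionNo, _expressionName, _expression
1, TEU, sum([TEU])
2, Shipments, count([Shipment ID])
];
```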

There are two

What do you mean by

"or something is wrong with how the loosened-table values are used for the (presumably) dynamic dimensions/expressions"?

What could be wrong? Any mistaken reference would just lead to the dimension/expression not showing up, I would assume.

Lucas

MVP & Luminary

Re: table performance issues

Your dynamic dimensions/expressions look fine - I don't see any issue there. Loosened tables are tables which aren't associated with the data model; if there are calculations between fields from these tables and fields from other tables, QV doesn't know what should be calculated and calculates everything against everything, which can take (some) time.

But for usability features like dynamic dimensions/expressions they are very useful.

Which formulas need the most time to calculate? With such a small application, hardware resources and working-set configuration shouldn't matter - but nevertheless, what do you use (CPU, RAM, QV version - client/server ...)?


- Marcus

Contributor II

Re: table performance issues

Glad to hear that the expressions are fine.

It's not individual formulas that cause the lags. It's just that you can't navigate through the application any more once the report is fully opened.

CPU: i5-3340M @ 2.7 GHz

RAM: 4 GB (3.43 GB usable)

32-bit operating system.

QV desktop Version: 11.20.12018.0 SR 3

-Lucas

MVP & Luminary

Re: table performance issues

It's a quite old QV release, and you are using a 32-bit OS, which means no application can address more than 2 GB of RAM. Still, that should be enough unless some of the fields are high-cardinality fields with timestamps or larger text content. Have a look in the Task Manager at how much RAM QV needs, and whether virtual RAM (on the hard disk) is used when selections are made (such swapping is slow); also check the working-set limits in the user preferences (General tab, bottom right).

Also, are any actions/macros in use? What kind of expressions are in your expression table?

- Marcus

Contributor II

Re: table performance issues

Some individual values might contain 1-2 sentences, but only occasionally; no timestamps.

When the report is running I'm at 60% physical memory usage, and QV is using about as much memory as IE (about 130k), which surprises me.

But still, when adding/removing dimensions/expressions the report loads/lags significantly, which means working with the report is almost impossible.

No macros in use (I still need to figure out what is meant by that in QV).

The expressions are all pretty basic:

_expression
num(count([Shipment ID]), '#.##0;- #.##0')
num(sum([TEU]), '#.##0;- #.##0')
num(sum([Gross Weight in KG]), '#.##0,00;- #.##0,00')
num(sum([Chargeable Weight in KG]), '#.##0,00;- #.##0,00')
num(sum([Volume in CBM]), '#.##0,00;- #.##0,00')
num(sum([Number of Pieces]), '#.##0;- #.##0')
num(sum([Freight Charges]), '#.##0,00;- #.##0,00')
num(sum([Fuel Surcharge]), '#.##0,00;- #.##0,00')
num(sum([Security Surcharge]), '#.##0,00;- #.##0,00')
num(sum([DG Fee]), '#.##0,00;- #.##0,00')
num(sum([Other Charges]), '#.##0,00;- #.##0,00')
num(sum([Total Invoice Amount]), '#.##0,00;- #.##0,00')
num(sum([Debit to Wiesbaden]), '#.##0,00;- #.##0,00')
num(sum([Debit to 3rd Party]), '#.##0,00;- #.##0,00')
num(avg([Total Cost per KG]), '#0,00')
num(avg([On-Time Flag]))

MVP & Luminary

Re: table performance issues

The expressions are harmless from a performance point of view. But using about 60% of 4 GB of RAM with a 32-bit QV installation suggests that QV is near the limit of what a 32-bit application can address. Your description of everything being recalculated on each action fits exactly the situation where RAM is swapped and/or no caches are used (or they are removed and recreated each time), or similar RAM problems (often seen in VMs without fixed resources for QV).

To check this you could load only part of your data, e.g. with a where-condition or a First statement before the load, like:

First 10000

Load ...

Or use the debugger for loading - it has an option to limit the amount of data. If the application then responds "normally", you need to reduce the amount of data/fields in your document or upgrade to a 64-bit OS (and QV11 SR12, too).
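A slightly fuller sketch of such a test load (file and field names are hypothetical):

```qlik
// Load only the first 10,000 rows of the master QVD to test whether
// sheer data volume is what makes the UI unresponsive.
MasterData:
First 10000
LOAD *
FROM [MasterTable.qvd] (qvd);

// Alternative: restrict with a where-condition, e.g. one month only
// (note: a WHERE clause turns off the optimized QVD load):
// LOAD * FROM [MasterTable.qvd] (qvd)
// WHERE [Report Month] = 'Jan-2014';
```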

- Marcus

Contributor II

Re: table performance issues

OK, my summary would be that the performance problem is not related to expressions/calculations in the application. Since upgrading the OS or increasing RAM is not an option for our IT department, I will try loading the application with less data, update to SR12, and see how it goes. Worst case, I will have to reduce the complexity (dimensions).

Anyways, thank you very much for your effort and input to find a solution to my problem!!

- Lucas
