Anonymous
Not applicable

data model optimisation

Hi,

I'm searching for ways to optimize the data model of my application to improve dashboard performance.

Actually, my source data is very simple. It follows the logic of an OLAP cube.

There's a dimension 'Advertisers', which is split by the dimension 'Brands'; 'Brands' are then split by 'Subbrands', and so on.

Advertisers > Brands > Subbrands > Year > Month > Media type > Media vehicle > Region > Product Category and so on ... and finally

> Budget.

It comes to me in no other format than a very big Excel table with thousands and thousands of rows and up to 20 columns, where each column corresponds to one dimension.


In the QV load script I normally do the following:

1) load this huge Excel table

2) store it as a QVD file and drop the table

3) load the fields from the QVD file into one final table

Then my data model is ready.
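As a sketch, the three steps above might look like this in the load script (file, sheet, and field names here are assumptions, not from the original post):

```
// 1) Load the huge Excel table
RawData:
LOAD *
FROM [MediaBudgets.xlsx]
(ooxml, embedded labels, table is Sheet1);

// 2) Store it as a QVD and drop the in-memory copy
STORE RawData INTO [MediaBudgets.qvd] (qvd);
DROP TABLE RawData;

// 3) Reload only the needed fields from the QVD into one final table.
// A plain field list with renames only and no transformations keeps
// the QVD load "optimized" (QlikView reads the file in its native
// format instead of unpacking every row).
Facts:
LOAD Advertiser,
     Brand,
     Subbrand,
     Year,
     Month,
     [Media type],
     Budget
FROM [MediaBudgets.qvd] (qvd);
```

Loading only the columns the dashboard actually uses in step 3 is usually the cheapest optimization, since every extra column costs RAM.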


Is it possible to optimize the data model and improve dashboard performance?


Thank you in advance,


Larisa

1 Solution

Accepted Solutions
marcus_sommer

From a GUI performance point of view, such a single table containing both facts and dimensions usually performs very well (apart from the RAM it needs; if you don't have enough RAM, performance will slow down massively). So performance will depend on the kind of calculations you are doing. I assume you don't have many records, maybe fewer than 1 M. Is the performance really (too) slow?

- Marcus


6 Replies

Anonymous
Not applicable
Author

Hi Marcus,

The number of records is less than 1 million at the moment, and the performance is not too slow yet. But as soon as a new dimension is added, I notice a bit of a slowdown.

I thought that maybe QlikView has a special data format (an OLAP cube) that is more effective than a huge straight table (in QVD format) with multiple dimensions.

Larisa

gautik92
Specialist III

drop unused fields
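To illustrate that suggestion: fields that no chart, listbox, or expression uses can be removed at the end of the script so they don't occupy RAM or inflate the symbol tables. A minimal sketch (the field names are hypothetical):

```
// Remove columns the dashboard never references
DROP FIELDS [Internal ID], [Load Timestamp];
```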

marcus_sommer

The way QlikView stores data (only byte-stuffed unique values per column, with bit-stuffed pointers associating them to the other fields) is a lot more efficient than the method of a classical OLAP cube; see: Symbol Tables and Bit-Stuffed Pointers.

What do you mean by a huge straight table? A table with 1 M rows and/or with a lot of (dozens of?) columns? Displaying such an object needs a lot of RAM and therefore time, especially to transfer it from the server to the client.

By contrast, an object that shows a more consolidated view of a few dozen or a few hundred rows, calculated over millions of rows, is normally very fast.

- Marcus

qlikviewwizard
Master II

Hi Larisa,

Please go through these documents.

Hope this will help you.

Thank you.

https://community.qlik.com/docs/DOC-7343

https://community.qlik.com/docs/DOC-4556

https://community.qlik.com/docs/DOC-5830

Anonymous
Not applicable
Author

Thanks everyone for your comments!