amirmohamed
Creator

Large size of QVD or app

Hi all,

I am struggling a lot here.

We have an Order Details Qlik Sense dashboard; it contains 32 columns and 8 filters.

QVD: Every morning at 5 AM we reload all data from the Order Details SQL script and save it to a QVD.

The data in the dashboard app:

1st, we load Today's Orders from the SQL script.

2nd, we load the QVDs created above and concatenate that data to the 1st table using:

Where not Exists(unique_link,unique_link_old);

The reason for the Where Not Exists is that an order updated today will also be in the QVD file, so Not Exists keeps only the updated version from today's data.
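The pattern described above can be sketched in Qlik load script roughly like this (library path, table names, and the date filter are assumptions, not the actual script; only the Exists predicate is taken from the post):

```qlik
// 1st: today's orders straight from the database
Orders:
SQL SELECT * FROM OrderDetails WHERE OrderDate >= '$(vToday)';

// 2nd: the historical QVD, skipping any order that was already
// loaded (i.e. updated) today; unique_link_old is assumed to be
// the key field stored in the QVD
Concatenate (Orders)
LOAD *
FROM [lib://Data/OrderDetails.qvd] (qvd)
Where Not Exists(unique_link, unique_link_old);
```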

When we load this data, the Qlik app is around 1 GB, with roughly 19 million rows and 40 columns.

Then, in the front end of the dashboard, we have only one table, but when we select one option from the filters the table takes over 10 minutes to load, even if the result is only 10 rows.

We also have an If(GetSelectedCount(...)) calculation condition in the Data handling section of the table.

 

I need help with two things:

1: Why do 19 million rows take over 1 GB?

2: Is Where Not Exists the best option in this case, or is there a better approach?

6 Replies
JordyWegman
Partner - Master

Hi,

There can be multiple reasons why an app is so big, but the main ones are:

  1. Number of unique rows
  2. Size of the field values (in combination with their uniqueness)
  3. Number of columns

When the field values are large, you can replace them with unique numbers and move the original values into a separate dimension table.
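One common way to do this in Qlik is AutoNumber(), which replaces long unique string keys with compact sequential integers; a minimal sketch, where the table, path, and field names are assumptions:

```qlik
// Replace a long unique string key with a small integer key
Orders:
LOAD
    AutoNumber(unique_link) As unique_link_id, // compact integer key
    OrderDate,
    OrderAmount
FROM [lib://Data/OrderDetails.qvd] (qvd);
```

Note that applying a function like AutoNumber during the load makes the QVD read non-optimized, so the RAM saving comes at some reload-time cost.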

Also, try creating a new application and see if you still have the slow-dashboard problem there.

Jordy

Climber

Work smarter, not harder
dwforest
Specialist II

This might be helpful. Another thing to consider is whether you really need all 19 million rows. Measures in a table can also add to the delay after filtering.

https://community.qlik.com/t5/New-to-Qlik-Sense/Qlik-Sense-Document-Analyzer/td-p/1607245

amirmohamed
Creator
Author

I have tried the points you mentioned: I removed the columns/fields that were not needed or not used, and that reduced the app by 300 MB, which is really great. Thank you for this.

The table has no measures at all; it's a straight table showing only dimensions, and I have only one calculation condition: If(GetSelectedCount(field) = 0, 0, 1).

But I still see the table loading the data very slowly.

amirmohamed
Creator
Author

Unfortunately yes, we need all 19 million rows. I have used the Document Analyzer, which helped me remove some fields I didn't need in the report.

JordyWegman
Partner - Master

19 million rows is some data, but not too much. Can you check the server performance?

Also, are you using calculated dimensions? Try to create these in the back-end/script instead, since they can slow your performance considerably. Is it also slow with a table that has no calculations at all?
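Moving a calculated dimension out of the chart and into the script might look roughly like this (the field names and the threshold are hypothetical, just to illustrate the idea):

```qlik
// Instead of a chart dimension such as =If(Amount > 100, 'Large', 'Small'),
// compute the value once at reload time and use the plain field in the chart:
Orders:
LOAD
    *,
    If(Amount > 100, 'Large', 'Small') As OrderSizeBucket
FROM [lib://Data/OrderDetails.qvd] (qvd);
```

The expression is then evaluated once per row at reload, rather than on every selection in the front end.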

Jordy

amirmohamed
Creator
Author

Hi All, 

Thank you for all your replies, but I have found the issue.

I had one table (Orders) in the script, and I was left joining another table to the Orders table on a single key between them, but the second table had more than one line per order.

For example:

Table2

Order    Date        Product
ord123   15/11/2019  sku204
ord123   15/11/2019  sku205
ord123   15/11/2019  sku206

 

Then I wanted to left join to the Orders table based on Order and add only Date to the Orders table; in my SQL script it was (select order, date from table2).

This was the issue: it was not pulling distinct data into the Orders table, because I hadn't added DISTINCT for table2.

I added DISTINCT, and it now pulls the same data without any duplicated lines.
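The fix can be sketched like this; a preceding LOAD DISTINCT also guards against duplicates on the Qlik side (field names follow the example above, the rest is assumed):

```qlik
// Join only the distinct Order/Date pairs so the left join cannot
// fan out the Orders table
Left Join (Orders)
LOAD Distinct
    "order",
    "date";
SQL SELECT DISTINCT "order", "date" FROM table2;
```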

This has significantly improved the performance: initially it took 10 seconds to load one order with 10 lines, and now it loads over 3,000 orders with over 30,000 lines in about 9 seconds.

I hope this makes sense.