Not applicable

Performance question

Hi

I developed a qvd, not very complex, with roughly two major fact tables: one like orderheader and one like orderdetail.

One has 80k rows and the other has 200k rows.

I also developed several charts on the document's main page.

My desktop has 16 GB of memory, big enough for the app.

But I noticed it takes about 20 seconds to refresh: several charts recalculate sequentially, each taking about 5 to 6 seconds.

That is too slow for me.

Meanwhile, it works fine on the server.

So my question is: what makes it slow on my desktop? Memory? A single-core CPU? Or something else?

Do we have any guidelines for performance?

Thanks.

6 Replies
effinty2112
Master

Hi,

To clarify: by a qvd with two major facts, do you really mean a qvw with two tables, orderheader (80k rows) and orderdetail (200k rows)?

Regards

Anonymous
Not applicable
Author

Hi Cooker,

It might have something to do with how you are loading your data. Check your script and make sure there are no unnecessary duplicates or data duplicating itself. There could be other factors involved, but I doubt it has anything to do with your machine.
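As a minimal sketch of that check (the key field name OrderID is an assumption, and it assumes the orderdetail table is already loaded in the script; substitute your actual key and table), you can compare total and distinct key counts after the load:

// Compare total rows against distinct keys in the detail table.
// If TotalRows is far higher than DistinctKeys beyond what the data
// model expects, the script is probably duplicating data somewhere.
DupCheck:
LOAD
    Count(OrderID)          AS TotalRows,
    Count(DISTINCT OrderID) AS DistinctKeys
RESIDENT orderdetail;

If the two numbers look wrong, trace back through the joins and concatenations in your script.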

Not applicable
Author

You can go to the Operations Monitor app and open the "Performance" sheet, which tells the whole story about performance.

Analyse the report and make the necessary changes.

Anonymous
Not applicable
Author

avinashelite

Hi,

1. Your server has far more memory than your PC; that is why it loads faster on the server.

2. Try to avoid calculated dimensions in charts; precomputing them in the load script helps the charts render faster (see the sketch below).

3. Your machine's CPU is also working on various activities apart from the QlikView app, so you will face this issue on a local system.
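As a minimal sketch of point 2 (the Amount field and the bucket labels are illustrative assumptions, not from the original post): instead of a calculated dimension such as =If(Amount > 1000, 'Large', 'Small') that the chart re-evaluates on every recalculation, compute it once in the load script and use the resulting field as a plain dimension:

orderdetail:
LOAD
    *,
    // Precompute the bucket once at reload instead of on every chart redraw
    If(Amount > 1000, 'Large', 'Small') AS OrderSize
FROM orderdetail.qvd (qvd);

The chart then uses OrderSize directly, so the expression is never recalculated when selections change.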

Not applicable
Author

80k and 200k are not that many rows for QlikView.

Some operations may limit QlikView to a single CPU core for the calculation (count distinct might have been one of those), so I assume the problem is either in some of the expressions (an if() operation can get costly) or in the data model itself (an expression doing a Cartesian join, which joins everything with everything, would produce 80k x 200k rows in your case).
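As a minimal sketch of the data-model point (the field names below are assumptions): make sure the two tables share exactly one key field, so QlikView associates them cleanly instead of building a synthetic key:

orderheader:
LOAD
    OrderID,        // the single shared key
    CustomerID,
    OrderDate
FROM orderheader.qvd (qvd);

orderdetail:
LOAD
    OrderID,        // only this field name is shared with orderheader
    ProductID,
    Quantity,
    Amount
FROM orderdetail.qvd (qvd);

If more than one field name is shared between the tables, rename or QUALIFY the extras so the association stays on the single key.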

If you have multiple expressions, try disabling them one at a time until you see performance improving.
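As an example of the if() point (the field names are assumptions): a conditional sum written with a row-by-row if() is usually much slower than the equivalent set analysis, so rewriting the expressions you identify this way is a common fix:

// Row-by-row if(): evaluated for every row of the fact table
Sum( If(Status = 'Open', Amount, 0) )

// Equivalent set analysis: the engine filters via its indexes first
Sum( {< Status = {'Open'} >} Amount )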