hasanarifkhan
Contributor

Qlikview performance issue in AccessPoint

Hi All,

We have developed a sales dashboard for our company but are now facing a performance issue: the dashboard takes around 15-20 seconds to perform its calculations. Expert opinions and suggestions are required.

Hardware Configuration:

Processor: Intel Xeon CPU E5-2650 v3 @ 2.3 GHz (20 cores with 40 logical processors)

RAM: 224 GB

The server is not fully utilized, as CPU and RAM usage never reach peak values. We are using a single FACT table with multiple dimension tables, and AutoNumber is applied to all key fields.

The FACT table contains 98 million records. The data model is attached for reference.

Regards,

Hasan Arif Khan.

6 Replies
marcus_sommer

Which kind of expressions are used? A simple sum(FIELD) or count(FIELD) should be quite fast, but using (nested) if() conditions, aggr() or inter-record functions on large tables can slow down an application quite heavily.
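For illustration, the difference Marcus describes might look like this (field names here are hypothetical, not from the original post):

```qlik
// Fast: simple aggregations the engine can parallelise across cores
Sum(Sales)
Count(OrderID)

// Slow on large tables: a row-by-row condition evaluated per record,
// and a nested aggregation that builds a large virtual table first
Sum(If(Region = 'EU' and OrderYear = 2019, Sales))
Avg(Aggr(Sum(Sales), Customer, Month))
```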

- Marcus

hasanarifkhan
Contributor
Author

Dear Marcus,

Thanks for reply.

We are using aggr() and count distinct functions due to business requirements, and these cannot be moved to the script level.

marcus_sommer

I didn't suggest moving the calculation to the script (although some pre-calculation with flagging or categorising of the data may help to reduce the complexity of the expressions and to speed them up); I was just asking about the expressions used. Some examples and screenshots of the objects would be helpful.
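As a sketch of the flagging idea, the condition from a chart expression can be computed once per row in the load script, so the chart only sums an integer field. This uses the field names from the KPI later in the thread; the QVD path and flag name are assumptions for illustration:

```qlik
// Load script: evaluate the business condition once per fact row
FACT:
LOAD *,
     If(IS_SECONDARY_SALES = 1 and Cartons > 0, 1, 0) As CountableSaleFlag
FROM Fact.qvd (qvd);
```

The chart expression can then use the flag (e.g. Sum(CountableSaleFlag) or as a set-analysis filter) instead of re-evaluating the condition against 98 million rows at click time.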

- Marcus

hasanarifkhan
Contributor
Author

Dear Marcus,

The FACT table contains sales (primary and secondary), stock, productivity and targets data. A flag field is maintained to fetch the right records in set analysis. A sample KPI is:

avg(aggr(count({<IS_SECONDARY_SALES={'1'},Cartons={'>0'}>} SKU), DOC_NO, DISTRIBUTORCODE))

diegomen
Contributor

Dear Hasan,
I have a similar server configuration, 48 cores and 512 GB of RAM.

In my experience the problem in this case is not the "brute power" of the server but the data model; I would try to reduce the number of levels of linked tables.

I've seen more than one level of linked tables cause performance impacts. If possible, try moving the fields of the tables distributionHyearchy, distributors, gandola and gadgethierachy into the top-level table and check the response of the dashboard.
The use of set analysis in any case slows down the output.
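A hedged sketch of that denormalisation in the load script; the table names come from the post above, while the QVD file names are assumptions:

```qlik
// Fold a lower-level dimension into its parent table, so the
// fact table reaches these fields through one hop fewer
distributors:
LOAD * FROM distributors.qvd (qvd);

LEFT JOIN (distributors)
LOAD * FROM distributionHyearchy.qvd (qvd);
```

The join key is whatever field the two tables already share; after the join, the intermediate link table can be dropped from the model.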


Sorry for my English, and please give me feedback 😉


Ciao.


marcus_sommer

It's not really surprising that this calculation is slow, because it counts SKU over the probably quite granular key of the fact table. The virtual table which Qlik needs to create to perform the calculation will be rather large, and on top of it the result is further aggregated (avg) across the fact and the distributor bridge table. Here you can find some background on what happens when a calculation is performed: The Calculation Engine.
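One possible rewrite of the KPI, not suggested in the thread itself: since an average of per-group counts equals the total count divided by the number of non-empty groups, the Aggr() can be replaced by a ratio over a composite key built once in the script. The key name %DocDist and the QVD path are assumptions:

```qlik
// Load script: build a compact integer key for the Aggr() dimensions
FACT:
LOAD *,
     AutoNumber(DOC_NO & '|' & DISTRIBUTORCODE) As %DocDist
FROM Fact.qvd (qvd);
```

The chart expression then becomes a flat calculation with no nested virtual table:

```qlik
Count({<IS_SECONDARY_SALES={'1'}, Cartons={'>0'}>} SKU)
 / Count({<IS_SECONDARY_SALES={'1'}, Cartons={'>0'}>} Distinct %DocDist)
```

This should return the same value as the original avg(aggr(...)) as long as groups with no matching rows are meant to be ignored, which is how Aggr()/Avg() treat them; verify against the original KPI on a data sample before swapping it in.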

There may be a few measures here and there with which you could improve this and that a bit, but I think you will need to change your data model to get a significantly better-performing dashboard.

- Marcus