Qlik Community

Qlik Sense App Development

Discussion board where members can learn more about Qlik Sense App Development and Usage.

Partner

Regarding handling large data set in QS FrontEnd calculations

Hello Friends,

I appreciate your time. I am looking for the best approach to handle the scenario below.

I have a transaction fact table with customers and trans_id, connected to a product table via prod_id. When I try to create a chart of customer count with respect to trans_id for the products, I always run out of memory due to the high volume of transactions.

As of now I am creating a straight table using product_id as the dimension and Count(trans_id) as the expression.

Instead, can we do something in the QVD layer to pre-compute the transaction counts and display them in a table box? Would that resolve this issue?

 

Could you let me know your suggestions or feedback?

 

Thank you,

Kiruthiga

2 Replies
Partner

Hi Kiruthiga,

The first thing I believe you need to look into is your requirements... What exactly are users looking for, and do they really need transaction-level detail? (Everyone keeps saying they need everything, but typically 80% of the delivered functionality goes unused.)

It's always easy to add more RAM/CPU (just a matter of finding someone willing to pay for it), but are your current resources used efficiently, and have they really exhausted their potential?

In high-level dashboards it's good practice to use aggregated QVD files rather than transactional data. Depending on your use cases and scenarios you can also consider "document chaining", Direct Query, or analytical connections (pass the initial data set to a Python/R engine to calculate, and display only the returned result).
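As an illustration, an aggregated QVD could be produced in the load script roughly like this. This is only a sketch: the field and library names (customer_id, the lib:// paths) are assumptions based on the original post, not the actual model.

```
// Load the transaction-level QVD and pre-aggregate it by product,
// so the dashboard app only has to display pre-computed counts.
TransAgg:
LOAD
    prod_id,
    Count(trans_id)             AS trans_count,
    Count(DISTINCT customer_id) AS customer_count
FROM [lib://Data/transactions.qvd] (qvd)
GROUP BY prod_id;

// Store the small aggregated table as its own QVD;
// the front-end app loads this instead of the raw transactions.
STORE TransAgg INTO [lib://Data/trans_agg.qvd] (qvd);
```

The front-end chart then uses prod_id as the dimension and Sum(trans_count) as the expression, which keeps the in-memory data set tiny regardless of transaction volume.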

Also, to reduce the RAM footprint (which increases the amount of available RAM on the server), keep only the fields that are actually used in your application (no "just in case" fields allowed) and get rid of timestamps (split them into separate Date and Time columns; ideally, do not store time as an arbitrarily precise fraction, which is the default behavior). If you have a long textual transaction ID, consider Autonumber() or some other way of simplifying/converting it to a number.
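A minimal sketch of both techniques in the load script, assuming a source field named trans_timestamp and the path shown (both are hypothetical):

```
// Split the timestamp into Date and Time parts: each column has far
// fewer distinct values than one near-unique timestamp, so Qlik's
// symbol tables compress them much better.
Facts:
LOAD
    Date(Floor(trans_timestamp))                AS trans_date,
    Time(Round(Frac(trans_timestamp), 1/86400)) AS trans_time,  // rounded to whole seconds
    // Replace a long textual ID with a compact sequential integer;
    // the field remains usable for counts and key links within this app.
    Autonumber(trans_id)                        AS trans_id,
    prod_id
FROM [lib://Data/transactions.qvd] (qvd);
```

Note that Autonumber() values are only consistent within a single reload of one app, so it should not be used on keys that must match values generated in a different app.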

Basically, there are a lot of things you can do, but everything starts at the very beginning: with your requirements, objectives and limitations.

PS. All of the above assumes your Qlik installation is sized appropriately for your data volumes and number of users.

Hope this helps.

//Andrei

MVP & Luminary

Something doesn't sound quite right. Can you post the dimensions and expressions you are using, along with a screenshot of the data model?

-Rob