Pulkit_Thukral
Partner - Creator II

Application Optimization

Hi All,

We are struggling with a heavy application of around 120 million rows (including partial duplicates) and 3 GB on disk (uncompressed).

I have tried the following things:

1. Splitting the heavy dashboard into two dashboards (60:40 by size).

2. Enabling the pre-load option to decrease the load time for the first user, but no effect was seen.

3. Using a different landing page for the application: a light page with just a single button that moves to the heavy tab.

4. Selective loading, making some default selections.

5. Various other measures. I am attaching a document with all the measures followed.

Please go through the document to see the complete list of actions performed.

Even after all these steps, we are still struggling with the AccessPoint view of this application.

The initial load time for the first user is close to 60 seconds, while the load time after each selection averages 40 seconds.

We currently have 32 GB RAM and an 8-core CPU, and we have already requested an upgrade to 64 GB RAM and a 16-core CPU.

I am not sure how much this hardware increase will help.

Please suggest some ways to improve the performance of this application.

9 Replies
felipedl
Partner - Specialist III

Hi Pulkit,

Some suggestions based on your topics would be:

1. Since you split your dashboard into two visualizations, create two separate QVW files, each with its own data (for example, from the 120 million rows, dashboard 1 would hold 70 million and dashboard 2 would hold 50 million, if possible).

2. Discuss with the business owners what they really need to see; 120 million rows is very analytical and, to my understanding, would be quite difficult for business people to look at. You could try to aggregate the data to reduce the number of rows (the duplicated rows you mention, for instance).

3~4. Triggers with default selections help, but it would be good to understand what the business looks at most often, so you can put that information in one app (KPIs and primary dashboards) and the secondary data in another, in case there is a need for deeper analytical investigation.

5. Check your data and verify whether all the currently loaded fields and tables are needed in the final app.

That's what I can think of for now.

Felipe.

rwunderlich
Partner Ambassador/MVP

Interesting document. I would say there are some good ideas and some bad ideas in there.  It would be interesting to see a screenshot of your table model.

Have you run the app through Document Analyzer to identify some pain points?

Qlikview Cookbook: QV Document Analyzer http://qlikviewcookbook.com/recipes/download-info/document-analyzer/

Performance tuning a large app can be tricky. You or a member of your team might want to take a QV tuning course such as Qlik's "Performance and Scalability" training, my https://q-on.bi/courses/qlikview-document-performance-tuning/, or the "Masters Summit for Qlik".

-Rob

http://masterssummit.com

http://qlikviewcookbook.com

Pulkit_Thukral
Partner - Creator II
Author

Hi Felipe,

Thanks for your reply.

Those are some good suggestions, but unfortunately I have already tried a few of them, and the dashboard users showed no interest in the others.

We have to show last quarter's data, marking each instance on the dashboard, which comes to roughly 120 million rows.

Section access / sub-applications are not acceptable to the users.

Thanks,

Pulkit

Pulkit_Thukral
Partner - Creator II
Author

Hi Rob,

Thanks for letting me know about Performance Training.

Will definitely try for it.

As for the document, it would be great if you could mark the bad ideas listed there. This would help me roll back those changes.

I will share the table model screenshot once I log in to the production server. It will take me some time due to dependencies in getting access.

Thanks,

Pulkit

marcus_sommer

First, I would disable the document compression, which only saves storage but increases the saving and opening times of the application. Also, the network/storage you use could be a bottleneck, and moving to local SSD storage could speed things up.

Next, I would use Rob's Document Analyzer and then check whether the cardinality of some fields could be reduced; see: The Importance Of Being Distinct. Also check whether there are string fields that could be numeric fields, and consider removing formatting. More concretely, this means (see the script sketch after this list):

- splitting timestamps into dates and times

- replacing string keys with AutoNumber() or with numeric keys like:

  Date * 10000 + StoreID instead of Date & '|' & StoreID

- pure numeric dates like 42500 instead of 'YYYY/MM/DD', applying the field formatting within the document properties
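
To make that concrete, here is a minimal load-script sketch of those three points; the table and field names (Facts, EventTimestamp, StoreID, Amount) are made up for illustration:

Facts:
LOAD
    // split the timestamp: far fewer distinct dates and times than full timestamps
    Floor(EventTimestamp)                    AS EventDate,
    Frac(EventTimestamp)                     AS EventTime,
    // numeric composite key instead of Date & '|' & StoreID
    Floor(EventTimestamp) * 10000 + StoreID  AS %DateStoreKey,
    // alternatively, let AutoNumber assign compact sequential integers
    AutoNumber(Floor(EventTimestamp) & '|' & StoreID, 'DateStore') AS %DateStoreAuto,
    Amount
FROM Facts.qvd (qvd);

// keep the stored values purely numeric; set the display format for EventDate
// in Document Properties > Number instead of formatting it in the script.

With Floor() the date stays a plain integer (e.g. 42500), which is what the last bullet refers to.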

Another point could be the data model itself. Normally, star schemas or even one big flat table are recommended for huge amounts of data because of their better UI performance, but they consume more RAM than other data models, so if your UI performance is already good enough you might make changes here.
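
For illustration, denormalizing one dimension into the fact table could look like the following sketch (Stores.qvd and its fields are hypothetical):

Facts:
LOAD %DateStoreKey, StoreID, Amount
FROM Facts.qvd (qvd);

// fold the dimension into the fact table to get a flatter model
LEFT JOIN (Facts)
LOAD StoreID, StoreName, Region
FROM Stores.qvd (qvd);
// fewer hops for the chart engine, but StoreName/Region are now repeated
// on every fact row, which is the extra RAM cost mentioned above.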

- Marcus

MK9885
Master II

We're loading around 400 million records, but our server RAM is 500 GB.

Of course, RAM makes a lot of difference.

Extract layer: try to bring in only the required records (using join/union/where etc. if the source is SQL). Do not bring in the full raw data; store the result in QVDs.

Transform layer: use the extracted QVDs to load the data and transform it further if required. Create any new dimensions you plan to use in the front end using basic IF conditions.
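
A rough sketch of that extract/transform layering (the DSN, table, and folder names below are invented; adjust to your environment):

// Extract layer: pull only the rows you actually need and store them as a QVD.
ODBC CONNECT TO SalesDSN;

SalesExtract:
LOAD OrderID, OrderDate, StoreID, Amount;
SQL SELECT OrderID, OrderDate, StoreID, Amount
FROM SALES
WHERE OrderDate >= '2018-01-01';   // e.g. last quarter only, not the full history

STORE SalesExtract INTO ..\QVD\Extract\Sales.qvd (qvd);
DROP TABLE SalesExtract;

// Transform layer (separate QVW): read the extracted QVD and add derived
// dimensions here, in the script, rather than in chart expressions.
Sales:
LOAD *,
     If(Amount >= 1000, 'Large', 'Small') AS OrderSizeBucket
FROM ..\QVD\Extract\Sales.qvd (qvd);

STORE Sales INTO ..\QVD\Transform\Sales.qvd (qvd);
DROP TABLE Sales;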

Source app: use set analysis, use Pick(Match()) instead of IF, and use buttons to hide/show sheets.
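
As an illustration of the Pick(Match()) point, a nested IF in a chart expression can usually be rewritten like this (the field and bucket names are made up):

// nested IF, evaluated branch by branch for every row:
// If(Region = 'N', 'North', If(Region = 'S', 'South', 'Other'))

// equivalent Pick/Match version; Match returns 0 when nothing matches,
// so 'Other' sits in the first position:
Pick(Match(Region, 'N', 'S') + 1, 'Other', 'North', 'South')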

Minimize objects that are not used; a hidden object still consumes memory.

Finally, use Document Analyzer (V3.0) to test in detail.

rwunderlich
Partner Ambassador/MVP

Three items that jump out as sometimes bad recommendations:

- If the application consists of logically separate modules (for example, ITSM and Monitoring contain different data and are not much related), avoid a common calendar, which requires creating a distinct key in each module. Keep all modules separate with a separate calendar on each tab. This will make the application faster and reduce the script load time drastically.

- Try to store fields having duplicates separately. For example, a date field would have MM and YYYY common to most rows, so store it separately and join it in the final UI. This would compress the size to a huge extent.

- Highly normalized tables are best for CPU and memory use.

MK9885
Master II

Great point. I'd like to know more about consolidating multiple date fields into one.

I have 3 tables and they all have their own date fields...

Can I left join those fields based on a key, and maybe use a master calendar and map it to the fact table?

Or would just joining them to the fact table and later dropping the date fields from the dimension tables work?
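
One common pattern, sketched here under the assumption of a single fact table (Fact) with an Orders dimension carrying OrderDate, is to move the date onto the fact via the shared key, drop it from the dimension, and build one master calendar on the fact date; whether this fits your model is a judgment call:

// 1) bring the dimension's date onto the fact table via the shared key
LEFT JOIN (Fact)
LOAD OrderID, OrderDate
RESIDENT Orders;

// 2) avoid a second copy of the field in the data model
DROP FIELD OrderDate FROM Orders;

// 3) one master calendar, built from the fact date (preceding loads)
MasterCalendar:
LOAD
    TempDate                        AS OrderDate,
    Year(TempDate)                  AS Year,
    Month(TempDate)                 AS Month,
    'Q' & Ceil(Month(TempDate) / 3) AS Quarter;
LOAD
    Date(MinDate + IterNo() - 1)    AS TempDate
WHILE MinDate + IterNo() - 1 <= MaxDate;
LOAD
    Min(OrderDate) AS MinDate,
    Max(OrderDate) AS MaxDate
RESIDENT Fact;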

MK9885
Master II

Also, if the join is happening within the same table, is using a preceding load instead of a resident load a good approach?

I'm using ApplyMap within a preceding load, but not with a join/resident load.
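
For what it's worth, a small sketch of ApplyMap inside a preceding load (the mapping table and field names are invented); in many cases the mapping replaces a join entirely, so no extra resident pass is needed:

// the mapping table has to be declared with MAPPING LOAD before it is used
StoreMap:
MAPPING LOAD StoreID, StoreName
FROM Stores.qvd (qvd);

// preceding load: the upper LOAD consumes the output of the lower one
// in a single pass, so the lookup needs no second resident load
Fact:
LOAD *,
     ApplyMap('StoreMap', StoreID, 'Unknown') AS StoreName;
LOAD OrderID, OrderDate, StoreID, Amount
FROM Sales.qvd (qvd);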