Anonymous
Not applicable

Heavy qvw document issues

Hi all,

I have run into some problems with a heavy QlikView document and would like to hear what you think is possible.

I have found some posts about this (https://community.qlik.com/message/863553), but none of them really solves my problem.

About my document
I have around 10 tables in my data model: one fact table and the rest are dimension tables. My fact table has 35 columns and >69,000,000 records. The total size of my document is 905 MB, which makes it very slow to open.

I use QlikView Publisher to reload and distribute my documents, split into two different jobs. The first job, which reloads the application, takes about 10 minutes and works fine. Then I have a distribution job that distributes the updated document to the QV AccessPoint (on another server), where a single AD group has access to it. The major issue is that the distribution job takes around 4 minutes to complete, which makes the application on the AccessPoint "freeze" for our end users during the update. I reload this application on a tight 30-minute schedule, so being offline 4 minutes twice an hour is a problem for us.

Possible solutions?
I think this boils down to two major problems: 1) the heavy size of the QV document, and 2) the long time the distribution job takes to complete. Regarding the first issue, I have tried to make the document lighter by dropping columns in my data model. As an experiment I dropped all tables except the fact table, but that document still ended up around 902 MB, so no real difference. Then I really optimized my fact table and managed to drop 8 more fields, but that only resulted in a fact table of 27 columns and >69,000,000 records, with a document size of 882 MB. Still heavy.
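For what it's worth, the dropping itself was just Drop Field statements at the end of my load script (the field names below are made up for illustration, not my real model):

    // Illustrative only - placeholder field/table names
    DROP FIELD RecordGUID FROM Facts;
    DROP FIELD LoadTimestamp FROM Facts;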

Two direct questions
1) Is it normal for this amount of data to result in a document size of more than 900 MB?
2) Is it normal for QV Publisher to need 4 minutes to complete the distribution task for a document this heavy?


Please reply with your suggestions for making my QV document as light and smooth as possible.
I appreciate any input. Thanks for your help.

Best regards,
Filip

5 Replies
datanibbler
Champion

Hi,

The solution depends a bit on what you do with this qvw and in what order you take the different steps.

The qvw would open a lot faster if you built in a routine that automatically empties it of data upon saving - the data makes up the bulk of the qvw's total size. That could possibly be done with a partial reload. I have two different locations - one productive and one backup - and in the backup location there is a small dummy txt file. The load script always looks for that file when reloading, and when it is found, the qvw is emptied of data.
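A minimal sketch of what that routine could look like (the flag-file path and name are assumptions, not my actual setup):

    // Hedged sketch: FileSize() returns NULL for a missing file, so the
    // tables are only dropped when the dummy flag file exists.
    IF Not IsNull(FileSize('C:\QV\Backup\empty_me.txt')) THEN
        // Drop from the last table down so the remaining indices don't shift
        FOR i = NoOfTables() - 1 TO 0 STEP -1
            LET vTable = TableName($(i));
            DROP TABLE [$(vTable)];
        NEXT i
    END IF

Put this at the very end of the script (or in the partial-reload branch), so the document is saved with its layout only and opens almost instantly.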

HTH

Best regards,

DataNibbler

marcus_sommer

I think the file size isn't unusual for about 69 M records, and dropping this or that field won't make a big difference (although if you don't really need them, leave them out). Bigger reductions in file size only come from dropping or adjusting fields with high cardinality - record IDs, timestamps, unrounded calculations and so on; see The Importance Of Being Distinct for what is meant, and the first sketch below. Reducing the number of records by consolidating some data areas might be another measure. Further, I think a reload time of about 10 minutes is quite long for this amount of data on a 30-minute schedule - incremental loading approaches might help here (second sketch below). Some links on this topic are included here: Advanced topics for creating a qlik datamodel
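A minimal sketch of the cardinality idea (table and field names are assumptions, not from your model): a raw timestamp is nearly unique per record, but split into a date and a time rounded to the minute it becomes two fields with very few distinct values, which compresses much better.

    // Hedged example - OrderID/OrderTimestamp are placeholder names
    Facts:
    LOAD
        OrderID,
        Date(Floor(OrderTimestamp)) AS OrderDate,                         // ~365 values per year
        Time(Round(Frac(OrderTimestamp), 1/24/60), 'hh:mm') AS OrderTime  // at most 1,440 values
    FROM Facts.qvd (qvd);

And an equally hedged sketch of the usual incremental-load pattern (connection, table and key names are again placeholders): pull only the new or changed rows from the database and append the untouched history from the previous QVD.

    // Assumes an existing CONNECT and a vLastReloadTime variable
    Facts:
    SQL SELECT * FROM Orders WHERE ModifiedDate >= '$(vLastReloadTime)';
    Concatenate (Facts)
    LOAD * FROM Facts.qvd (qvd) WHERE Not Exists(OrderID);  // keep only the old rows
    STORE Facts INTO Facts.qvd (qvd);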

But I believe your main issue is more likely a slow network between the publisher and the server. Try making similar copy jobs manually between both servers to see how long they take. Maybe there are further mechanisms that could slow those transfers down - proxies, firewalls, group policies and so on.

Besides that there are probably not many other possibilities - maybe you could try a high compression setting for the qvw, which makes the file significantly smaller, but compressing and de-compressing the application needs time and resources - meaning the open time within the AccessPoint will increase ...

- Marcus

Anonymous
Not applicable
Author

Can you please explain this technique in more detail, so I can try it?

Anonymous
Not applicable
Author

Thank you for a good reply.

I'm afraid my data model and fields are already as optimized as I can get them, and unfortunately it's hard for me to adjust any more high-cardinality fields.

Good point about a slow network between publisher and server. I tried to manually copy the file between the servers (which is the only thing the distribution job does in the publisher, besides adding some accesses, right?!) and it took less than 5 seconds. I'll have a look at the proxies and firewalls etc.

I already use high compression on all my QV documents. Otherwise it would be unsustainable to work with applications holding these data amounts in QV Developer.

marcus_sommer

That rules out the network. It could be that there are some settings which could be adjusted - maybe some timeouts, waits or similar (handling certificates or ...) - as I doubt the distribution service does much else in the meantime. Another point could be the QV release of your environment; sometimes installations are quite old ...

- Marcus