
QV Publisher, data model

Hey,

I have built an application that is updated overnight and reduced to different QVWs based on a field value.

However, I'm now considering whether to create different QVD files (one per QVW) or one large QVD that is split up into the different QVWs using the task described above.

I'm afraid that QVD would become very large, and because the QVWs will be updated at different times during the day, it might be hard to schedule updating the QVD. At the moment I'm leaning towards creating multiple QVD source files.


Can anyone offer their thoughts?


If I opt for creating multiple QVDs, how do I create one QVW that I can update for each file? (As you can imagine, I don't want to maintain 100 QVWs that are identical except for their source data.)


I have uploaded the model that I think best fits my purposes; however, I could use some help with the implementation on the right-hand side (QVD -> QVW).
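To make that concrete, here is roughly what I have in mind for the template QVW: a minimal sketch, assuming the per-customer QVDs are named after a variable vCustomer that is set at reload time (e.g. qv.exe /r /vvCustomer=CustomerA Template.qvw, or via a Publisher task). The path and names are hypothetical:

    // One template QVW; the source QVD is chosen by the vCustomer variable.
    Data:
    LOAD *
    FROM [..\QVD\$(vCustomer).qvd] (qvd);   // optimized load straight from the QVD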

12 Replies
Anonymous

Daniel

I am not convinced at all that your estimate of 1,250 TB of RAM for your 1,000 customers [i.e. a touch over a TB for a single customer] is remotely accurate.

How about, for testing, you take one customer and generate the QVD for just that customer?

Use that QVD to populate your QVW dashboard and see how much RAM it consumes.
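Something along these lines in the extract script would give you the test QVD; a minimal sketch, where the connection, table, field, and customer names are all placeholders:

    ODBC CONNECT TO [SourceDB];   // placeholder DSN

    CustomerData:
    SQL SELECT *
    FROM dbo.Transactions
    WHERE Customer = 'CustomerA';   // just the one customer for the test

    STORE CustomerData INTO [..\QVD\CustomerA.qvd] (qvd);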

Then use something like Rob Wunderlich's Document Analyser (Downloads - Rob Wunderlich Qlikview Consulting) to optimise RAM usage, especially by removing unused data fields and eliminating GUI tables that display huge numbers of rows.

I would wager that RAM usage will be a lot, lot less than you expect, and with more tuning it could be reduced even further.

Best Regards,
Bill Markham

Author

Yeah, obviously I meant 12.5 TB, and that was just an example; there could be more than 1,000 customers. Some customers have small databases (less than a gig) while others are larger. The 1.25 multiplier (i.e. expected RAM of roughly 1.25 × source data size) was provided by QlikTech as a rough rule of thumb.

As for your suggestion to test with one customer: yeah, we did. In fact, storing all customers in one combined file won't be a problem at all at first. However, it's best to be prepared for the future. :)

I guess our situation is pretty uncommon. Thanks for the help.

Author

You can set it up so you have one master QVW to load the data into, then distribute the master QVW with a loop-and-reduce task into X QVWs; those can then run another loop and reduce to provide one specific document per user.

And you don't have to publish the master document, as each user has their own file. Or split it down to working sizes in the second step, then use Section Access to control access and publish DataAA, etc.

It all comes down to what works and what doesn't. The hierarchy looks roughly like this:

DataA >
    DataAA
    DataAB
    DataAC >
        UsersA
            Andy
            Amy
        UsersB
            Bob
            Beowulf
        etc.
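Publisher's loop-and-reduce is configured on the distribution task rather than in script, but the first split (DataA -> DataAA, DataAB, ...) can also be expressed in script if you prefer to produce QVDs. A minimal sketch, assuming the master data sits in a table called DataA and is split on a hypothetical field named Region; all names and paths are placeholders:

    // Build one QVD per distinct value of the split field in the master table.
    SplitValues:
    LOAD DISTINCT Region AS SplitValue
    RESIDENT DataA;

    FOR i = 0 TO NoOfRows('SplitValues') - 1
        LET vValue = Peek('SplitValue', $(i), 'SplitValues');

        // Keep only this value's rows and store them as their own QVD.
        Reduced:
        NoConcatenate
        LOAD * RESIDENT DataA
        WHERE Region = '$(vValue)';

        STORE Reduced INTO [..\QVD\Data_$(vValue).qvd] (qvd);
        DROP TABLE Reduced;
    NEXT i

    DROP TABLE SplitValues;

In Publisher itself you would simply point the loop-and-reduce task at the same field and let it create the reduced QVWs for you.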