Not applicable

What is the size of a typical report in QlikView?

I would like to move generated reports from one server to another, but without any data, only reports. Is that possible?


I have two servers in two different networks. One of them will contain all the data, but it doesn't have access to the internet. So I would like to move reports from one server to the other. Do I need to move the data as well, or can I move only the piece of data associated with the reports, or even only the reports?


PS What is the typical amount of RAM I need per core? Is there any recommendation? I know the answer is "it depends", but any ideas? (1 TB of source files, about 200 users)

10 Replies
Clever_Anjos
Employee

"I would like to move generated reports from one server to another, but without any data, only reports. Is that possible?" If you mean *.qvw files, you can move them without data, but they are useless. Remember that a qvw is a self-contained data+application


"Do I need to  move data as well or I can move only some piece of data associated" you need to move your qvw´s

"typical amount of ram do I need per core" the amount of RAM depends of qvw´s size

Peter_Cammaert
Partner - Champion III

Not sure I understand your question(s).

A QlikView document contains all the data. So your "reports" always include the data in the same file, otherwise there won't be any "reporting".

As soon as your end-user documents have been reloaded, you won't need any of the source QVDs or the data sources. But the QlikView documents themselves may have a substantial size, so I'm not sure whether this solves your issue.

To calculate an estimate of the RAM requirements for a machine with particular documents and numbers of users, there is this quick formula:

RAM per document = uncompressed document size + 10% for every simultaneous user

Calculate this amount for every document, and add a fixed amount for the OS and software. The problem is that this is just the amount required by QVS to serve documents to end-users. QDS (the reload infrastructure) also requires RAM, and that amount depends on the actual reload situation, which can only be estimated or monitored by executing the reloads.
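If it helps, here is a back-of-the-envelope sketch of that arithmetic in Python (the function name, the 16 GB OS/services allowance and the document sizes are all made-up example values, not a sizing recommendation):

```python
# Rough QVS RAM estimate per the rule of thumb above: for each document,
# uncompressed size plus 10% per simultaneous user, plus a fixed allowance
# for the OS and QlikView services.
def qvs_ram_estimate_gb(documents, os_and_services_gb=16):
    """documents: list of (uncompressed_size_gb, simultaneous_users) tuples."""
    total = os_and_services_gb
    for size_gb, users in documents:
        total += size_gb * (1 + 0.10 * users)
    return total

# Example: two documents of 40 GB and 60 GB uncompressed,
# with 20 and 10 simultaneous users respectively.
print(qvs_ram_estimate_gb([(40, 20), (60, 10)]))  # 40*3.0 + 60*2.0 + 16 = 256.0 GB
```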

Best,

Peter

Not applicable
Author

Thank you for the answer. The size of the QVW file will be around 0.5 TB, and up to 1 TB for sure, so pretty big files. I would be glad to hear your thoughts on typical RAM per core in that case, and in other possible cases.

Any ideas how big QVW files can get, and what is the largest solution that can be deployed? For example, the Oracle SuperCluster M6-32 has 32 TB of RAM. Can it actually handle 20 TB QVW files? And what if I have 10 such servers?



PS I can't access the link you provided; do I need rights for that?

oknotsen
Master III

Just to add:

On top of what Peter mentions, you will want to have plenty of free RAM that your QVS will use for caching. Caching will improve the average response time of your apps. Therefore I would blindly double that amount of RAM, unless of course all your calculations finish in 3 seconds.

May you live in interesting times!
Not applicable
Author

RAM_initial is calculated this way: RAM_initial = SourceDataSize * (1 - CompressionRatio) * FileSizeMultiplier

As far as I know, the CompressionRatio is about 0.8, but what about the FileSizeMultiplier? Am I right that (1 - CompressionRatio) * FileSizeMultiplier will be equal to 1?
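For concreteness, this is how I read that formula with made-up numbers (the 4x multiplier is only a guess on my part, not a measurement):

```python
# Made-up values to show how the factors combine; they are NOT measurements.
source_gb = 1000          # 1 TB of source files
compression_ratio = 0.8   # assumption from the post above
file_size_multiplier = 4  # assumed: in-RAM footprint ~4x the QVW size on disk

ram_initial_gb = source_gb * (1 - compression_ratio) * file_size_multiplier
print(ram_initial_gb)     # 1000 * 0.2 * 4 = 800.0 GB; the combined factor here is 0.8, not 1
```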

Peter_Cammaert
Partner - Champion III

First you need to ask yourself whether it is useful at all to load all data from an Oracle cluster into a single QlikView document. I guess not. There are always physical (speed) limits to what QlikView can cough up in the magical couple of seconds target response time.

Also, a single QlikView document cannot be segmented or spread over multiple systems. If you load 20TB in a single QlikView document, you'll have to scale up every single system hosting this document. If Windows can handle such requirements, then QlikView can too.

I don't really understand what you mean by "RAM per core". AFAIK RAM isn't allocated per core. In addition, QlikView is reasonably multi-threaded, so at any time multiple cores may be working on the same chunk of memory, and one core may be using multiple chunks of memory. More and faster cores, combined with more (and faster) memory, will make a high-performance system for a given number of users and documents (scaling up, or vertical scaling). If that isn't enough, you can cluster systems into what will appear to be one large system (scaling out, or horizontal scaling). That's how many enterprise users grow their systems together with their business.

If the link from Clever doesn't work for you, maybe you can try this one to an official document: Qlik Resource - QlikView Scalability Overview

Best,

Peter

Not applicable
Author

I know asking about RAM per core sounds ridiculous. The goal I'm trying to achieve is that I need to substantiate my sizing and then eventually get it approved. The amount of RAM looks reasonable, but there is a problem with the CPU. Are there any cases with pretty big QVW files? That would be nice ))

oknotsen
Master III

That calculation is a guesstimate, and since you put in the numbers yourself, you need to have some experience with similar data. In other words, it only works if you have made this calculation before with similar data.

I don't see why the result should be equal to 1. It could be 0.2, it could be higher. I don't know; I have no clue about your data, nor any experience with it.

In your opening post you are talking about "source files". That does not mean anything to me, as different source systems store their data differently, and on top of that Qlik stores the data differently again.

I am getting the impression you have already loaded the data into QVWs. If that is so, there is an easy way to get a decent guesstimate: check the properties of the QVW and make sure it is stored uncompressed. If so, the file size on disk will give you a decent idea (not perfect, but usually close enough) of how much RAM the base footprint of the app will take in memory.

I mentioned the base footprint. For every concurrent user past the first, add 10%. You are talking about 200 users; will all of them be using the app at the same time, or will it be more like 21 people during peak usage? Let's assume it is 21 people: 21 - 1 = 20, and 20 * 10% = 200%. So that means you should have 300% of the base memory footprint available in RAM, just for the app plus the sessions of those 21 users.
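In code, that rule of thumb is something like the following sketch (the 100 GB uncompressed QVW is just an example value, and the function name is made up):

```python
# Base footprint plus 10% per concurrent user beyond the first.
def app_ram_gb(uncompressed_qvw_gb, concurrent_users):
    extra_sessions = max(concurrent_users - 1, 0)
    return uncompressed_qvw_gb * (1 + 0.10 * extra_sessions)

# 21 concurrent users -> 20 extra sessions -> 300% of the base footprint.
print(app_ram_gb(100, 21))  # with an assumed 100 GB uncompressed QVW: 300.0 GB
```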

And then comes the last part:

Cache.

Qlik will try to store every calculation you make in the remaining memory of your server. It does this so that it does not have to waste valuable resources re-calculating the same difficult expressions over and over again. If there is hardly any memory left, it will have to recalculate. If you have a healthy amount of memory left, Qlik will eat it up for caching (which, by the way, is a very healthy thing and nothing to worry about).

It is pretty normal to see a server with 256 GB of RAM using 70% of its memory constantly.
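To make that concrete with assumed numbers (a sketch, not a sizing recommendation): a 60 GB app with 21 concurrent users eats roughly 70% of a 256 GB box, and whatever is left becomes cache.

```python
# Whatever RAM is left after the base footprint plus user sessions is what
# QVS can use for caching results. All numbers below are assumed examples.
server_ram_gb = 256                        # assumed server size
base_footprint_gb = 60                     # assumed uncompressed QVW size on disk
concurrent_users = 21
app_plus_sessions_gb = base_footprint_gb * (1 + 0.10 * (concurrent_users - 1))  # 180.0 GB
cache_headroom_gb = server_ram_gb - app_plus_sessions_gb
print(cache_headroom_gb)                   # 76.0 GB left over for result caching
```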

Hope that helps. No clear numbers, but that would require experience.

May you live in interesting times!