Not applicable

How much data can QlikView handle?

Guys,

We are conducting a vendor review comparing Tableau, QlikView, Spotfire, and Xcelsius.
How much data can QlikView handle? According to QlikView's manual, there is no limit on the amount of data other than what the source database can supply. But from your experience, how many GB can QlikView handle on a server of a given size?

Also, please point me to any resources with data comparing these four tools.

Thanks,
Atlas.

22 Replies
Not applicable
Author

Hey Atlas,

You need to evaluate QlikView in slightly different terms.

Since QlikView is an in-memory associative technology, you're limited only by the number of records you can fit in memory.

That said, your concept of the 'quantity' of data also needs to change. A database that is (as an example) 4 GB on your database server won't be 4 GB in memory. QV is highly optimized and generally speaking sees a compression rate of approximately 80% (yup, I said 80), so that 4 GB database might occupy only around 800 MB of RAM. I won't go into detail about HOW it does it, but it's not a big secret.

From there, the complexity and number of things you do to your data also impact its size in memory. The number of users hitting each QV document will inflate the size slightly as well.

Basically, there is no direct answer to your question. There is no 'limit' on the 'number' of records; it's not like Excel, which can only handle 65k rows on a sheet. The application is limited only by the physical capacity of the server running the document, the complexity of that document, and the density of the data.

Given an adequately beefy server, QV could handle anything you could ever throw at it (and handle it damned well, I might add).

We went through the same process, and attempting to compare QV head-to-head with basically any other solution occasionally looks like nonsense, because QV is a completely different paradigm.

I would recommend you download the free 'Personal Edition' of the software; it's quite easy to use and get a sense of, though a bit more complex to master or develop with great efficiency.

johnw
Champion III

This won't mean much, but just as a random data point, I can run or reload MOST of our applications on an old 32-bit dual-core machine with 2 GB of RAM. Only the largest applications require me to swap over to a 64-bit single-core machine with 4 GB of RAM. Our servers are a little beefier, but honestly, not by much. I think the main server is a quad-core 64-bit with 8 GB. We're serving about 100 users with about 100 document accesses per day between them. Our biggest tables in our biggest documents are probably in the lower tens of millions of rows. Performance is good most of the time.

There are links to several reports comparing QlikView to other tools in the resource library, but the two I've read don't really discuss required server size. They're at a much higher level than that.

http://www.qlik.com/us/explore/resources.aspx

Anonymous
Not applicable
Author

John, a friend asked me about data size once.

They already have a DW with 8TB.

Can QlikView deal with it?

I don't know about numbers of users or any other information.

Regards,

Leandra Scordamaglia

johnw
Champion III


Leandra Scordamaglia wrote: 8TB. Can QlikView deal with it?
I'll give it a "maybe". My understanding is that QlikView is designed for gigabyte databases, not terabyte databases. But I don't think there's a fundamental limitation preventing its use on terabyte databases.

If we needed the ENTIRE data warehouse in memory at the same time, even with 10x compression that's 800 GB of data. Each concurrent user adds working memory on top of that, so let's say roughly 2 TB would support about 20 simultaneous users. That's how much RAM we'd need, because we can't afford to swap to disk. In theory, the beefier editions of Windows Server can address up to 2 TB of physical memory, so on paper we CAN probably have our entire warehouse in memory with the right hardware and software.

In practice, it should be simpler than that, and we should be able to support a lot more concurrent users than that. We probably won't need the entire data warehouse in memory. We don't typically make one monolithic QlikView application that covers every single aspect of the business. We make QlikView applications that are targeted at specific areas of the business and particular groups of users. Each of these will take on only a small portion of the total data in the data warehouse. In the long run, between them all, if we have an active user base, we MIGHT still need as much memory as I mentioned, or even significantly more, but it could probably be spread across multiple servers rather than one giant box. So I suspect it's just a matter of economics rather than any fundamental limitation.

Unfortunately, our data volumes are MUCH lower than this, so I'm not sure how applicable my direct experience will be. I'm going to guess that we're currently managing about 50-100 GB of raw data in QlikView. I think we have several hundred users, but probably only a handful are active at any one time. I believe we're handling that on one physical server broken up into several virtual servers. I think the physical server has only 8 GB. I don't manage the servers, so I'm not entirely certain.

Maybe someone from a bigger shop can comment better on how well QlikView is handling larger data volumes like this. There must be someone with this much data or more. 8 TB isn't really all that much data in the grand scheme of things.

Not applicable
Author

Hi,

Does the operating system I'm working on matter in terms of how much data I can reload?

I'm working on Windows Vista, using QlikView 9, and have 2 GB of RAM on my machine.

I was trying to load a CSV of around 5.2 GB containing around 18 million records, and it gave the "Out of Logical and/or Virtual Memory, allocating 2 MB" error. (The data has around 80 columns containing both text and numbers.)

Since then I have tried increasing the amount of virtual memory to 4096 MB and reloading the data, but it still showed the same error.

Then I tried breaking the CSV into 3 smaller CSVs of around 1.5 to 1.7 GB each, but I still can't load the data.

I don't think there is a problem with the data itself, as it works when I do a limited load.

Could you suggest anything else which would help in loading the data?

Not applicable
Author

Hi anant_iyer1

Yes, loading depends on your OS. If you are using 32-bit Windows (regardless of the version), the hard limit is 4 GB of RAM, because 2^32 bytes is 4 GB. Even if you have 32 GB in your computer, that is the hard limit on any 32-bit OS.

In practice the usable limit drops to about 3.4 GB, because the rest is reserved for internal purposes.

So if you have to load that much data, use a 64-bit Windows and, of course, enough physical RAM. How much you will need depends on your data.

Anyway, I would try to split the data files, store them into several QVDs, and build them back together afterwards.
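Something like this, as a minimal sketch (the file name SalesData.csv and the field OrderDate are just placeholders for your own data):

// Hypothetical example: carve one slice out of the big CSV,
// store it as a QVD, then drop the table to free RAM before
// loading the next slice.
Sales2009:
LOAD *
FROM SalesData.csv (txt, utf8, embedded labels, delimiter is ',')
WHERE Year(OrderDate) = 2009;

STORE Sales2009 INTO Sales2009.qvd (qvd);
DROP TABLE Sales2009;

// Repeat with WHERE Year(OrderDate) = 2010, and so on,
// or slice by month or region instead.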


Hope this helps

Roland


Not applicable
Author

Hi kurokarl,

I tried the approach you suggested, storing the data in separate QVDs and then stacking them using CONCATENATE. It loaded the first few QVDs but still was not able to load them all.

In that regard, I wanted to ask: does loading one large file (be it CSV or QVD) take the same amount of memory (RAM) as loading smaller files one after the other?

I think it takes the same amount of memory, as I was monitoring Task Manager for the amount of RAM taken up during the load.

And if so, is there any way to overcome that, apart from adding more RAM?

Not applicable
Author

Hi again,

The truth is, you will need the memory to fit your whole model, even if only for the brief moment when the last few rows are loaded. Splitting the data was suggested as a workaround: load, for example, one year of data into one QVD, or one month per QVD, or one region, whatever fits your case.

But, as I said before, if you want to load all the data in one application, you will need the memory.
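For example, a targeted application would then load only the slice it actually needs (a sketch, reusing the hypothetical QVD names from above):

// Hypothetical: an app scoped to 2009 loads just that one QVD,
// so only this slice has to fit in RAM.
Sales:
LOAD * FROM Sales2009.qvd (qvd);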

regards

Roland

johnw
Champion III

I believe that with 32-bit Windows, the limit is 2 GB per application even if the OS can address 4 GB total. That per-process limit may be raisable to 3 GB with a boot setting (the /3GB switch in boot.ini, if I recall correctly). It doesn't matter how much physical or virtual memory you add if 2 GB is all the operating system will give your application. So your limit is probably 2 GB. While I'd expect compression to get your 5.2 GB CSV below 2 GB, perhaps the compression doesn't occur while the table is being loaded, only afterwards. I'd have thought the multiple-QVD approach would get around that, as we used a similar approach for our worst offender before we finally bit the bullet and bought 64-bit everything. But maybe it's just too much data, period.

If you're on 32-bit, you might try the setting that gives you 3 GB of addressable space. When I tried it on my own PC, though, it made the machine unbootable, and it took our techies a couple of hours to get it running again, so use it at your own risk.

If you're on 64-bit (operating system and QlikView) and you feel you're reading in only the data you need from the file, then more RAM might be the only answer.
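Before buying hardware, though, it's worth double-checking that the load really is restricted to what the app uses. A sketch, with made-up field names, of trimming an 80-column CSV down to the handful of fields and rows actually needed:

// Hypothetical: load only the needed columns and rows instead
// of all 80; fewer fields and fewer distinct values also
// compress much better in memory.
Orders:
LOAD OrderID,
     CustomerID,
     OrderDate,
     Amount
FROM BigData.csv (txt, utf8, embedded labels, delimiter is ',')
WHERE OrderDate >= MakeDate(2010);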

As always, it's hard to debug something like this remotely, so I wouldn't go and buy 64-bit everything and 32 GB of RAM just because some guy on a forum said you might need to.