Not applicable

How much data can QlikView handle?

Guys,

We are conducting a vendor review and comparing Tableau, QlikView, Spotfire and Xcelsius.
How much data can QlikView handle? According to QlikView's manual, there is no limit on the amount of data other than what the source database can supply. But in your experience, how many GB can QlikView handle on a server of a given size?

Also, please point me to any resources that compare these four tools.

Thanks,
Atlas.

22 Replies
suniljain
Master

We are already using QlikView in an environment where a single table holds more than 20 crore (200 million) records, and we have more than 20 tables of that size.

Not applicable
Author

Hi Anant_Iyer1. If you are having trouble with smaller (1.5 GB) CSV files, then I suspect it is something to do with the way you are loading the data rather than the data itself. A straight load should have no problem with 1.5 GB, thanks to the way QlikView compresses the data. But if you are doing any joins, that can drastically increase the memory requirement. We would need to see the script to better understand the issue.
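The join blow-up can be sketched in Python (a rough illustration with made-up tables, not an actual QlikView script): a join on a non-unique key multiplies rows, and memory along with them.

```python
from itertools import product

# A straight load keeps the row count fixed, but a join on a
# non-unique key multiplies rows (and memory along with them).
left = [{"key": "a", "x": i} for i in range(3)]   # 3 rows
right = [{"key": "a", "y": j} for j in range(4)]  # 4 rows, same key

joined = [{**l, **r} for l, r in product(left, right)
          if l["key"] == r["key"]]

print(len(joined))  # 3 x 4 = 12 rows from only 3 + 4 input rows
```

With real tables, a key that repeats thousands of times on both sides turns a modest load into a huge intermediate result.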

Steve

Not applicable
Author

For one of my applications I convert 36 GB of text files into 11 GB of QVD files, load those into a QVW file, add some more fields, do some joins, and end up with a 2.6 GB QVW file.

So you can get pretty hefty. But I do this on a 64-bit machine with a lot of memory; it peaks at almost 15 GB of memory at one point 😕

Not applicable
Author

"Out of Logical and/or Virtual Memory" can happen if there are too many distinct values; it happens sometimes even with 256 GB of memory. Try not to load everything, only the data that is really needed. And install the latest available hotfixes.

To answer the original question: 200 million rows is business as usual, with decent performance. Above that, performance degrades quickly if you just do "load *; select *".

You can go above 500 million rows, if you optimize the design and can afford hundreds of GB of memory and tens of CPU cores.
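The distinct-value effect comes from QlikView's columnar storage: each field is held as a symbol table of distinct values plus a compact per-row pointer into it. A toy Python model (the byte sizes are illustrative assumptions, not QlikView's actual layout) shows why cardinality, not just row count, drives memory:

```python
def field_bytes(n_rows, n_distinct, bytes_per_symbol=16):
    """Toy model of a columnar field: a symbol table of distinct
    values plus a bit-packed per-row pointer into that table.
    Sizes are illustrative assumptions, not QlikView's real layout."""
    index_bits = max(1, (n_distinct - 1).bit_length())
    symbol_table = n_distinct * bytes_per_symbol
    row_pointers = n_rows * index_bits / 8
    return symbol_table + row_pointers

rows = 200_000_000
status = field_bytes(rows, 100)      # low-cardinality field, e.g. a status code
unique_id = field_bytes(rows, rows)  # every value distinct, e.g. a row ID

print(f"low cardinality: {status / 1e9:.2f} GB")
print(f"all distinct:    {unique_id / 1e9:.2f} GB")
```

In this model the all-distinct field costs over twenty times as much as the low-cardinality one for the same row count, which is why dropping or aggregating high-cardinality fields helps so much.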

-Alex



Not applicable
Author

Hi,

Thanks a lot to all of you, especially to Alex for pointing out the distinct-values issue. I think that might be exactly the problem, because most of the columns in the data I'm trying to load have a large number of distinct values.

I had already tried aggregating the data and had no problems loading the aggregated version. I was trying to get the non-aggregated data into QlikView to see what kind of analysis I could do with it.

Thanks again.

Anant

Not applicable
Author

Hi John,

Our company is planning to buy QlikView, but I am stuck on the problem of sizing for a huge amount of data. When buying servers, should we size based on the QVW file or the QVD file? Every year we generate about 2 TB of data. Wouldn't the QVD files sit on a SAN? How do we decide how much RAM is needed, and which file size should drive that decision, given that we have more than 800 users?

Thanks

Radhika

johnw
Champion III

A very basic rule of thumb is that you get 90% compression and then need another 5-10% per concurrent user. So you could vastly oversimplify and say you need (10% * 2 TB) * (100% + 7.5% * 800) = 12 TB of RAM per year. Since Windows only supports 2 TB of RAM, what you want thus looks impossible.

It's not, though. Chances are excellent that your 2 TB per year will be split across 50 or more different applications, each with only some small fraction of the total data. Maybe your biggest application is 200 GB per year with 5 years of data, so 1 TB. You won't have 800 concurrent users of this application if you only have 800 users. Maybe at the worst time of the worst day you have 50 concurrent users. Now you're looking at (10% * 1 TB) * (100% + 7.5% * 50) = about 500 GB. So you might be OK with 500 GB - 1 TB on the server (allowing for other applications in the higher number).
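The rule of thumb above is easy to encode; here is a quick sketch (the 90% compression and 7.5% per-concurrent-user figures are the rough estimates from this post, not guarantees):

```python
def ram_needed_gb(raw_data_gb, concurrent_users,
                  compression=0.10, per_user=0.075):
    """Rule-of-thumb server RAM: compressed base footprint plus a
    per-concurrent-user working-set overhead (rough estimates only)."""
    base = raw_data_gb * compression
    return base * (1 + per_user * concurrent_users)

# Naive sizing: all 2 TB in one app, all 800 users concurrent.
print(ram_needed_gb(2048, 800) / 1024)  # ~12.2 TB -- infeasible

# Realistic sizing: biggest app is 1 TB with 50 concurrent users.
print(ram_needed_gb(1024, 50))          # ~486 GB -- doable
```

The two calls reproduce the 12 TB and roughly 500 GB figures worked out above; the gap shows why splitting data across applications matters more than the raw total.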

Taking another approach, as a very rough estimate, I think our shop has about 100 GB of uncompressed data in QlikView, we have about 300 users, and we're just starting to bump into memory limits at 12 GB of memory on the server. Bump us up to 800 users, and maybe we'd need 20 GB. So with the way our own data is split across applications, and based on the way our own users use the data, we need about 20% as much memory on the server as we have raw, uncompressed data. So if you keep 5 years of 2 TB per year, you have 10 TB of raw data, and you'd need 2 TB of memory on the server.

So it's hard to say exactly how much memory you'll need. But I think it's pretty clear that it will be a lot. Of course, if you're used to 2 TB of data per year, maybe 500 GB to 2 TB of RAM on a server isn't a lot to you.

Not applicable
Author

Hi John,

The 2 TB of data is compressed into QVD files (across 35 applications in total), each application has about 30 users, and all the QVW files together come to about 300 GB. Please let me know how I should approach sizing the server. Say each application has a 30 GB QVW file covering 12 months - how can I handle that much data? Does the QVD file also sit in RAM?

Thanks

Radhika

Not applicable
Author

Hi Atlas,

I joined this community a couple of months ago and saw your query about data compression and the data volumes each tool can handle at the application level. I wanted to share my thoughts. You are comparing QlikView vs. Tableau vs. Spotfire vs. Xcelsius; let me take Xcelsius first, since it used to be my favourite.

Xcelsius: it is just a dashboard tool that uses Excel as the source for processing data. Whenever you run the dashboard, the data has to pass through Excel. I spent many years with it and developed 1,000+ dashboards using Xcelsius with various connectors. In terms of data processing, it depends on what exactly you are showing.

Xcelsius is meant for highly aggregated numbers; you cannot take it beyond that. If you only have highly aggregated numbers, go for Xcelsius.

But Xcelsius alone cannot cover the full requirement; it needs BO or other OLAP connectors to connect to. The look and feel (user interface) is amazing, of course, since it uses Flash.

QlikView: QlikView is a different kind of tool. It uses RAM to process the data and compresses your original table by around 90%, though that depends on data quality (duplicates, joins, etc.). The good thing about the tool is that it is much faster than Xcelsius. Another good thing is that you don't need a data warehouse at all: it uses QVDs as the data source and stores all the data there.

Whenever you generate a QVW, the QVW reads the QVD to process the same data.

Cost-wise and processing-wise, go with QlikView. If you have anything big in mind, like different reporting flavours (canned, ad hoc, dashboards, drag-and-drop concepts for users), then go for SAP BO or Tableau.

I don't know much about Tableau, but look-and-feel-wise it is very good.

A good thing about Tableau is that it will suggest graphs based on your dimension and metric selections, because most BI users want good look and feel and some real intelligence (suggestions) in the tool.

The above comments are purely based on my experience. I am working on a complete end-to-end QlikView implementation and will post an update on it if you have any queries. Please excuse any typos or grammatical mistakes.

Regards

Imran Khan

mahindrasatyanm

BI Arch
