Not applicable

How much data can a QlikView tool handle? What would be the effect on its speed?

Hey People,

I'm new to QlikView. I have 10 GB of data for which I have to build a tool in QlikView. Can QlikView handle this much data?

How much would the speed of the tool be affected?

11 Replies
rbecher
MVP

Hi,

It depends on many things:

- hardware (RAM, cores)

- data model (one table, many tables, associations)

- cardinality and sparsity of the data (esp. for compression ratio)

- record count

- number of UI elements (listboxes, charts, etc.)

- number of users

The amount of raw data alone is not a real indication. That said, I think 10 GB of raw data is not particularly large.

- Ralf

Astrato.io Head of R&D
mountaindude
Partner Ambassador

We've been using tens of GB of reasonably complex data without too much trouble. The main issue is having enough RAM in the server, as each additional user of a large QV application needs a fair amount of RAM for him/herself. I recall roughly 10% of the application's size per user being mentioned somewhere; i.e., for an application using 30 GB of RAM, each user will need another 3 GB (assuming that 10% figure is correct).

But there is a lot of performance to be gained by designing the data model and application properly. Doing set analysis over lots of rows in a chart is a no-no from a performance point of view. It is much preferred (if possible, given the problem at hand) to do the aggregation, or whatever needs to be done, in the load script.
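A pre-aggregation along those lines might look like this in the load script (table, field, and file names here are purely illustrative, not from the thread):

```qlikview
// Aggregate in the load script instead of summing millions of rows
// in a chart expression. All names are hypothetical examples.
SalesByMonth:
LOAD
    Month,
    Region,
    Sum(Amount)    AS TotalAmount,
    Count(OrderId) AS OrderCount
FROM [Sales.qvd] (qvd)
GROUP BY Month, Region;
```

A chart can then use a simple Sum(TotalAmount) over the small aggregated table instead of scanning every source row at click time.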

Please mark the post as a solution if it provided you with a solution to the topic at hand. Thanks!
rbecher
MVP

BTW, we're processing 250 GB of raw data every day in one project.

mountaindude
Partner Ambassador

But you're not juggling 250 GB around in RAM, I assume...?

Agreed, QV is pretty effective at quickly ingesting large data volumes, at least if you avoid reading via ODBC (which in my experience is far too slow for larger data volumes).

CSV files and similar are, on the other hand, quick to read.
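For reference, a plain CSV load in the script looks like this (the file path and format options are illustrative assumptions):

```qlikview
// Reading a delimited text file is typically much faster than ODBC.
Sales:
LOAD *
FROM [C:\Data\Sales.csv]
(txt, utf8, embedded labels, delimiter is ',');
```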

/Göran

Not applicable
Author

Well, Goran, thanks for the reply. I started importing 1.8 GB of data and couldn't finish.

The data is being fetched, but the reload process runs for a very long time without producing any result.

And yes, I have 4 GB of RAM!

Help!

rbecher
MVP

Unfortunately we have only 144 GB of RAM, but I've heard of a project using 1 TB of RAM!

rbecher
MVP

Vanaika,

are you running QlikView in 32-bit? If so, it can only use 2 GB per application. I presume it would work in 64-bit.

Are you loading just one table? I think we need more information to help you.

- Ralf

mountaindude
Partner Ambassador

Vanaika: You could also try loading a subset of the data, to make sure the load script itself works as expected.

You could do this either by creating a smaller input file to read from, or by using the built-in debugger's limited-load feature (which loads at most n lines per load statement).
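Besides the debugger's limit setting, the same effect can be had directly in the script with the First prefix (the row count and file name below are illustrative):

```qlikview
// Load only the first 10,000 rows while developing the script.
First 10000
LOAD *
FROM [BigFile.csv] (txt, embedded labels, delimiter is ',');
```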

Inserting trace statements in the load script also helps pinpoint where the script hangs or has issues.
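A trace statement simply writes its text to the script execution progress window and the log, so placing one before and after each load shows how far the script got. A minimal sketch (table and file names are assumptions):

```qlikview
TRACE Loading the Orders table...;

Orders:
LOAD * FROM [Orders.csv] (txt, embedded labels, delimiter is ',');

// Store the row count in a variable so it can be dollar-expanded.
LET vRows = NoOfRows('Orders');
TRACE Orders loaded: $(vRows) rows;
```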

Ralf: 144 GB isn't too shabby, though. Given the way RAM prices are developing, you'll soon be crossing into the next order of magnitude... \o/

/Göran

Not applicable
Author

Hey Goran,

Could you please tell me how and where to insert trace statements in the load script?