
QlikView Performance and RAM Usage
Hi All,
I am loading 10 tables with approximately 20 columns each; the total row count across these tables is 224,528,320.
The app file is about 1.5 GB on disk, but when I check the RAM used it is about 12 GB.
Could someone explain the reason for this? I would also like to know whether it is feasible to load 224.5 million rows into QlikView.


Hi Anushree,
You can choose what compression setting your document uses.
Please check this document.
Regards,
Miguel del Valle


From my experience you can load far more than 224.5 million rows into QlikView (I have worked on an app with over 1 billion rows), as long as you have the proper hardware available.
When I was working with that huge app, our ETL server was an Intel Xeon 16-core server with 1 TB of RAM, so it was a pretty powerful machine.
In your case I would check that all the keys in your data model are functioning as they should and that there are no loops or synthetic keys.
Check this link; it could be useful to you! http://www.johndaniel.com/index.php/how-much-memory-is-needed-for-a-great-user-experience-in-qlikvie...
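As a hypothetical sketch of the usual fix (table and field names below are made up): when two tables share more than one field name, QlikView builds a synthetic key, and combining the shared fields into a single explicit key avoids it.

```
// Orders and Shipments both contain CustomerID and OrderDate,
// which would create a synthetic key if both were loaded as-is.
Orders:
LOAD
    OrderID,
    CustomerID & '|' & OrderDate as %OrderKey,  // single explicit key
    CustomerID,
    OrderDate,
    Amount
FROM Orders.qvd (qvd);

Shipments:
LOAD
    CustomerID & '|' & OrderDate as %OrderKey,  // same combined key, links the tables
    ShipmentID,
    ShipDate
FROM Shipments.qvd (qvd);
```

Since only %OrderKey is shared between the two tables, the association is a single clean key instead of a synthetic one.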

The number of rows is nothing spectacular; you can easily load billions of rows and have a working application. The app file on disk is, most of the time and by default, compressed, so it gives you a wrong idea of the actual memory usage.
Some Qlik marketing documents say the compression ratio ranges from 1:4 to 1:10; the truth is it mostly depends on how unique the values are, their length, whether they are numeric, how many decimal places they have, etc. The complexity of the app also matters: the alternate states, variables, etc. you are using in the front-end design.
It is completely normal for Qlik to use as much memory and CPU as the computer has available, and the more users open and use the application, the more memory (and CPU peaks) the computer will experience, which is fine.
If performance is not good but the server is well sized, you will need to see whether you can remove fields you are not using, split fields (first name, last name, dates), convert values to numeric, or join tables.
But with the information you provided, one cannot say whether it is OK.
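As a hypothetical illustration of the "remove fields" and "join tables" suggestions (field, table, and file names are made up):

```
// Drop fields that are loaded but never used in any chart or expression.
DROP FIELDS InternalComment, LegacyFlag;

// Join a small lookup table into the fact table so the data model
// has one table fewer to traverse at query time.
LEFT JOIN (Facts)
LOAD
    ProductID,
    ProductCategory
FROM Products.qvd (qvd);
```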

Thanks for the reply, but in my case, when I try to open this 1.5 GB application on my system with 8 GB of RAM (so no concurrent users), the application does not seem to open on the desktop even after an hour.
This application has no synthetic keys, no alternate states, and no complex calculations.
I would also like to point out that we are showing the data at a very granular level, down to around 100 rows at the lowest granularity.
However, there is another application of 3.97 GB which seems to open within a few minutes, even when opened for the first time, and it does have certain calculations.
Could anyone explain the reason behind this?

Again, it's all about the data. Unless both applications come from the same data source and the same tables, in which case I would suspect one is corrupt, it is completely normal that what is compressed on disk uses some space while, expanded in memory, it uses much more.
And yes, with an 8 GB laptop you might not be able to open a QVW if it is big enough; even if it opens, it can render the computer practically useless.
One exercise you can do is to save both documents you mention with compression set to none and compare the differences. The size of the uncompressed QVW will be much closer to its memory usage, even if not 1:1.
Another test, on the bigger one, is to remove tables to see which one is really making the difference. It could well be a dimension table rather than the facts, for example one with a lot of customer details, which one would guess have a higher level of uniqueness (email and postal addresses, coordinates, first and last names, phone numbers, IDs, etc.).
All in all, I would not expect to be able to work with a 1.5 GB file on my laptop. We have to use bigger files and bigger QVDs (several GBs), and all development is carried out on a server; developers' laptops don't even have QlikView Desktop installed.

Thanks, I shall try setting compression to none.


Some tips to make these apps potentially more sane size-wise:
- If you use timestamps, round them to the nearest reasonable level. If you just need dates, round them to dates (don't save the time portion). If you do need the time, consider splitting them into date and time fields separately rather than one timestamp. If you can't do that, at least try rounding them to the nearest hour, minute, or second as appropriate.
- Use AutoNumber() (or AutoNumberHash128()/AutoNumberHash256()) on any key fields whose specific values you don't need; small sequential integers take far less memory than long composite strings.
- Avoid value repetition if it exists, especially in any fields that have a large number of distinct values and/or long textual values. Duplicating fields is often a good practice for easier design later, but if the app is larger than you can handle, it's probably not worth it.
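A hypothetical load-script sketch of the first two tips (field and file names are made up):

```
Facts:
LOAD
    // Keep only the date if the time portion is not needed.
    Date(Floor(EventTimestamp))               as EventDate,

    // If the time is needed, store it as a separate field,
    // rounded to the nearest minute to limit distinct values.
    Time(Round(Frac(EventTimestamp), 1/1440)) as EventTime,

    // Replace a long composite key with a small sequential integer.
    AutoNumber(CustomerID & '|' & OrderID)    as %FactKey,

    Amount
FROM Facts.qvd (qvd);
```

Splitting the timestamp this way means the date field has at most a few thousand distinct values and the time field at most 1,440, instead of one nearly unique timestamp per row.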
You can download Document Analyzer to see where all that memory is going and hopefully get some idea of how to reduce it. Of course, you'll have to be able to actually open the document to do that...
