
Loading millions of rows
Hi,
Our company wants to launch a new dataviz project on Qlik Sense. We want to visualize a table of about 10 million rows (a CSV file of a bit less than 1 GB) with two dozen columns (mostly floating-point numbers and short texts) on an on-premises server with 32 GB of RAM and an Intel Xeon CPU E3-1270 v6 @ 3.80 GHz.
Are there any limitations regarding the amount of data that Qlik Sense can handle?
Regards,
Ash
Accepted Solutions

There is a hard limit of 2 billion rows, so 10 million rows is absolutely OK.
However, there is also a limit related to the amount of RAM. And here you will just have to test to see how much memory you need: load your data and monitor the Qlik Sense process in the Task Manager.
My guess is that 32 GB is plenty of memory, but it depends on many factors: number of rows, number of columns, number of distinct values in your fields, complexity of calculations, etc.
Good luck!
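If it helps, here is a minimal load script sketch for such a test (the library connection, file name and delimiter are assumptions; adjust them to your CSV):

```
// Hypothetical example: load the ~10M-row CSV, then compare the
// Qlik engine's memory in Task Manager before and after the reload.
Data:
LOAD *
FROM [lib://DataFiles/big_table.csv]
(txt, utf8, embedded labels, delimiter is ',');
```

The growth in the engine's working set roughly approximates the app's in-memory footprint; leave extra headroom on top of that for chart calculations, caching and concurrent users.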


Hi Henric,
Thanks for your quick answer! We will try that out then.
Regards,

@hic My customer needs 3B records to be loaded along with Section Access. The app takes around 4 minutes to open for every user, and subsequent sheet performance is around 30 seconds. Apparently the customer needs all the data in one app, and splitting it is not an option. We are in need of some serious optimization. Any insights?

That's a lot of data...
Your best chance is to put everything in one table (no star schema) and cross your fingers. 😉
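If you go that route, the flattening could look something like this in the script (a sketch only; table and field names are invented):

```
// Hypothetical sketch: collapse a star schema into one wide table
// by left-joining each dimension table onto the fact table.
Facts:
LOAD * FROM [lib://DataFiles/facts.qvd] (qvd);

LEFT JOIN (Facts)
LOAD CustomerID, CustomerName, Region
FROM [lib://DataFiles/customers.qvd] (qvd);

LEFT JOIN (Facts)
LOAD ProductID, ProductGroup
FROM [lib://DataFiles/products.qvd] (qvd);
```

A single table usually costs some extra RAM from the repeated dimension values, but it saves the engine from resolving inter-table associations in every chart calculation, which is often the better trade-off at billions of rows.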

Interesting... Currently the data model is denormalized into multiple tables. We can try one table + the SA table 🙂
Note that we are using a 1 TB machine with 64 CPUs, and all CPUs are at 100% when a single user loads the app 🙂
We have used AutoNumber for concatenated fields that we don't display, removed all timestamps from dates, reduced distinct values as much as we could, and removed additional columns. I cannot think of any other optimizations.
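In script terms, what we did looks roughly like this (field names changed for illustration):

```
// Hypothetical illustration of the optimizations described above.
Orders:
LOAD
    // Compact integer surrogate for a concatenated key we never display
    AutoNumber(OrderID & '|' & LineNo) AS %OrderLineKey,
    // Drop the time part so the field has far fewer distinct values
    Date(Floor(OrderTimestamp)) AS OrderDate,
    Amount
FROM [lib://DataFiles/orders.qvd] (qvd);
```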
