LVDC_Ash
Contributor II

Loading millions of rows

Hi,

Our company wants to launch a new data visualization project on Qlik Sense. We want to visualize a table of about 10 million rows (a CSV file of a bit under 1 GB) with two dozen columns (mostly floating-point numbers and short text) on an on-premises server with 32 GB of RAM and an Intel Xeon E3-1270 v6 CPU @ 3.80 GHz.

[Screenshot attached: LVDC_Ash_0-1650877418584.png]

Are there any limitations on the amount of data that Qlik Sense can handle?

Regards,

Ash


Accepted Solutions
hic
Former Employee

There is a hard limit of 2 billion rows, so 10 million rows is absolutely OK.

However, there is also a limit related to the amount of RAM. And here you will just have to test to see how much memory you need: Load your data and monitor the Qlik Sense process in the task manager.

My guess is that 32 GB is plenty of memory, but it depends on many factors: Number of rows, number of columns, number of distinct values in your fields, complexity of calculations, etc.
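To see why distinct values matter as much as row count: Qlik's engine stores each distinct field value only once in a symbol table, and each row then holds just a compact pointer into that table. A rough back-of-envelope sketch in Python (the formula and sizes here are illustrative assumptions, not Qlik's actual internal format):

```python
import math

def estimate_field_bytes(n_rows, n_distinct, avg_value_bytes):
    """Rough footprint of one field in a symbol-table columnar engine:
    each distinct value stored once, plus a bit-packed index per row
    (ceil(log2(n_distinct)) bits)."""
    symbol_table = n_distinct * avg_value_bytes
    bits_per_row = max(1, math.ceil(math.log2(max(2, n_distinct))))
    row_indexes = n_rows * bits_per_row / 8
    return symbol_table + row_indexes

# 10 million rows; assume a column with 1 million distinct 8-byte values:
per_field = estimate_field_bytes(10_000_000, 1_000_000, 8)

# ~24 columns of similar shape:
total_gb = 24 * per_field / 1e9  # ≈ 0.8 GB, far below 32 GB
```

Real usage is higher (indexes, state, cache, calculation working memory), but the scaling intuition holds: fewer distinct values per field means a smaller app.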

Good luck!


5 Replies

LVDC_Ash
Contributor II
Author

Hi Henric,

Thanks for your quick answer! We will try that out, then.

Regards,

mborsadw
Partner - Creator

@hic My customer needs 3B records loaded along with Section Access. The app takes around 4 minutes to open for every user, and subsequent sheet performance is around 30 seconds. Apparently the customer needs all the data in one app, and splitting is not an option. We are in need of some serious optimization. Any insights?

hic
Former Employee

That's a lot of data...

Your best chance is to put everything in one table (no star schema) and cross your fingers. 😉

mborsadw
Partner - Creator

Interesting... currently the data model is de-normalized into multiple tables. We can try one table + SA table 🙂

Note that we are using a 1 TB machine with 64 CPUs, and all CPUs are at 100% when a single user loads the app 🙂

We have used AutoNumber for concatenated fields that we don't display. We have removed all timestamps from dates, reduced distinct values as much as we could, and removed additional columns. I cannot think of any other optimizations.
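Those last two optimizations both attack the same lever: distinct-value counts in the symbol table. A small Python illustration on synthetic data (in an actual Qlik script this would be `Floor()`/`Date()` on timestamps and the `AutoNumber()` function; the data below is made up):

```python
from datetime import datetime, timedelta

# Synthetic data: one timestamp per minute over 30 days.
start = datetime(2022, 1, 1)
stamps = [start + timedelta(minutes=i) for i in range(30 * 24 * 60)]

# Dropping the time part collapses 43,200 distinct values down to 30,
# which is why removing timestamps from dates shrinks the app.
dates = [t.date() for t in stamps]
print(len(set(stamps)), len(set(dates)))  # 43200 30

def autonumber(values):
    """AutoNumber-style trick: replace wide composite keys with small
    sequential integers, so the symbol table holds ints, not long strings."""
    mapping = {}
    return [mapping.setdefault(v, len(mapping) + 1) for v in values]

# Hypothetical composite key: date + one of three regions.
keys = [f"{d}|region-{i % 3}" for i, d in enumerate(dates)]
compact = autonumber(keys)  # 90 distinct small ints instead of 90 strings
```

The same idea generalizes: any field you only use for linking, never for display, is a candidate for autonumbering.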