Anonymous
Not applicable

Large database

Hi,

I would like to ask how Qlik Sense Desktop performs on large databases with hundreds of millions of records. I expect it would take a long time to load the data into Qlik. How can I manage data refreshes that take a long time? Can I schedule them somehow? Is there a difference on that point between the Desktop and Server versions?

Thanks in advance,

M.R.

1 Solution

Accepted Solutions
jonathandienst
Partner - Champion III

Large databases require a little more design than small tables, but can work well provided your PC (for Desktop) or server has sufficient RAM and processing power. Use techniques such as incremental loads to manage load times, and take care with the data model design (simple keys, remove unneeded fields, etc.).
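A minimal sketch of the incremental-load technique mentioned above, written as a Qlik load script. All table, field, and file names here are hypothetical placeholders; adapt them to your own source:

```
// Hypothetical incremental load sketch (names are placeholders).

// 1. Load the previously stored data from QVD (fast, optimized load).
Orders:
LOAD OrderID, OrderDate, Amount
FROM [lib://Data/Orders.qvd] (qvd);

// 2. Find the latest date already loaded.
MaxDate:
LOAD Max(OrderDate) AS MaxOrderDate
RESIDENT Orders;
LET vMaxDate = Peek('MaxOrderDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// 3. Fetch only new records from the database and append them.
CONCATENATE (Orders)
SELECT OrderID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate > '$(vMaxDate)';

// 4. Store the combined table back to QVD for the next run.
STORE Orders INTO [lib://Data/Orders.qvd] (qvd);
```

This way only the new rows hit the database on each reload; the bulk of the data comes from the local QVD file.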

Many large companies such as banks routinely report out of databases with 100s of millions of rows.

Desktop will be constrained by the amount of RAM that can be installed in the PC. Server installations are not so constrained and can scale to more than a terabyte of RAM, but at a cost, of course.

Logic will get you from A to B. Imagination will take you everywhere. - A. Einstein


4 Replies
marcus_sommer

Most data needs to be loaded only once from the database and can then be stored in QVD files; new or changed data can then be loaded incrementally. Within the last two link blocks here: Advanced topics for creating a qlik datamodel you will find various examples of incremental load approaches and of loading QVD files in optimized mode.
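To illustrate the "optimized mode" Marcus mentions, here is a hypothetical Qlik load script sketch (all names are placeholders). An optimized load copies data straight from the QVD with minimal processing, which is what makes QVD-based reloads so fast:

```
// Optimized: a plain field list with no transformations.
Orders:
LOAD OrderID, CustomerID, Amount
FROM [lib://Data/Orders.qvd] (qvd);

// Still optimized: a WHERE Exists() on a single field is the one
// filter that does not break optimized mode.
Customers:
LOAD *
FROM [lib://Data/Customers.qvd] (qvd)
WHERE Exists(CustomerID);

// NOT optimized: any calculation or other WHERE clause forces a
// standard (much slower) load, e.g.:
// LOAD OrderID, Amount * 1.2 AS GrossAmount FROM ... (qvd);
```

Keeping the heavy QVD loads optimized, and doing transformations once when the QVD is first created, is a common pattern for the row counts discussed in this thread.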

- Marcus

Anonymous
Not applicable
Author

Thanks Jonathan, that gave me good insight into the problem.

Anonymous
Not applicable
Author

Thanks Marcus, now I can see I can manage large data using incremental load.