Qlik Community

New to Qlik Sense

Discussion board where members can get started with Qlik Sense.

Creator

Large database

Hi,

I would like to ask how Qlik Sense Desktop performs on large databases with hundreds of millions of records. I expect it would take a long time to load the data into Qlik. How can I manage refreshing data that takes a long time? Can I schedule it somehow? Is there a difference on that point between the Desktop and Server versions?

Thanks in advance,

M.R.

1 Solution

Accepted Solutions
MVP

Re: Large database

Large databases require a little more design than small tables, but they can work well provided your PC (for Desktop) or server has sufficient RAM and processing power. Use techniques such as incremental loads to handle load times, and take care with the data model design (simple keys, remove unneeded fields, etc.).
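As a rough illustration of that data-model advice, a load script can keep only the fields the app actually uses and replace a wide composite key with a compact integer surrogate. The table, field, and path names below are made up for the example:

```
// Load only the needed fields from the source QVD
// (Sales, Region, StoreID, etc. are hypothetical names)
Sales:
LOAD
    AutoNumber(Region & '|' & StoreID) as %StoreKey,  // compact integer key
    OrderDate,
    Amount
FROM [lib://Data/Sales.qvd] (qvd);
```

Listing fields explicitly (instead of LOAD *) drops everything you do not need, which reduces the in-memory footprint considerably on tables with hundreds of millions of rows.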

Many large companies such as banks routinely report out of databases with 100s of millions of rows.

Desktop will be constrained by the amount of installable RAM. Server installations are not so constrained and can go up to more than a terabyte of RAM, but at a cost, of course.

Logic will get you from a to b. Imagination will take you everywhere. - A Einstein


4 Replies
MVP & Luminary

Re: Large database

Most data needs to be loaded from the database only once and can be stored in QVD files; new or changed data can then be loaded incrementally. Within the last two link blocks here: Advanced topics for creating a qlik datamodel you will find various examples of incremental-load approaches and of loading QVD files in optimized mode.
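As a minimal sketch of this incremental-load pattern (the table, field, and path names are assumptions for illustration, not from the thread):

```
// 1. Pull only new or changed rows from the database, based on a
//    timestamp captured at the previous reload (vLastReload is a
//    hypothetical variable)
Orders:
LOAD OrderID, CustomerID, Amount, LastModified;
SQL SELECT OrderID, CustomerID, Amount, LastModified
FROM Orders
WHERE LastModified >= '$(vLastReload)';

// 2. Append the historical rows from the stored QVD, skipping any
//    OrderID already loaded above (i.e. rows that were updated)
Concatenate (Orders)
LOAD OrderID, CustomerID, Amount, LastModified
FROM [lib://Data/Orders.qvd] (qvd)
WHERE Not Exists(OrderID);

// 3. Write the merged result back for the next run
STORE Orders INTO [lib://Data/Orders.qvd] (qvd);
```

Each reload then only hits the database for the delta, while the bulk of the history comes from the fast local QVD.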

- Marcus


Creator

Re: Large database

Thanks Jonathan, that gave me good insight into the problem.

Creator

Re: Large database

Thanks Marcus, now I can see that I can manage large data using incremental loads.