Hi,
I would like to ask how Qlik Sense Desktop performs on large databases with hundreds of millions of records. I expect it would take a long time to load the data into Qlik. How can I manage data refreshes that take a long time? Can I schedule them somehow? Is there a difference on that point between the Desktop and Server versions?
Thanks in advance,
M.R.
Large databases require a little more design than small tables, but they can work well provided your PC (for Desktop) or server has sufficient RAM and processing power. Use techniques such as incremental loads to handle load times, and take care with the data model design (simple keys, remove unneeded fields, etc.).
Many large companies such as banks routinely report out of databases with 100s of millions of rows.
Desktop will be constrained by the amount of installable RAM. Server installations are not so constrained and can go up to more than a terabyte of RAM, but at a cost, of course.
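To illustrate the data-model advice above, here is a minimal load-script sketch. The table and field names (OrderLines, OrderID, LineNo, etc.) are made up for the example; the idea is to build one compact key and drop fields the front end does not use, which reduces the in-memory footprint:

```qlikscript
Facts:
LOAD
    OrderID,
    LineNo,
    // AutoNumber turns the composite key into a compact integer
    AutoNumber(OrderID & '|' & LineNo) as %LineKey,
    CustomerID,
    Amount
FROM [lib://Data/OrderLines.qvd] (qvd);

// Remove fields not used in the front end to shrink the RAM footprint
DROP FIELDS OrderID, LineNo FROM Facts;
```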
Most data needs to be loaded from the database only once and can be stored in QVD files; new or changed data can then be loaded incrementally. In the last two link blocks here: Advanced topics for creating a qlik datamodel you will find various examples of incremental-load approaches and of loading QVD files in optimized mode.
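A basic incremental-load pattern looks roughly like the sketch below. The table, field, and variable names (Orders, ModifiedDate, vLastReload) are hypothetical; only rows changed since the last reload come from the database, while the bulk of the history is read from the QVD:

```qlikscript
// 1. Load only new or changed rows from the database
Orders:
SQL SELECT OrderID, CustomerID, Amount, ModifiedDate
FROM Orders
WHERE ModifiedDate >= '$(vLastReload)';

// 2. Append historic rows from the QVD, skipping keys just loaded.
//    Note: a WHERE NOT EXISTS clause sacrifices the optimized QVD load;
//    only a plain WHERE EXISTS(Field) keeps the load optimized.
Concatenate (Orders)
LOAD OrderID, CustomerID, Amount, ModifiedDate
FROM [lib://Data/Orders.qvd] (qvd)
WHERE NOT Exists(OrderID);

// 3. Write the combined table back for the next reload
STORE Orders INTO [lib://Data/Orders.qvd] (qvd);
```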
- Marcus
Thanks Jonathan, it gave me a good insight into the problem.
Thanks Marcus, now I can see how to manage large data using incremental loads.