vishalgoud
Creator III

How to suggest a refresh frequency (incremental) of an app to users?

Hi Qlikers,

I'm working on a special task where I need to suggest a reload frequency for a QlikView app to users.

Earlier we tried to implement the Direct Discovery concept, since our client's requirement is to have live data on the reports, but considering its limitations we are no longer pursuing it.

So now we are looking for the nearest alternative, something like an hourly refresh. We have neither the data nor access to the DB.

I want to know how much time QlikView (on a 126 GB RAM server) will take to pull 1 crore (10 million) records from a SQL DB.

I know it is specific to the data, but I need some approximate timings so that I can suggest an incremental reload frequency.

Also, is there any difference between the time for a normal (full) extraction from SQL and an incremental load? If so, how much?



Please let me know if you have any questions.


Best Regards,

V

3 Replies
marcus_sommer

More important is how fast the DB and your network are. But an incremental approach will always be faster than a full pull from the DB.

With only 1 crore records, a well-implemented incremental solution, and a fast DB, I could imagine that refreshes of the data are possible every minute. Here you will find various sources about incremental loading and keeping QVD loads optimized with Exists():

Advanced topics for creating a qlik datamodel
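A minimal sketch of the incremental pattern Marcus describes, in QlikView load script. All table, field, and file names (Orders, OrderID, ModifiedDate, Orders.qvd) are placeholders; adjust them to your own schema. The `WHERE NOT Exists(...)` on a single field keeps the QVD load optimized:

```
// One-time initialization before the first incremental run:
// LET vLastReload = '1900-01-01 00:00:00';

// 1. Pull only new or changed rows from the database.
Orders:
SQL SELECT OrderID, CustomerID, Amount, ModifiedDate
FROM Orders
WHERE ModifiedDate >= '$(vLastReload)';

// 2. Append the historical rows from the QVD, skipping any key
//    already loaded above. A single-field Exists() keeps this
//    an optimized (fast) QVD load.
Concatenate (Orders)
LOAD OrderID, CustomerID, Amount, ModifiedDate
FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// 3. Store the merged result back for the next run and
//    remember when this reload happened.
STORE Orders INTO Orders.qvd (qvd);
LET vLastReload = Now();
```

This is why the incremental run is so much faster than a full extraction: the slow SQL pull is limited to the delta, while the bulk of the data comes from the local QVD at near disk speed.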

- Marcus

vishalgoud
Creator III
Author

Thanks a lot, Marcus, for the reply.

Just a small doubt: my current VDI has 8 GB RAM, and I extracted 3 tables totalling 12 crore records in 106 minutes.

8GB - 12 Crore records - 106 mins

64GB - 12 crore records - 13 mins

128GB - 12 crore records - 6 mins.

I'm just thinking that if I go with 64 GB or 128 GB RAM, then the above reload times would be possible. Please guide me on my approach, so that I can tell users that with 128 GB RAM an hourly refresh is possible even if the updated records are around 12 crore.

marcus_sommer

Of course you need enough RAM to handle your data. 8 GB seems a bit too small for 12 crore records, although it might work depending on the width of your records, how many distinct values each of your fields contains, and whether you drop your tables as soon as possible - without swapping data into virtual memory, which would make it quite slow.

But you can't extrapolate the run-times like that - enough RAM is enough, and more RAM beyond that won't make it any faster.

You will get shorter run-times if the DB delivers the data faster, if the network/storage speed is better, and if you have more CPU cores with a higher clock frequency. Another, probably more important, point is your scripting knowledge, especially when implementing an incremental approach.

- Marcus