Most reports we build contain a couple of tables, with some tables reaching 12 million rows.
But now we have a request to build reports with 120 million rows and up to 275 columns ...
Has anyone got experience with large amounts of data in QlikView? How does QlikView handle such volumes (loading time, user experience)?
rgrds Anita
Hi Gerhard,
Are you using an incremental load?
Once you know that an employee was employed yesterday (or any other day in the past), you should not need to recalculate the employment data for that employee on that day again. You just need to add the data that contains the employee's data for the new dates to the existing QVD data.
This should improve the reload performance.
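For reference, an insert-only incremental load along these lines might look like the sketch below. The table, field, and file names (Employment, EmployeeID, employment_history.qvd) and the vLastReloadDate variable are assumptions for illustration, not part of the original data model:

```
// Hypothetical insert-only incremental load sketch.
// Assumes vLastReloadDate holds the date of the previous reload.

// 1. Fetch only the rows added since the last reload
NewData:
SQL SELECT EmployeeID, EmploymentDate, Status
FROM Employment
WHERE EmploymentDate >= '$(vLastReloadDate)';

// 2. Append the history already stored in the QVD
CONCATENATE (NewData)
LOAD EmployeeID, EmploymentDate, Status
FROM [employment_history.qvd] (qvd);

// 3. Write the combined table back for the next run
STORE NewData INTO [employment_history.qvd] (qvd);
```

Because only the new rows come from the database and the QVD read is optimized, this usually cuts reload time dramatically compared with a full extract.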
Hi Colin,
Good point indeed. Until now I have never used incremental loads, so I'll familiarize myself with them to evaluate whether they can be applied to my data model / dataset. I mean, I have to find a reasonably reliable criterion to identify records to be added/updated/deleted from my QV data model...
Thanks for your engagement!
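For what it's worth, a common way to handle added, updated, and deleted records is a modification-timestamp column plus a primary key. The names below (SourceTable, PrimaryKey, ModificationTime, vLastExecTime) are hypothetical placeholders, since the actual data model isn't shown:

```
// Hypothetical insert + update + delete incremental pattern.
// Assumes the source table carries a PrimaryKey and a
// ModificationTime column, and vLastExecTime stores the
// timestamp of the previous script run.

// 1. Fetch rows inserted or updated since the last reload
Data:
SQL SELECT PrimaryKey, Field1, Field2, ModificationTime
FROM SourceTable
WHERE ModificationTime >= '$(vLastExecTime)';

// 2. Append old rows from the QVD, skipping keys already loaded,
//    so updated rows are replaced by their new version
CONCATENATE (Data)
LOAD PrimaryKey, Field1, Field2, ModificationTime
FROM [data.qvd] (qvd)
WHERE NOT EXISTS (PrimaryKey);

// 3. Handle deletes: keep only keys that still exist in the source
INNER JOIN (Data)
SQL SELECT PrimaryKey FROM SourceTable;

STORE Data INTO [data.qvd] (qvd);
```

If the source has no modification timestamp, step 1 has no reliable filter, which is exactly the criterion problem mentioned above.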
Hi Selvakumar,
Were all the rows distinct? Because it fails for me. Our machine specs are 24 cores and 256 GB RAM.
Regards,
Sagar Gupta
Since the Qlik engine (QIX) has a limitation of 2 billion unique records per field, I would suggest using "On Demand App Generation" (ODAG): https://help.qlik.com/en-US/sense-developer/3.0/pdf/User%20Guide%20%E2%80%93%20OnDemandApplicationGe...
Check this blog post also:
Performance and Optimization Best Practices in QlikView & Qlik Sense