Anonymous
Not applicable

Large amount of data (120 million rows) ... anyone got experience?

Most reports we build contain a couple of tables, with some tables reaching 12 million rows.

But now we have a request to build reports on 120 million rows with up to 275 columns ...

Has anyone got experience with large amounts of data in QlikView? How does QlikView handle large amounts (loading time, user experience)?

rgrds Anita

14 Replies
Colin-Albert

Hi Gerhard,

Are you using an incremental load?

Once you know that an employee was employed yesterday (or on any other day in the past), you should not need to recalculate the employment data for that employee and day again. You just need to append the records for the new dates to the existing QVD data.

This should improve the reload performance.
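A minimal incremental-load sketch in QlikView script, along the lines Colin describes. The table, field, and QVD names (Employment, EmployeeID, EmploymentDate, Employment.qvd) and the variable vLastReloadDate are hypothetical placeholders; adjust them to your own data model.

```
// 1. Load only the NEW/CHANGED rows from the source database.
//    Build a composite key so updated rows can be detected.
NewData:
LOAD EmployeeID & '|' & EmploymentDate AS %Key,
     EmployeeID, EmploymentDate, Status;
SQL SELECT EmployeeID, EmploymentDate, Status
FROM dbo.Employment
WHERE EmploymentDate >= '$(vLastReloadDate)';

// 2. Append the historical rows already stored in the QVD,
//    skipping any key that was just reloaded from the source.
Concatenate (NewData)
LOAD EmployeeID & '|' & EmploymentDate AS %Key,
     EmployeeID, EmploymentDate, Status
FROM Employment.qvd (qvd)
WHERE NOT Exists(%Key, EmployeeID & '|' & EmploymentDate);

// 3. Write the combined table back for the next reload cycle.
STORE NewData INTO Employment.qvd (qvd);
```

With this pattern only the delta is fetched from the database each run; the bulk of the data comes from the QVD, which loads much faster.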

gerhard_jakubec
Contributor III

Hi Colin,

Good point indeed. Until now I have never used incremental load. I'll make myself familiar with incremental loads to evaluate whether they can be applied to my data model / dataset. I need to find a reasonably reliable criterion to identify records to be added, updated, or deleted in my QlikView data model...

Thanks for your engagement!

Not applicable
Author

Hi Selvakumar,

Were all the rows distinct? Because it fails for me. Our machine has 24 cores and 256 GB of RAM.

Regards,

Sagar Gupta

Not applicable
Author

Since the Qlik engine (QIX) has a limitation of 2 billion unique records, I would suggest using the "On Demand App Generation" (ODAG) extension: https://help.qlik.com/en-US/sense-developer/3.0/pdf/User%20Guide%20%E2%80%93%20OnDemandApplicationGe...

omerfaruk
Creator