Qlik Community

QlikView App Development

Discussion Board for collaboration related to QlikView App Development.

Re: Large amount of data (120 million rows) ... anyone got experience?

Hi Gerhard,

Are you using an incremental load?

Once you know that an employee was employed yesterday (or on any other day in the past), you should not need to recalculate that employee's employment data for that day again. You only need to append the rows for the new dates to the existing QVD data.

This should improve the reload performance.
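A minimal sketch of what such an incremental load could look like in QlikView script. The table name, field names, and QVD file name here are illustrative assumptions, not Gerhard's actual model:

```
// Incremental load sketch (assumed names: Employment table with
// EmployeeID, EmploymentDate, Status fields; Employment.qvd file).

// 1. Find the most recent date already stored in the QVD.
MaxDate:
LOAD Max(EmploymentDate) AS MaxEmploymentDate
FROM Employment.qvd (qvd);

LET vLastDate = Peek('MaxEmploymentDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// 2. Pull only the new rows from the source database.
Employment:
LOAD EmployeeID, EmploymentDate, Status;
SQL SELECT EmployeeID, EmploymentDate, Status
FROM Employment
WHERE EmploymentDate > '$(vLastDate)';

// 3. Append the historical rows already stored in the QVD.
Concatenate (Employment)
LOAD EmployeeID, EmploymentDate, Status
FROM Employment.qvd (qvd);

// 4. Overwrite the QVD with the combined result.
STORE Employment INTO Employment.qvd (qvd);
```

Handling updates and deletes needs extra logic (e.g. a WHERE NOT Exists() clause on a key field), but for append-only history data like daily employment records, this pattern alone cuts the reload down to the new rows.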

gerhard_jakubec
New Contributor III

Re: Large amount of data (120 million rows) ... anyone got experience?

Hi Colin,

Good point indeed, until now I have never used incremental loads. I'll try to familiarize myself with them to evaluate whether they can be applied to my data model / dataset. I mean I have to find a somewhat reliable criterion to identify records to be added/updated/deleted in my QV data model...

Thanks for your engagement!

Not applicable

Re: Large amount of data (120 million rows) ... anyone got experience?

Hi Selvakumar,

Were all the rows distinct? Because it fails for me, and our machine spec is 24 cores and 256 GB RAM.

Regards,

Sagar Gupta

Not applicable

Re: Large amount of data (120 million rows) ... anyone got experience?

Since the Qlik Engine (QIX) has a limitation of 2 billion unique records, I would suggest using the "On Demand App Generation" (ODAG) extension: https://help.qlik.com/en-US/sense-developer/3.0/pdf/User%20Guide%20%E2%80%93%20OnDemandApplicationGe...

omerfaruk
Contributor

Re: Large amount of data (120 million rows) ... anyone got experience?