tekarius
Contributor

Scalability of Large Data Sets

I was wondering how members of the Qlik Community who work with large transactional data sets (100M+ rows and 80+ columns) handle ETL and front-end performance.

Our organization is growing and will therefore be doubling, if not tripling, our transactional data sets. We have implemented, and continue to refine, incremental loading strategies across all layers of our back-end ETL (extract QVDs, transform QVDs, etc.), and we continually optimize front-end performance by making sure apps follow best practices (avoiding AGGR, adding calculation conditions to visualizations, etc.).
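For reference, our extract layer follows the usual incremental pattern, roughly like the sketch below. Connection, table, field, and QVD names are placeholders rather than our actual model, and first-run handling (when the QVD does not yet exist) is omitted:

LIB CONNECT TO 'ERP';

// 1. Find the high-water mark already stored in the QVD.
LastLoad:
LOAD Max(ModifiedDate) AS MaxModifiedDate
FROM [lib://QVDs/Transactions.qvd] (qvd);

LET vLastLoad = Peek('MaxModifiedDate', 0, 'LastLoad');
DROP TABLE LastLoad;

// 2. Extract only rows added or changed since the last run
//    (date literal formatting depends on the source database).
Transactions:
SQL SELECT *
FROM dbo.Transactions
WHERE ModifiedDate >= '$(vLastLoad)';

// 3. Merge back the history rows whose keys were not reloaded above.
Concatenate (Transactions)
LOAD *
FROM [lib://QVDs/Transactions.qvd] (qvd)
WHERE NOT Exists(TransactionID);

// 4. Replace the QVD with the merged result.
STORE Transactions INTO [lib://QVDs/Transactions.qvd] (qvd);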

Has anyone implemented other strategies to optimize performance?  We'd be happy to discuss strategy over a call.

1 Reply
dplr-rn
Partner - Master III

What you mentioned are good practices to follow.
I would say app design from a functional point of view is crucial too, i.e. identify the right audience for each app. Not everyone needs all the data all the time; senior management, for example, will only need rolled-up data.
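As a rough illustration (table, field, and QVD names are made up), the transform layer can store a rolled-up QVD that a management-facing app loads instead of the full transaction grain:

// Aggregate the detail QVD to a monthly grain in the transform layer.
MonthlySales:
LOAD
    Region,
    ProductGroup,
    MonthStart(OrderDate)   AS OrderMonth,
    Sum(SalesAmount)        AS TotalSales,
    Count(DISTINCT OrderID) AS OrderCount
FROM [lib://QVDs/Transactions.qvd] (qvd)
GROUP BY Region, ProductGroup, MonthStart(OrderDate);

STORE MonthlySales INTO [lib://QVDs/MonthlySales.qvd] (qvd);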

Also explore ODAG (On-Demand App Generation), so users only generate apps over the slice of data they actually need.
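Very roughly, the ODAG template app binds the selections made in the selection app into its load script with placeholders along these lines (connection, table, and field names are placeholders; check the Qlik help on on-demand apps for the exact binding prefixes and their numeric/optional variants):

LIB CONNECT TO 'ERP';

// $(od_YEARQUARTER) expands to the YEARQUARTER values selected in the selection app.
Transactions:
SQL SELECT *
FROM dbo.Transactions
WHERE YEAR_QUARTER IN ($(od_YEARQUARTER));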