Maminuli
Contributor II

Building a Dashboard for a Large Client Database - Need Advice

We are working on creating a dashboard that involves displaying extensive client data in tabular form and enabling users to search the data by any column. Our client details table comprises a whopping 110 columns and contains more than 4.9 Crore (49 million) rows, with approximately 25,000 new rows added daily. We're currently facing several challenges:

The application takes a considerable 15 to 20 minutes to load initially.
Once the application is open, filters also take 5 to 10 minutes to load.
After a filter is selected, the remaining filters reload, which takes another 5 minutes.
Our goal is to create an application that allows users to efficiently search this massive single-table database, with no measures, calculated columns, or variables involved. We must display the data exactly as it exists in the database table.

Key points to consider:

There are no predefined filters; clients should be able to search the data based on any column in the table.
The search should encompass all data from the beginning of the dataset.
We cannot divide the data into smaller subsets based on specific criteria.
We'd greatly appreciate your advice on the best approach to tackle these issues and create an effective, responsive dashboard for our client database. Any insights or recommendations would be highly valuable!

4 Replies
Or
MVP

My advice would be to use something other than Qlik Sense if the dataset is large enough that Qlik Sense can't handle it in a reasonable amount of time and memory. Qlik apps aren't meant to be search engines on a large scale. What you're describing doesn't sound like a dashboard.

adithyarpai
Employee

ODAG (On-Demand App Generation) would be a solution worth exploring:

https://www.youtube.com/watch?v=Ft3I00-g4H4
Maminuli
Contributor II
Author

We had explored that option, but when it comes to adding more than 10 filters, the application doesn't perform smoothly.

marcus_sommer

I would try slicing the data horizontally and vertically and grouping it into various extra fields. This means not removing or excluding any information, but rather adding sensible filter fields, so that multiple (and perhaps simpler) filter selections and visualization conditions can be applied.
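As a rough illustration of that idea (a Python sketch rather than Qlik load script; the record fields `name` and `balance` and the band thresholds are hypothetical), extra low-cardinality filter fields could be derived alongside the raw data without dropping anything:

```python
# Sketch of marcus_sommer's suggestion: keep every original column, but add
# coarse helper fields so users can narrow the data down before drilling in.
# Field names and thresholds are made up for illustration.

def add_filter_fields(row):
    """Return a copy of the record enriched with extra slicing/grouping fields."""
    enriched = dict(row)
    # A coarse alphabetical bucket over the client name.
    enriched["name_initial"] = row["name"][:1].upper()
    # A numeric column grouped into a few broad bands.
    balance = row["balance"]
    if balance < 1_000:
        enriched["balance_band"] = "low"
    elif balance < 100_000:
        enriched["balance_band"] = "mid"
    else:
        enriched["balance_band"] = "high"
    return enriched

rows = [
    {"name": "acme", "balance": 500},
    {"name": "Bravo", "balance": 50_000},
]
enriched = [add_filter_fields(r) for r in rows]
```

In a Qlik app the same derivation would live in the load script, and the resulting fields would back simple list-box filters instead of free-text search over all 110 columns.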

The reasoning: no one can sensibly use a table view with more than 10-15 columns and more than a few thousand rows at a time. So users would first look at the columns/rows likely to contain the wanted data, and once that subset is identified, all of its data can be displayed. I would expect this to cut rendering effort considerably compared to a complete table, because no web server or browser will be truly performant rendering millions of cell values in HTML tables.

Further, if possible, I would also try to reduce the cardinality of the field values in those fields mainly used to filter the data. The most common example is a timestamp, which is better split into a date and a time and connected to appropriate calendar and timetable tables, instead of keeping the timestamp and filtering against its raw values. Similar approaches are possible for many other fields. In the end the dataset may grow significantly in size, yet perform better in the UI.
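The cardinality effect can be sketched in a few lines of Python (the sampling interval of one event every 7 minutes is made up for illustration): a raw timestamp field has as many distinct values as there are distinct instants, while separate date and time-of-day fields stay bounded regardless of row count:

```python
# Illustrates why splitting a timestamp into date + time reduces cardinality:
# distinct timestamps grow with the data, distinct dates and times do not.
from datetime import datetime, timedelta

start = datetime(2024, 1, 1)
# Hypothetical sample: one event every 7 minutes for roughly a year.
timestamps = [start + timedelta(minutes=7 * i) for i in range(75_000)]

distinct_timestamps = len(set(timestamps))          # one value per event
distinct_dates = len({ts.date() for ts in timestamps})   # bounded by days covered
distinct_times = len({ts.time() for ts in timestamps})   # bounded by minutes per day

print(distinct_timestamps, distinct_dates, distinct_times)
```

Here 75,000 distinct timestamps collapse into 365 distinct dates and at most 1,440 distinct minute-of-day values, which is what makes date/time filter fields so much cheaper than filtering on the raw timestamp.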