Dear All,
We are working on an HR dashboard where we need each employee's data on a daily basis, so we are linking the employee data to the master calendar table.
The main issue is that we are duplicating the data, and the dashboard now includes more than 100 million records (0.5 GB).
When we try to access the dashboard, it takes too long to open the KPIs and charts.
Any idea what the main issue could be, or what we should check to improve the performance of our dashboard?
Thanks for your support.
Hi Elie,
Are you using Qlik Sense or Qlik Sense Desktop?
Best regards
Andy
Sense
Could you expand on what you mean when you say you are duplicating the data?
It would be easier to help if you attach more information.
Your data model or script would help.
Hi Elie,
Can you provide your data model and possibly the specs of the machine you're running Qlik Sense on?
Regards
Chris
To simplify the case:
We have one record per employee, covering the period from a start date to an end date.
What we are trying to do is create transactions by date, one record for each date from the start date to the end date for each employee. That way we have multiple records per employee, which we can link to our master calendar table and get the employee data on a daily basis.
I hope this clarifies the case.
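(For illustration only: in a Qlik load script this kind of per-day expansion is usually done with an iterating LOAD using a WHILE clause. The table and field names below are assumptions, not taken from the attached script.)

    // Hypothetical sketch: expand one interval row per employee into one row per day.
    // Assumed source table "Employees" with fields EmployeeID, StartDate, EndDate.
    EmployeeDays:
    LOAD
        EmployeeID,
        Date(StartDate + IterNo() - 1) AS CalendarDate
    RESIDENT Employees
    WHILE StartDate + IterNo() - 1 <= EndDate;

Note that the expanded table only needs the key fields and the date; if every other employee attribute is repeated on each daily row, the row count and app size grow much faster than necessary. An alternative worth considering is IntervalMatch, which links the calendar dates to the start/end intervals through a separate link table and leaves the employee table at one row per employee.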
To quantify data volumes, can you advise how many employees and how many calendar dates are involved?
From that you should be able to derive the average number of rows per employee per date.
The reason I ask is that I was surprised when you advised you have 100+ million rows.
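(A hypothetical worked example of that arithmetic: 5,000 employees linked to a 3-year calendar of roughly 1,095 dates gives at most 5,000 × 1,095 ≈ 5.5 million rows at one row per employee per day, so reaching 100+ million rows suggests either far more employees and dates than that, or each employee-day being repeated several times.)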
QVF script attached.
Thanks for your support.
Server specs: 128 GB RAM, 4 CPU cores (around 300 users).