I created an application based on a large amount of data (over 18 million records). I'm experiencing very bad performance (e.g. it takes forever to render) on one of the charts when the "full accumulation" option is selected.
I tried to improve its performance by sampling records (e.g. sum(if), set analysis, multiplication). The application takes about 15 seconds to reload when I trim the total down to around 3 million records by filtering. However, when I reload the application with the full set of data (18 million records), this chart takes about 8 minutes, while the rest of the charts / tables are rendered within a few seconds.
I need to load the full data set in the application, as it is needed to generate other charts / tables. Is there any other way I can further improve the performance of this chart? Thanks.
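For reference, the sampling expressions I tried were along these lines (field names, the sample flag and the scale-up factor are simplified placeholders, not my real model):

```
// sum(if) variant – the condition is evaluated row by row at chart time
Sum(If(SampleFlag = 1, Amount)) * 6   // multiply back up to estimate the full total

// set analysis variant – the set is resolved once per chart, usually faster
Sum({<SampleFlag = {1}>} Amount) * 6
```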
Hi Lewis,
Try to do the summation at the script level (back end) as a separate table and field, then bring the new field into the chart.
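For example, something along these lines in the load script (table and field names are just placeholders for your model):

```
// Pre-aggregate at reload time so the chart does not
// have to accumulate over 18M detail rows on the fly
DailySummary:
LOAD
    OrderDate,
    Sum(Amount) AS DailyAmount
RESIDENT Transactions
GROUP BY OrderDate;
```

Then the chart can use Sum(DailyAmount), or accumulate with RangeSum(Above(...)) over the much smaller summary table, instead of accumulating the detail rows.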
Regards
Rajesh
Hi Rajesh,
Actually, I have a complex but highly flexible data model which allows data in this application to be analysed across many different dimensions. If I build a summary table based on only one or a few dimensions, it will limit the flexibility of the application. Is there any other possible way to improve the performance in this case? Thanks a lot for your suggestion 🙂
Regards,
Lewis
Hi Lewis,
Given your complex conditions, could you raise some flags in the back-end script and make use of them in the expression?
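For example (illustrative condition and field names, to be adapted to your model):

```
Transactions:
LOAD
    *,
    // Pre-compute the complex condition once at reload time
    If(Amount > 1000 and Region = 'EMEA', 1, 0) AS HighValueFlag
FROM Transactions.qvd (qvd);
```

In the chart you can then write Sum(HighValueFlag * Amount) or Sum({<HighValueFlag = {1}>} Amount), which avoids evaluating the If() row by row at chart time.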
Regards
Rajesh
Hi Rajesh,
I added a flag in my script and tried to make use of it in both the dimension and the expression, but it doesn't help much in my case. Is there anything else I should try?
Regards,
Lewis
Hi Lewis,
Can you share the expression?
How many dimensions are used in your chart? Are there any calculated dimensions?
Regards
Rajesh