Hi All,
I have one large fact table with 150 fields and around 20 million rows. In my data model I am going to use only this large fact table. I have read that a single-table data model is best for front-end performance. I would like to know what the performance impact could be in this case. Of the 150 fields, more than 100 (approx. 110) are numeric, and all of these 110 fields are used in front-end calculations (not all at the same time, but across different calculations).
It would depend on your expectations and on your hardware performance.
Some expressions could be improved using flags or other techniques.
If you are more specific, we can help further.
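The flag technique mentioned above is usually done in the load script, so front-end expressions become a plain sum over a 0/1 field instead of a row-by-row condition. A minimal sketch - field names (Sales, OrderDate, IsCurrentYearFlag) are illustrative, not from the original post:

```
// Hypothetical load script: precompute a 0/1 flag once at reload time.
Fact:
LOAD
    *,
    If(Year(OrderDate) = Year(Today()), 1, 0) as IsCurrentYearFlag
FROM Fact.qvd (qvd);

// Front end: the flag keeps the aggregation simple and fast:
//   Sum(Sales * IsCurrentYearFlag)
// instead of the slower row-by-row condition:
//   Sum(If(Year(OrderDate) = Year(Today()), Sales))
```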
It also depends on how many unique field values the expressions are aggregating over.
It's aggregating over 10 to 12 dimensions.
My front-end expressions are simple; most of the fields are just flags.
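On the unique-values point above: a common way to cut cardinality in the script is to split high-precision fields into coarser parts, which shrinks the symbol tables and speeds up aggregation. A rough sketch under assumed field names (EventTimestamp, Amount):

```
// Hypothetical example: a full timestamp can have millions of distinct
// values; splitting it into date and time parts reduces them drastically.
Fact:
LOAD
    Date(Floor(EventTimestamp)) as EventDate,   // far fewer unique values
    Time(Frac(EventTimestamp))  as EventTime,
    Round(Amount, 0.01)         as Amount       // trim floating-point noise
FROM Fact.qvd (qvd);
```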
As the use of numeric fields in the front end increases, the number of aggregations over those fields increases as well, which takes more time to compute.
You can use the Direct Discovery method to reduce physical memory (RAM) consumption.
-Ashok.
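With Direct Discovery, dimension values are loaded into memory while measure data stays in the source database and is aggregated there on demand. A rough sketch - connection string, table, and field names are illustrative only:

```
// Hypothetical sketch: connect to the source database first.
ODBC CONNECT TO 'MyDSN';

DIRECT QUERY
    DIMENSION OrderDate, Region, ProductID   // held in memory
    MEASURE   SalesAmount, Quantity          // aggregated in the source DB
FROM Fact;
```

Note the trade-off: RAM usage drops, but chart response time then depends on the database, so it mainly helps when the table is too large to fit in memory.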
Like Clever mentioned, you need to elaborate in more detail (screenshots from the table viewer and the main objects, plus the main parts of the script and expressions) to get helpful answers. Maybe there are a few flag fields too many - of course flags are very useful to simplify expressions and speed them up, but it's overkill to create a flag for each expression, and that could easily result in the opposite effect.
- Marcus