I have a promising interview lined up, and I wanted to seek some expertise on this.
What I thought it was: for BI tools that use this mechanism, the data is loaded into the machine's RAM rather than read from the hard disk on every query. This makes performance significantly better than traditional BI tools, which read from disk while processing the data, resulting in slower performance.
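To make my own mental model concrete, here is a toy sketch (my illustration only, nothing to do with Qlik's actual engine): a "disk-based" tool re-reads the data from disk for every query, while an "in-memory" tool loads it once into RAM and answers every subsequent query from there.

```python
import csv
import os
import tempfile

# Write a small dataset to disk.
path = os.path.join(tempfile.mkdtemp(), "sales.csv")
with open(path, "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["region", "amount"])
    w.writerows([["EU", "100"], ["US", "200"], ["EU", "50"]])

def disk_query(region):
    # Disk-based style: hit the file on every single query.
    with open(path, newline="") as f:
        return sum(float(r["amount"])
                   for r in csv.DictReader(f) if r["region"] == region)

# In-memory style: load the table once, then every query is a pure RAM scan.
with open(path, newline="") as f:
    table = list(csv.DictReader(f))

def mem_query(region):
    return sum(float(r["amount"]) for r in table if r["region"] == region)

print(disk_query("EU"), mem_query("EU"))  # both 150.0
```

Both return the same answer; the difference is purely where each query's reads come from.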
But after reading HIC's article: "If you instead have all the data in memory and calculate all expressions on-the-fly, you get much greater flexibility. You can allow a much greater range of formulae and you can let a power user with no knowledge of the original database create charts with complex KPIs."
Now I think it is: in-memory tools such as QV let you perform calculations on the fly, specifically set analysis and aggr() (these need no scripting in the back end and can be done entirely on the front end), as opposed to traditional BI tools, where you are limited to calculations done on the back end. Hence in-memory is more flexible.
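The flexibility point, as I understand it, can be sketched like this (again my own illustration, loosely analogous to set analysis, not Qlik's implementation): because all the raw rows sit in memory, a front-end user can invent a brand-new KPI by filtering and aggregating them on the fly, instead of being limited to whatever aggregates the back-end script precomputed.

```python
from dataclasses import dataclass

@dataclass
class Sale:
    region: str
    year: int
    amount: float

# All raw records held in memory (the "in-memory" part).
sales = [
    Sale("EU", 2023, 100.0),
    Sale("EU", 2024, 150.0),
    Sale("US", 2023, 200.0),
    Sale("US", 2024, 250.0),
]

def kpi(rows, predicate):
    """Ad-hoc KPI: sum the measure over a row set selected on the fly,
    loosely analogous to a set-analysis expression."""
    return sum(r.amount for r in rows if predicate(r))

# A "power user" defines new KPIs without touching any back-end script:
eu_2024 = kpi(sales, lambda r: r.region == "EU" and r.year == 2024)
growth = (kpi(sales, lambda r: r.year == 2024)
          - kpi(sales, lambda r: r.year == 2023))
print(eu_2024, growth)  # 150.0 100.0
```

In a purely back-end model, `growth` would only exist if someone had scripted it in advance; here it is composed at query time from the in-memory rows.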
So is it the speed or the flexibility that defines the "in-memory-ness"? Any comments, questions, or thoughts would be much appreciated.