Hi all, we need a suggestion on how to optimize our system.
The goal is to increase the temporal depth of our reports without adding RAM or losing performance.
At the moment the data in our .QVD files is limited to 380 days, meaning we can only generate reports going back about a year.
We'd like to aggregate the data (e.g. generate some new .QVD files) in which we store the "transformed" data instead of the "raw" data.
To do this, we thought of creating a new ETL process that, instead of reading from the DB, reads the oldest data from the QVD,
transforms it, and stores it in a new "historical QVD"; as a last step, it deletes that old data from the source .QVD.
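A minimal sketch of that rollup step in Qlik load script, assuming a hypothetical source file Sales.qvd with fields OrderDate, Product and Amount (all names and paths are illustrative, not your actual model):

```
// Hypothetical names: Sales.qvd holds the detail rows; Sales_Hist.qvd holds the aggregates.

// 1) Load only the rows older than the 380-day window from the source QVD.
//    (The WHERE clause forces a non-optimized QVD load, which is expected here.)
OldData:
LOAD OrderDate, Product, Amount
FROM [lib://Data/Sales.qvd] (qvd)
WHERE OrderDate < Today() - 380;

// 2) Aggregate to the grain you want to keep for history (e.g. month x product).
Historical:
LOAD MonthStart(OrderDate) AS Month,
     Product,
     Sum(Amount)           AS Amount
RESIDENT OldData
GROUP BY MonthStart(OrderDate), Product;

// 3) STORE overwrites the target file, so first concatenate any existing history.
IF Not IsNull(FileSize('lib://Data/Sales_Hist.qvd')) THEN
    CONCATENATE (Historical)
    LOAD * FROM [lib://Data/Sales_Hist.qvd] (qvd);
END IF

STORE Historical INTO [lib://Data/Sales_Hist.qvd] (qvd);

// 4) Rewrite the source QVD keeping only the recent window (this drops the old detail rows).
Recent:
LOAD * FROM [lib://Data/Sales.qvd] (qvd)
WHERE OrderDate >= Today() - 380;

STORE Recent INTO [lib://Data/Sales.qvd] (qvd);

DROP TABLES OldData, Historical, Recent;
```

One design note: keeping the rewrite of the source QVD as the very last step (after the historical STORE has succeeded) means a failed run never loses detail data, only leaves some rows not yet archived.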
We know that this way we'll no longer be able to generate detail-level reports for those old periods, but it should allow us to increase the depth of our reports.
We'd like to know whether, in your opinion, generating these "partial/aggregate" .QVDs will let us reach our goal.