Hello dear community!
I have a question about optimizing my data model.
The main goal of the optimization was to reduce data load time and to add new periods to the fact table.
In the previous model, many "Join" statements were used to combine the fact and plan tables. As a result, during reload the server's memory usage reached 90-95% of its 160 GB.
In the new model I used "Concatenate", so that facts, plans, and other movements all live in one table.
As a result, I reduced the reload time from 2 hours to 30 minutes, and instead of 1 year of data I now have 3.
But my fact table has grown to 150,000,000 rows.
The question is: did I do this right or wrong?
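To illustrate the difference, here is a hedged sketch of the two approaches in Qlik load script. All table and field names (Facts, Plans, RecordType, etc.) are hypothetical, since the actual model isn't shown:

```
// Old approach (simplified): joining plans onto facts row by row.
// Facts:
// LOAD Date, ProductKey, Amount FROM Facts.qvd (qvd);
// Left Join (Facts)
// LOAD Date, ProductKey, PlanAmount FROM Plans.qvd (qvd);

// New approach: one concatenated table with a row-type flag,
// so facts and plans coexist without a join or a link table.
Transactions:
LOAD 'Fact' as RecordType,
     Date,
     ProductKey,
     Amount,
     null() as PlanAmount
FROM Facts.qvd (qvd);

Concatenate (Transactions)
LOAD 'Plan' as RecordType,
     Date,
     ProductKey,
     null() as Amount,
     PlanAmount
FROM Plans.qvd (qvd);
```

The RecordType flag lets front-end expressions separate facts from plans (e.g. `Sum({<RecordType={'Fact'}>} Amount)`), which is the usual price of a concatenated fact table.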
The result of working with the model is shown below.
It seems pretty good.
Another option is ApplyMap; check it out.
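For context, ApplyMap can replace a join entirely when you only need to pull one field from a small lookup table into the fact table during load. A minimal sketch, with hypothetical table and field names:

```
// Mapping table: must have exactly two columns (key, value)
ProductNameMap:
MAPPING LOAD ProductKey, ProductName
FROM Products.qvd (qvd);

Facts:
LOAD Date,
     ProductKey,
     // Third argument is the default returned when the key is not found
     ApplyMap('ProductNameMap', ProductKey, 'Unknown') as ProductName,
     Amount
FROM Facts.qvd (qvd);
```

Unlike a join, ApplyMap never duplicates fact rows when the lookup key is not unique, which makes it a safer default for single-field lookups.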
If you increase the number of years in your data model (from 1 to 3), the link table could grow considerably; it depends on which keys are inside your link table.
The thing is, in this case I dropped the link table entirely. It was used in the previous model; now I just generate the keys directly in the fact table.
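Generating a key in the fact table might look like the sketch below; the field names are hypothetical, and AutoNumber is used because it stores the composite key as a compact sequential integer rather than a long string, which matters at 150 million rows:

```
Transactions:
LOAD *,
     // AutoNumber replaces the composite string key with an integer,
     // keeping memory usage down on a wide fact table
     AutoNumber(ProductKey & '|' & RegionKey & '|' & Date) as %FactKey
FROM Transactions.qvd (qvd);
```

Note that AutoNumber values are only stable within a single reload, so such keys should not be stored in QVDs for incremental loads.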
The previous model came to me as a legacy from another developer.