Synthetic keys in 7 tables in a ~200 MB file can cost 40 GB+ of disk to reload
Today I made a terrible mistake: I tested and ran a script developed by a colleague on our internal server. The script crashed QlikView and the server!
I noticed that the .qvw file had a lot of synthetic keys, because 7 metrics tables share the same field names (for example Year, Month, and Date, among other columns). So I started to optimize the script. I have never allowed synthetic keys in my own scripts, but
since this script seemed to have worked before, I wanted to see how far it would go with these synthetic keys this time, so I started the reload. I already knew that synthetic keys can use more memory, and the file is normally less than 200 MB. But this time it surprised me! It ate not only all 25 GB of memory but also 40 GB of disk, in less than 10 minutes! And the server did not kill it; it just hung there, with no space left to do anything on the server.
I did not expect this, as I had never allowed synthetic keys in my scripts before. All I knew was that synthetic keys can eat memory and make the reload slower. Is what I encountered this time the real evil side of synthetic keys? Can they cost not only memory but also disk?
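For reference, synthetic keys like the ones described here (several tables sharing Year, Month, Date) are usually broken by building one explicit composite key per table instead of letting QlikView link on every shared field. A minimal load-script sketch, with hypothetical table, field, and file names:

```qlik
// Hypothetical example: two metric tables plus a calendar table.
// Without the composite key, Year/Month/Date appearing in all three
// tables would create a synthetic key.

Sales:
LOAD
    Year & '-' & Month & '-' & Date AS %DateKey,  // single explicit link field
    SalesAmount
FROM sales.qvd (qvd);

Costs:
LOAD
    Year & '-' & Month & '-' & Date AS %DateKey,
    CostAmount
FROM costs.qvd (qvd);

// Keep Year/Month/Date only in one dimension table.
Calendar:
LOAD
    Year & '-' & Month & '-' & Date AS %DateKey,
    Year, Month, Date
FROM calendar.qvd (qvd);
```

Alternatively, QlikView's QUALIFY statement can prefix field names per table so they no longer auto-associate; the composite-key approach above keeps the tables linked through a single field.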
Re: Synthetic keys in 7 tables in a ~200 MB file can cost 40 GB+ of disk to reload
Hi, this is an old post, but was the 40 GB+ of disk used by a QVD file, or did you just happen to see that the disk had 40 GB less free space? If it is the latter, it might be caused by the page file: after the server runs out of RAM (25 GB?), it starts swapping to the page file and uses up disk space.