Currently I have a table with 100 million records and around 40 fields, something like below. My question is: would normalizing those tables in the database help performance, especially dashboard sheet/table performance? I am currently fine with loading the data.
Here is an example of what I am talking about:
I could move the department-level info into a separate table.
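In rough pandas terms, the split I mean would look like the sketch below. The column names (`dept_id`, `dept_name`, etc.) are made-up stand-ins for my real fields, not the actual model:

```python
# Hypothetical sketch of the proposed normalization, using pandas DataFrames
# as stand-ins for the database tables. All column names are assumptions.
import pandas as pd

fact = pd.DataFrame({
    "order_id":  [1, 2, 3, 4],
    "amount":    [100, 250, 80, 40],
    "dept_id":   [10, 20, 10, 10],
    "dept_name": ["Sales", "HR", "Sales", "Sales"],   # repeated on every row
})

# Normalize: keep only the key on the fact table and move the department
# attributes into their own table, one row per distinct department.
departments = fact[["dept_id", "dept_name"]].drop_duplicates().reset_index(drop=True)
fact_normalized = fact.drop(columns=["dept_name"])

print(len(departments))  # 2 distinct departments instead of 4 repeated values
```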
For Qlik's purposes, it would likely make a marginal difference. From a database perspective, a single large table often performs better than a normalized schema, but the specifics depend on your exact structure.
I keep reading advice like the following everywhere on the web:
"No, normalizing tables does not inherently improve Qlik performance; in fact, denormalizing is often preferred in Qlik to improve app performance, especially with large datasets."
For most datasets, the difference is marginal. If your app is very large, de-normalizing will typically be better for performance, but there are occasional situations where the opposite is true, particularly if your goal is to keep your app size down rather than to optimize on-the-fly calculation speed.
If you're interested in understanding what happens behind the scenes when Qlik stores data, have a look at https://community.qlik.com/t5/Design/Symbol-Tables-and-Bit-Stuffed-Pointers/ba-p/1475369
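The core idea from that article can be sketched in Python. This is a simplified model of the concept, not Qlik's actual implementation: each field is stored as a symbol table of distinct values plus one small per-row index, so a low-cardinality field like department costs almost nothing in the data table regardless of row count:

```python
# Simplified illustration of symbol tables + bit-stuffed pointers.
# This models the idea only; it is not Qlik's actual storage engine.
import math

def store_column(values):
    """Return (symbol_table, row_pointers) for one field."""
    symbols = []    # each distinct value stored exactly once
    index = {}      # value -> position in the symbol table
    pointers = []   # one small integer per row
    for v in values:
        if v not in index:
            index[v] = len(symbols)
            symbols.append(v)
        pointers.append(index[v])
    return symbols, pointers

def pointer_bits(symbols):
    """Bits needed per row pointer (the 'bit-stuffed' width)."""
    return max(1, math.ceil(math.log2(max(1, len(symbols)))))

# A department field over many rows has only a few distinct values...
dept = ["Sales", "HR", "Sales", "IT", "HR", "Sales"] * 1000
symbols, pointers = store_column(dept)
print(len(symbols))           # 3 distinct values stored once
print(pointer_bits(symbols))  # 2 bits per row are enough
```

This is why splitting such a field into its own table saves little memory in Qlik: the repeated text is already stored only once.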
Well, considering such a huge data set, we can conclude:
1. If you want to reduce data load time, go with a normalized table load.
2. If you want to improve dashboard, sheet, or object performance, go with a denormalized model, such as a single table or a star schema.
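A minimal pandas sketch of option 2, folding a dimension back into one wide table the way a single-table Qlik model would look (table and column names are illustrative assumptions):

```python
# Hypothetical denormalization: join the department dimension back into the
# fact table at load time, so nothing needs resolving at query time.
# All table/column names are made up for illustration.
import pandas as pd

fact = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount":   [100, 250, 80],
    "dept_id":  [10, 20, 10],
})
departments = pd.DataFrame({
    "dept_id":   [10, 20],
    "dept_name": ["Sales", "HR"],
})

# One left join up front; the result is a single wide table.
wide = fact.merge(departments, on="dept_id", how="left")

print(list(wide.columns))  # ['order_id', 'amount', 'dept_id', 'dept_name']
```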
Regards,
Prashant Sangle