Hi,
How does QlikView perform with a huge data set and many computed columns?
If you have a huge data set, the data should be optimised and a proper data model created (a star schema, or possibly a combination with a snowflake schema). It sounds like an interview question, though.
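As a rough illustration of the star-schema approach in a QlikView load script: tables associate automatically on shared field names, so a fact table and its dimension tables only need consistent key names. The table and file names below are hypothetical.

```
// Fact table (hypothetical QVD source)
Orders:
LOAD OrderID,
     CustomerID,   // key to Customers dimension
     ProductID,    // key to Products dimension
     OrderDate,
     Quantity,
     UnitPrice
FROM Orders.qvd (qvd);

// Dimension table - associates to Orders on CustomerID
Customers:
LOAD CustomerID,
     CustomerName,
     Region
FROM Customers.qvd (qvd);
```

Keeping one central fact table with single-hop dimension tables like this generally gives better performance and simpler expressions than a heavily normalised (snowflaked) model.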
Thanks, No it isn't interview question. Actually we are moving from Hyperion to Qlikview or some other tool so I wanted to know what type performance can be expected if there are many computation columns..
QlikView does not have computed (calculated) columns as in a database. It does have fairly comprehensive ETL capabilities, during which you can derive and transform fields as needed, plus a front end that supports arbitrarily complex expressions.
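A sketch of what "deriving fields during the load" looks like, as a substitute for computed columns: the heavy work is done once at reload time rather than on every chart refresh. Field and file names are illustrative.

```
Orders:
LOAD OrderID,
     OrderDate,
     Year(OrderDate)                      AS OrderYear,   // pre-split date parts
     Month(OrderDate)                     AS OrderMonth,
     Quantity * UnitPrice                 AS LineAmount,  // precomputed measure
     If(Quantity * UnitPrice > 1000,
        'Large', 'Small')                 AS OrderSize    // precomputed bucket
FROM Orders.qvd (qvd);
```

Front-end expressions then reduce to simple aggregations such as `Sum(LineAmount)`, which stay responsive even on large tables.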
For a large data set, it is important to spend some time up front determining what sort of analysis will be performed and which fields to derive in order to aid and speed up calculation in the front end. The front end should respond rapidly to user input, and this is achievable with large data sets (30-40 million rows in the fact table is the limit of my experience).
You will need one or more servers with sufficient RAM to hold the data, as QlikView/Qlik Sense are in-memory tools (i.e. the entire data set is loaded into RAM), as well as sufficient processing power.
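For a back-of-the-envelope RAM estimate, you can sketch the arithmetic in script variables. The bytes-per-row figure below is purely an assumption for illustration; the real in-memory footprint depends heavily on field cardinality and QlikView's columnar compression, so measure with a representative sample before sizing hardware.

```
// Hypothetical sizing sketch - numbers are assumptions, not benchmarks
LET vRows        = 40000000;                       // fact-table rows
LET vBytesPerRow = 50;                             // assumed post-compression average
LET vEstimateGB  = vRows * vBytesPerRow / Pow(1024, 3);
TRACE Estimated in-memory size (GB): $(vEstimateGB);
```

Remember to leave headroom beyond the base data: user sessions, cached aggregations, and reload processes all consume additional RAM on top of the loaded data model.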