Hi,
Let's imagine we have to prepare a SiB (Seeing is Believing) on a huge database (5 TB). What would be the approach to prove QlikView's value, not only in its ability to answer a business case but also in handling a huge number of tables and records? How should we size the SiB and the hardware in this case to demonstrate scalability and make the prospect confident enough?
Maybe some of your experiences can help.
Thx,
Michael.
I think you are best served to change the direction of the conversation from database size to another BI reference unit of measure. Say, the number of BI objects delivered in a single application.
The TBs in the DW include indexes, stored procedures, and other administrative overhead typical of an RDBMS. QlikView will never need to address the entire n terabytes. Focus on the size (row count) of the largest fact tables you will be addressing in the SiB, and consider their width (number of columns) as well. Together these inputs should be enough for you to size the server for the SiB. By the end of the SiB you should be able to prove out the additional BI units that QlikView can deliver over a traditional OLAP cube-based tool, and database size will become a moot point.
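To make that concrete, here is a rough back-of-envelope sketch of sizing from row count and column count. All the constants (bytes per field value, compression ratio, working-space overhead) are illustrative assumptions, not vendor figures; you would calibrate them by loading a sample of the real fact table.

```python
# Back-of-envelope RAM sizing for loading a fact table in memory.
# Every constant below is an assumption to calibrate on a sample load.

def estimate_ram_gb(rows, cols, bytes_per_cell=8, compression=0.2, overhead=2.0):
    """Estimate RAM (GB) needed to load and analyze one fact table.

    rows, cols      -- dimensions of the largest fact table
    bytes_per_cell  -- assumed average raw width of one field value
    compression     -- assumed in-memory compression ratio (0.2 = 5:1)
    overhead        -- multiplier for aggregation/session working space
    """
    raw_bytes = rows * cols * bytes_per_cell
    in_memory = raw_bytes * compression * overhead
    return in_memory / 1024**3  # bytes -> GB

# Example: a 500M-row, 20-column fact table
print(round(estimate_ram_gb(500_000_000, 20), 1))  # 29.8 (GB)
```

The point of the exercise is that even a "5 TB" warehouse usually reduces to a few tens of GB of in-memory working set once you strip indexes and load only the fact columns you actually need.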
/emb
Hi,
you should define analytics use cases that include only the level of detail (granularity) and the dimensions and facts needed for the use case (or business question). You could also slice time-wise. The goal should be to keep only the data needed for the actual discovery task in memory.
If you want to drill down/up/along/through or slice/dice (I know these are OLAP terms), you can chain to prepared, segmented (e.g. by dimension) QVW documents.
Overall, a good way to start a SiB is with a selection of the data, not a whole TB warehouse.
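As a minimal sketch of the time-wise slicing idea (illustrative data and field names, not a real load script): filter the fact rows down to the window one discovery task needs before anything goes into memory.

```python
from datetime import date

# Illustrative fact rows; field names are made up for the example.
facts = [
    {"day": date(2011, 1, 15), "region": "EU", "amount": 120.0},
    {"day": date(2011, 4, 2),  "region": "US", "amount": 80.0},
    {"day": date(2011, 5, 20), "region": "EU", "amount": 45.0},
]

# Keep only the slice needed for this task: Q2 2011.
start, end = date(2011, 4, 1), date(2011, 6, 30)
q2_slice = [row for row in facts if start <= row["day"] <= end]
print(len(q2_slice))  # 2 of the 3 rows survive the slice
```

The same WHERE-style restriction in a QlikView load script is what keeps the SiB document to the working set Ralf describes, rather than the whole warehouse.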
- Ralf
This is why I suggested the "Virtual Memory for QlikView: an ability to expand the database beyond RAM" idea: http://community.qlik.com/ideas/1839 ...