I have a few questions about how people are doing statistical analysis/predictive modeling/data mining with QlikView. According to the Gartner 2010 BI Quadrant, QlikView "lacks the statistical and predictive modeling capabilities of some of its most similar competitors, ..."
My questions:
1) How often are you asked to do predictive modeling with QlikView? (whether you're an in-house developer or work for a consulting firm)
2) What is the most advanced statistical work you've done inside of QlikView?
3) Does anyone use third party statistical packages to pre-calculate before loading into QlikView? (R anyone?)
Thanks,
Gary
1) Very rarely. Consultant.
2) I've only done complex what-ifs based on user-defined drivers and historical data to predict future results, but the what-if always has a user behind it making changes according to what they see in their analysis. Also, occasionally, basic linear regression.
3) No
I agree with John's and Lars's comments. This is an interesting topic and deserves more analysis and demos to add an extra punch to QlikView. A couple of years ago I read a Gartner user survey that contained a page that surprised me. This might be a copyright violation, but it's from the 2009 Gartner report:
Predictive Modeling and Data Mining
The least used and lowest rated of any capability surveyed, with just 27% of the sample using predictive modeling and data mining, and a mean rating of 64.4 among those using them.
A bit late, but:
1) Not too often; however, if an Excel model needs to be updated regularly, it tends to be moved into QlikView. (In-house)
2) Adding a seasonally adjusted forecast line to historical price data at various probability levels (P90-P50-P10).
3) Multiple regression, using Excel's LINEST to calculate the regression parameters, which are then fed back into QlikView.
QV really needs some multiple regression capability; it would be very powerful to dynamically update regression parameters based on the current selection (a sketch of what is possible today follows).
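For what it's worth, QlikView's built-in linest_* aggregation functions cover the simple single-predictor case and do recalculate with the current selection; true multiple regression still has to be pre-computed outside (Excel LINEST, R, SAS) and loaded back in. A minimal sketch, assuming fields named Price and Period in the data model (the field names are mine for illustration, not from the post above):

// Slope, intercept and fit quality of a least-squares line of Price on Period,
// recomputed for whatever the user has currently selected:
=LinEst_M(Price, Period)    // slope
=LinEst_B(Price, Period)    // intercept
=LinEst_R2(Price, Period)   // R-squared, as a quick check of fit

// Fitted trend value per row, usable as an extra expression
// in a chart dimensioned by Period:
=LinEst_M(TOTAL Price, Period) * Period + LinEst_B(TOTAL Price, Period)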
Hi Gary,
As a statistician, I'm very frustrated by the lack of statistics tools inside QV.
2) So I just use basic descriptive statistics (bar chart distributions, box-and-whisker diagrams); sometimes I add a trend curve to a time plot chart.
3) I use SAS very often for statistical purposes (mainly regression, factor analysis or cluster analysis). The best thing I found was to use the ODBC driver for SAS and connect QV to SAS. It's very simple to set up and very useful.
Example: after a principal component analysis in SAS, I can directly read the resulting file into QV and then display the principal components in a scatter plot.
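In case it helps anyone trying the same route, here is a minimal load-script sketch of that kind of connection; the DSN name, library and column names are hypothetical placeholders rather than JJ's actual setup:

// Connect to SAS through its ODBC driver (DSN configured on the machine first)
ODBC CONNECT TO [SAS_PCA_DSN];

// Read the score dataset produced by the SAS principal component analysis
PCAScores:
LOAD CustomerID,
     Prin1,
     Prin2;
SQL SELECT CustomerID, Prin1, Prin2
FROM sasdata.pca_scores;

DISCONNECT;

Prin1 and Prin2 can then go straight onto the x and y axes of a QlikView scatter chart.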
Regards
JJ
What type of models are you looking for?
QlikView allows us to compute models in real time.
Most of the statistical models built in SPSS and other statistics and econometrics packages can be reproduced in QlikView.
The problem is that we need to write all of the formulas as QlikView expressions ourselves, and that is time-consuming.
My experience has been quite good, because once the model is computed in QlikView we can filter the data and the model recalculates the p-values for the current selection (see the sketch below).
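To illustrate what "recalculating p-values on selection" can look like, newer QlikView releases ship statistical test functions usable directly in chart expressions; a minimal sketch, assuming a two-level field Group and a numeric field Value (both names are mine for illustration):

// Two-sample t-test computed over the records in the current selection
=TTest_t(Group, Value)     // t statistic
=TTest_df(Group, Value)    // degrees of freedom
=TTest_sig(Group, Value)   // two-tailed p-value

Placed in a text object or chart, these re-evaluate every time the user changes the selection.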
Best regards
We have created some simple SPC charts. Pretty simple to do; one just needs to manually put the formulas for the median centre line and the UCL and LCL limits into the expressions (a sketch follows).
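A minimal sketch of what such expressions can look like, assuming a field Measurement plotted against a time dimension; the field name and the simple 3-standard-deviation limits are my assumptions, not necessarily the original setup:

// Centre line and control limits, computed across all rows of the chart
Median(TOTAL Measurement)                                // centre line
Avg(TOTAL Measurement) + 3 * Stdev(TOTAL Measurement)    // UCL
Avg(TOTAL Measurement) - 3 * Stdev(TOTAL Measurement)    // LCL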
There are a few things you can develop using statistics. Of course it's not a data mining tool, but there are several analyses you are able to create, such as ANOVAs, outlier detection, SPC charts and comparisons of sample means, among others. Let's hope that in the near future QlikTech develops data mining tools for QlikView... that would be real innovation.
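Outlier flagging, for instance, needs nothing more than standard chart aggregations; a sketch using an assumed field Value and a plain 3-standard-deviation rule (one value per dimension row):

// Flag rows whose value sits more than 3 standard deviations from the mean
=If(Fabs(Value - Avg(TOTAL Value)) > 3 * Stdev(TOTAL Value), 'Outlier', 'OK')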
This is obviously late, but I would look for integrative ways to apply more robust statistical techniques to any analysis with some of the existing packages out there, if indeed this is a target market of yours. As previously mentioned, there is a way to take SAS output into QV. R/SPSS/Stata/Statistica/etc. I can't really comment on (my experience is in R and SAS). In another post of mine I mention the potential merits of providing visualizations for the purpose of further analysis, not merely reporting (Tableau might be an example here). However, there are companies who have taken R and merged it with other open-source products, such as JasperSoft's, to deliver analytic output into an environment that can be consumed (even interactively) as a reporting product or an application (e.g., an R script called from Java to produce, say, a flexible set of visualizations).
I've been lurking on the forum looking at the treatment of time series analysis, but it appears many users may be more interested in sticking to basic trending, and it doesn't appear that QV has the appropriate tools to go much beyond this. Interestingly, as opposed to financial risk analytics groups, most financial (accounting-type) groups appear to gamble on very linear and simplified methods of making future predictions (of course the problem always comes when one cannot explain the degree of uncertainty around the results). This is quite similar to those who would use IBM Cognos, whose 'forecasting functionality' really doesn't belong in the space of robust analytics.
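To be concrete about what "basic trending" usually means in QV terms, it is typically little more than a moving average drawn down a chart; a sketch, assuming a Sales measure over a monthly time dimension (the names are mine):

// Trailing 12-period moving average, evaluated row by row in a chart
// sorted by the time dimension:
=RangeAvg(Above(Sum(Sales), 0, 12))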
A lot of the movement in business analytics toward robust models is sadly slow to be adopted, particularly depending on your target business. This is true for a number of reasons, which include time to initial model development, the statistical expertise of the analyst, the research questions that various businesses feel are sufficient to get information on, and, really, the general business gestalt of the role the analyst should play in helping develop and refine the right questions before attempting to analyze them in the first place. These are truly fundamental issues that you really need to know up front before venturing into the data mining and statistical modeling space.
Just my $.02.
Phillip