We are currently reviewing Qlik Sense as a solution. Our current solution is Tableau, using aggregated data extracts and a direct connection to Redshift. We are not happy with Tableau's response time.
At the moment we do not have a very fast analysis engine like Presto or Spark SQL, but we plan to build that sort of environment in the future. Our data is also supposed to grow from hundreds of millions of raw rows to a few billion. At that scale, I am not sure Qlik Sense would be able to hold the data in memory unless it is aggregated. The offered solution is Direct Discovery, which allows for a Tableau-like live connection but with the associative engine still in place.
For that, we would still need to retrieve all dimensions from the DWH. Can anyone who has used Direct Discovery comment on the process? Time to load the dimensions, query latency, how robust the solution is, and whether it can be a real alternative to a Tableau live connection?
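For context on what "retrieving all dimensions" means in Direct Discovery: in the Qlik load script, fields listed under DIMENSION are loaded into memory (so the associative engine can work on them), while MEASURE and DETAIL fields stay in the source database and are queried on demand. A minimal sketch, assuming a hypothetical `SalesFact` table and connection name on Redshift:

```
// Connect to the DWH first (connection name is hypothetical)
LIB CONNECT TO 'Redshift_DWH';

DIRECT QUERY
    DIMENSION
        OrderDate,      // loaded into memory, fully associative
        Region,
        ProductID
    MEASURE
        SalesAmount,    // stays in the database, aggregated on demand
        Quantity
    DETAIL
        OrderComment    // fetched only when shown in a detail table
    FROM SalesFact;
```

Only the distinct values of the DIMENSION fields are pulled into memory, which is why dimension load time and cardinality matter for this approach.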
I have no experience with Tableau, but some with Direct Discovery (DD):
Our experience, with a fairly difficult data model (in QlikView), was to bail out and prepare a view in the database containing the dynamic data we needed (plus one dimension), which we then read as-is using DD. The update time of your document will depend on the number of objects you have, as they seem to query the view one by one.
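As a sketch of that workaround (all table, view, and field names hypothetical, and assuming the view already exists in the database): one dimension is loaded normally into memory, and DD reads the prepared view as-is rather than the original multi-table model.

```
// Ordinary in-memory load of the one dimension table
Products:
LOAD ProductID, ProductName;
SQL SELECT ProductID, ProductName FROM dwh.products;

// Direct Discovery against the pre-built view, read as-is
DIRECT QUERY
    DIMENSION
        ProductID,      // links to the in-memory Products table
        SnapshotDate
    MEASURE
        Amount,
        Units
    FROM dwh.sales_dynamic_v;
```

Flattening the model into one view sidesteps DD's limitations around joins, but it moves that modeling work into the database.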
Loading the dimensions is fast, or at least as fast as any ordinary load.
We proposed other tools once we realized we had to prepare a separate view, but the client did not want to add more tools to the toolbox, so we stuck with QlikView.