We have created a Qlik mashup on our social intelligence platform that tracks feeds from Twitter/Facebook and analyses them. We have a separate tab that brings in live Twitter data (assuming Twitter for now). The data is loaded into Hadoop with a lag of roughly 10 seconds. Up to this point everything works well.
The main challenge is displaying this data in the mashup/QVF. We are using Qlik Sense 3.0 for the project. The data takes at least a minute to refresh in the QVF, followed by the refresh in the mashup. I tried scheduling the app to reload more often than once a minute, but that didn't help, as the scheduler supports a minimum reload interval of 1 minute.
As I was not happy with this refresh frequency, my next approach was Qlik's Direct Discovery feature. With Direct Discovery, I tested a sample table with the following fields extracted from Twitter:
Below is the query that loads the table with the above fields, connecting through a Cloudera ODBC connection:
1 AS DISP_ID,
ID AS TWEET_ID,
I have put all the fields under MEASURE, because fields declared as DIMENSION are loaded into memory at reload time and therefore do not pick up incoming changes from Hadoop.
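For context, my Direct Discovery load looks roughly like the sketch below (the connection name, the Hive database/table, and the extra field names are illustrative assumptions; only DISP_ID and TWEET_ID appear in my actual script). I have not yet tried tuning DirectCacheSeconds, which as I understand it controls how long Direct Discovery caches MEASURE query results (the default is 1800 seconds), so that may also be relevant here:

```
LIB CONNECT TO 'Cloudera Hive ODBC';  // assumed connection name

SET DirectCacheSeconds = 10;  // shorten Direct Discovery's result cache (default 1800)

DIRECT QUERY
    DIMENSION
        DISP_ID               // DIMENSION values are loaded in memory at reload time,
                              // so new rows' dimension values appear only after a reload
    MEASURE
        TWEET_ID,
        RETWEET_COUNT,        // illustrative field names
        FAVORITE_COUNT
FROM default.tweets;          // assumed Hive database.table
```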
Even after doing this, the table view used in our mashup, sourced from the Direct Discovery script above, is not updating when Hadoop receives the latest records pulled from Twitter. It only updates when the app is reloaded.
I am not sure where I am going wrong. Is there a way to update the table as soon as the underlying Hadoop table is updated?