I have created a similar solution.
I did a query for each date and then looped over the period (365 queries per year).
I put a small sleep of 1-2 seconds between each query, because Google doesn't allow that many queries per hour. That let me do a full load in about 6 hours.
In the following loads I substitute the last 3 days with new data, because Google is not always up to date. I use an incremental technique for that part.
I hope this helps, and have fun. I learned a lot from this case, and QVSource worked OK.
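The loop described above might look roughly like this in the load script. This is only a sketch: the feed URL, the date range, and the field names are placeholders you would replace with your own QVSource/Google Analytics feed details.

```
// Sketch only - feed URL, table name and dates are placeholders.
LET vStartDate = Num(MakeDate(2011, 1, 1));
LET vEndDate   = Num(Today());

FOR vDate = vStartDate TO vEndDate
    LET vQueryDate = Date(vDate, 'YYYY-MM-DD');

    GAData:
    LOAD '$(vQueryDate)' AS QueryDate,
         *
    FROM [http://yourfeedurl?start-date=$(vQueryDate)&end-date=$(vQueryDate)]
    (XmlSimple, Table is [feed/entry]);

    Sleep 2000;  // pause between queries so Google's hourly limit is not hit
NEXT vDate

STORE GAData INTO GAData.qvd (qvd);
```

Each pass through the loop queries a single date, and the Sleep keeps you under the rate limit; the whole result is stored in one QVD at the end.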
Are you looking for a solution to store the data into a QVD file?
So perhaps in the first load you have a feed URL something like this (probably a lot simpler than your actual feed URL):
Then the results are stored in a QVD, and each day afterwards you just load the last day's data and append it to the QVD. So the next day the feed might look like:
And so on.
The code is protected, so that is not possible. Sorry.
I use "Sleep 3000" between each query in the full load; if that is not enough, you can adjust it.
In the incremental load I don't use Sleep.
For the incremental load I use the date as the incremental separator.
When I send queries against Google, I only ask for data for one date, so you need to set a filter on start-date and end-date.
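An incremental load along those lines might be sketched like this. Again, the URL, field names, and the `GAData.qvd` file name are assumptions; the idea is just to fetch the last few days fresh, keep the older rows from the stored QVD, and write the combined table back.

```
// Sketch only - replaces the last 3 days in the QVD with fresh data.
LET vFrom = Date(Today() - 3, 'YYYY-MM-DD');
LET vTo   = Date(Today(), 'YYYY-MM-DD');

GAData:
LOAD *
FROM [http://yourfeedurl?start-date=$(vFrom)&end-date=$(vTo)]
(XmlSimple, Table is [feed/entry]);

// Keep only the rows older than the refreshed window
Concatenate (GAData)
LOAD *
FROM GAData.qvd (qvd)
WHERE QueryDate < Date#('$(vFrom)', 'YYYY-MM-DD');

STORE GAData INTO GAData.qvd (qvd);
```

The date field acts as the incremental separator here, exactly as described above: anything inside the 3-day window comes from Google, anything older comes from the QVD.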
Could you please explain how I can load data for a whole year into QV using the Sleep function? Do I need to make 365 scripts, one for each date (I don't think so)? I am not familiar with the loop function in QV, so I'm not sure how to use it. I hope you can help me further.
Hi, Henco - just providing an alternative to Henrik's (good) solution.
We use QlikView as a lightweight data import tool (like Microsoft's SSIS) as well as a reporting tool. What I mean by that is: when we pull in data (from Google Analytics or a similar API) that we'd like to keep, we insert it into a database/data warehouse as part of the QlikView script. The benefits are the speed of querying a database for lots of static historical data (instead of slowly and repeatedly querying the API for it), and that the data is then available to other applications and QlikView reports, not just the single report.
Something you might want to think about.
Hey, Chris - sure, I'll give you an example.
We're using QVSource to pull Facebook Insights data for 500+ domains of ours (and growing), and we report daily Likes, Shares, etc. for the last 12 months. Rather than refresh ALL that data out of Insights daily (which would take hours each day), after we load in new data that we want to keep, we store it in tables in a local database. We only actually query Insights for the last 2-3 days' worth of data, update the database table with those results, then use the data from the table for the actual QlikView report.
Just as you can do a SQL SELECT statement in the QV scripts, you can do a SQL INSERT (or UPDATE, DELETE, etc.).
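To illustrate the point above, a script might look something like this. The connection string and table name are made up for the example; the pattern is simply that non-SELECT statements go through the same database connection as your SELECTs.

```
// Sketch only - connection and table names are illustrative.
ODBC CONNECT TO MyWarehouse;

// Ordinary SELECT, as usual:
Insights:
SQL SELECT * FROM dbo.FacebookInsights;

// The same connection can carry an INSERT, UPDATE or DELETE,
// e.g. clearing the window you are about to re-load:
SQL DELETE FROM dbo.FacebookInsights
    WHERE SnapshotDate >= '2012-05-01';
```

After the DELETE you would load the fresh 2-3 days from the API and INSERT those rows, so the warehouse table stays current without re-querying the whole history.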
I read a lot about incremental reloads and storing the data in a QVD file, but unfortunately I can't translate it to my own document. I hope one of you can help me out.
I use QVSource to load the data into QlikView. So, in the document I have a FROM statement which contains the Google feed URL with a certain date range. What I would like to do is load the data for 2009, 2010, 2011 and 2012 up until today's date into a QVD file. After that I would like to add every new day to the QVD file. In another QlikView document I use that table to get my data from.
I added a simple screenshot of the current situation and hope one of you can tell me how to get this done.
Thanks in advance!
I can see a few queries in Google Analytics in QVSource.
BUT, how do I link them up?
For example, I can have isolated tables for visits by country, by page, and by keyword searched.
But I can't see which keyword is searched the most from which country, or which page those visitors land on.
Any tips?
Unfortunately there is no key like visitor_id or anything. My solution was to create a concatenated key to get as close to an individual visitor as possible. I combined date, hour, latitude and longitude to create a key. The problem is that you can only query 7 dimensions at once, so that only left me with 3 more dimensions. So I just created separate queries each with the 4-field key and 3 separate dimensions. They all became associated in QlikView based on the 4-field key.
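The concatenated-key approach described above could be sketched like this. The feed URLs and field names are placeholders; the point is that every query builds the same 4-field key, so the tables associate automatically in QlikView.

```
// Sketch only - URLs and field names are illustrative.
// Query 1: key + up to 3 extra dimensions
ByCountry:
LOAD Date & '|' & Hour & '|' & Latitude & '|' & Longitude AS %VisitKey,
     Country, City, Visits
FROM [http://yourfeedurl1] (XmlSimple, Table is [feed/entry]);

// Query 2: same key, 3 different dimensions
ByKeyword:
LOAD Date & '|' & Hour & '|' & Latitude & '|' & Longitude AS %VisitKey,
     Keyword, LandingPage, Visits
FROM [http://yourfeedurl2] (XmlSimple, Table is [feed/entry]);
```

Because both tables share the single `%VisitKey` field (and no other field names overlap), QlikView links them on that key, which gets you as close to a per-visitor association as the API's 7-dimension limit allows.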