Anything that is done 10,000 times is going to take longer than something that is done just once. Looping in the Qlik load script is straightforward (for iRow = 0 to NoOfRows('TableOfRows') - 1); you can then peek out your values (let vOrigin = peek('Origin', iRow, 'TableOfRows');) and use those in the POST.
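As a sketch, the loop-and-peek pattern might look like this (the table and field names are placeholders for your own data):

```
// Assumed source table of origin/destination pairs
TableOfRows:
LOAD * INLINE [
Origin, Destination
London, Manchester
Leeds, Bristol
];

FOR iRow = 0 TO NoOfRows('TableOfRows') - 1
    // Pull the values for the current row into variables
    LET vOrigin      = Peek('Origin', $(iRow), 'TableOfRows');
    LET vDestination = Peek('Destination', $(iRow), 'TableOfRows');
    // ... use $(vOrigin) and $(vDestination) in the POST call here ...
NEXT iRow
```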
As well as taking a while, you will also need to be careful of any API call limits on the database.
The key will be to have an incremental load, so that you only look up new destination pairs and revert to a QVD for pairs you have looked up previously.
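A sketch of that incremental pattern, assuming a stored QVD called Distances.qvd and a composite PairKey field (all names here are illustrative, not from the original app):

```
// Load previously resolved pairs from the QVD, if it exists
IF NOT IsNull(FileTime('lib://Data/Distances.qvd')) THEN
    Distances:
    LOAD
        Origin,
        Destination,
        Origin & '|' & Destination AS PairKey,
        Distance
    FROM [lib://Data/Distances.qvd] (qvd);
END IF

// Only keep pairs that have not been looked up before
NewPairs:
LOAD Origin, Destination
RESIDENT TableOfRows
WHERE NOT Exists(PairKey, Origin & '|' & Destination);

// ... loop over NewPairs and call the API for each ...

// Write everything back out for next time
STORE Distances INTO [lib://Data/Distances.qvd] (qvd);
```

Because only the rows in NewPairs hit the API, repeat reloads stay fast and well inside any call limits.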
Hope that helps.
I have attached the QVF file, using a POST connection and a WITH CONNECTION statement.
Somehow, my data is being loaded multiple times over the total rows. It should be 12 rows but it is coming out as 90 rows.
Can you please check the error?
This is to do with how the JSON is returned in multiple tables, which Qlik then associates between. Each time you call the API the keys between the tables will be reset (probably to 1), so all items will share the same key.
You have two options. One is to replace the keys with the ID from the loop, repeating this for each table in place of the existing key:
$(a) AS [__KEY_root]
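In context, replacing the wizard's generated key with the loop counter (assuming the loop variable is a) might look like this; the field names are placeholders standing in for whatever the REST wizard generated:

```
// Inside the loop, a is the current iteration number
[root]:
LOAD
    $(a) AS [__KEY_root],      // loop ID replaces the reset REST key
    [origin_addresses],
    [destination_addresses]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__KEY_root]);
```

Repeating the same substitution in each of the wizard's RESIDENT loads keeps the associations between tables unique per API call.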
The other is to first remove all fields you don't need (you don't need the origin and destination out of the table, as you have these on the way in) and follow this pseudocode:
load origin destination pairs
loop for each pair
    get JSON from API
    resident load first table into temporary table
    left join resident load distance and time tables onto temporary table
    resident load from temporary table into main table
    drop temporary table
    drop all other REST tables
end loop
You can see why paring the queries down to just what you need before using the second approach is essential.
The code for building the temp table would be something like (the RESIDENT sources are an assumption based on the wizard's RestConnectorMasterTable output):

tmpDist:
LOAD
    '$(vOrigin)' as Origin,
    '$(vDestination)' as Destination
AUTOGENERATE 1;  // one row per origin/destination pair

LEFT JOIN (tmpDist)
LOAD
    [text] AS Distance,
    [value] AS Metres
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_distance]);

LEFT JOIN (tmpDist)
LOAD
    [text_u0] AS Time,
    [value_u0] AS Seconds
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_duration]);

// Append this pair's row to the main table (auto-concatenates on matching fields)
Distances:
LOAD * RESIDENT tmpDist;

DROP TABLE tmpDist;
DROP TABLE RestConnectorMasterTable;
All of the other RESIDENT loads inserted by the REST wizard can be ignored. It is critical that the DROP statements appear within the loop.
Hope that helps.
I don't like putting up solutions which I haven't tested, so I came back to this one and plumbed it into Sense (I had previously typed it straight into Qlik Community).
I had missed a couple of places where auto concatenate messed up my script. I've addressed those and have attached an app that solves this.
Whilst I was at it I created the dimensions and measures to test things out, and have added graphs of distances, times and average speeds.
You can obviously replace my inline load of origins and destinations with your original spreadsheet.
BTW, your original file had a hi-res image in it which was making the file size much larger than it needed to be.
Hope that helps,
Every different REST API has different fiddly bits to get right. Given the nested JSON keys I suspect the only way to get that one to work is to construct the BODY and then pass that in using the WITH CONNECTION BODY tags, as outlined above.
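As a hedged sketch of that pattern: build the JSON body in a variable, double up the quotes so it survives being passed as a string, and hand it to the connector via WITH CONNECTION. The URL, field names and SELECT structure below are placeholders, not the actual API:

```
// Build the body with the values for this pair
LET vBody = '{"origin": "$(vOrigin)", "destination": "$(vDestination)"}';
// Escape the double quotes for WITH CONNECTION
LET vBody = Replace('$(vBody)', '"', chr(34) & chr(34));

RestConnectorMasterTable:
SQL SELECT
    "__KEY_root"
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION (
    URL "https://example.com/api/distance",
    BODY "$(vBody)"
);
```

The WITH CONNECTION clause overrides the URL and body set up in the connection itself, which is what lets a single REST connection serve every iteration of the loop.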
When things fail with the Qlik REST connector there is not usually much steer about what has gone wrong. I would recommend getting things working in the Postman app first. It lets you send requests and check output much more quickly, and with far more diagnostics, than in Qlik.
Every API I have dealt with has required a bit of perseverance to get right, and some amount of changing parameters to see what works.
Hope that helps. Good luck!
I don't know if you managed to solve this or not, but next time it might prove easier to use LLodi instead. It is a freemium service that allows you to send requests from both dashboards and script.
Here is an example of using it from the script with Slack.