As far as I know, there are in general three ways to change data in your QV data model:
- Input fields: you may set input field values via macro. Input fields can't add or remove rows; they can only set values on fields defined as input fields.
- Dynamic update: possible via macro or trigger. Dynamic update can insert, update, and delete rows, but it is not fast.
- Reload: a full or partial reload of the document (more on this below).
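As a rough sketch, a dynamic update fired from a macro could look like the following. This assumes the `DynamicUpdateCommand` method of the QlikView automation API, that Dynamic Data Update is enabled in the document/server settings, and that the model contains a table `Comments` with fields `ID` and `Comment` (table and field names are made up for illustration):

```
' VBScript macro: push a single-row change into the in-memory model.
' Dynamic update uses a SQL-like syntax against in-memory tables.
sub AppendComment
    set doc = ActiveDocument
    ok = doc.DynamicUpdateCommand("INSERT INTO Comments (ID, Comment) VALUES (42, 'Reviewed')")
    if not ok then
        msgbox "Dynamic update failed"
    end if
end sub
```

Note that such changes live only in memory; the next full reload will overwrite them unless you also persist them somewhere (e.g. the SQL database).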
I think it should be possible to start macro or trigger actions from the extension level.
What I have seen is the following scenario with dynamic update:
- Use an extension to write to a SQL database
- Use a trigger on the SQL database to write the data into QlikView with the OCX
- Dynamic update causes the app to display the new data
I've seen this work, BUT it was a PoC, and you're probably aware of all of the issues and limitations of dynamic update.
Dynamic update does not work in a clustered server scenario. On a single server it can work; however, the recommended ("to be") approach is to use Publisher reloads to deliver new data into the in-memory app. This could be a frequently triggered partial reload, which is also supported in a cluster.
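The frequently triggered partial reload mentioned above could be sketched in the load script like this (table and file names are placeholders; the `Add` prefix appends rows to an existing table, and the `Only` qualifier makes the statement run only during partial reloads and be ignored during full reloads):

```
// Executed only on partial reload: appends new rows to the existing
// Facts table without re-running the rest of the script
Facts:
Add Only Load
    ID,
    Value,
    Timestamp
From [new_rows.qvd] (qvd);
```

Statements without an `Add`/`Replace` prefix are skipped during a partial reload, so the rest of the model stays untouched and the reload completes quickly.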
Another idea: if writing to SQL is already done, that table could also be integrated into the QlikView app with Direct Discovery ...
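For reference, a minimal Direct Discovery sketch in the load script might look like this (the connection name, table, and field names are placeholders; `DIMENSION` fields are loaded into memory and associated with the model, while `MEASURE` fields stay in the database and are queried on demand):

```
ODBC CONNECT TO [MyDSN];

// Dimensions participate in the associative model in memory;
// measures are fetched from the SQL source at chart-calculation time
DIRECT QUERY
    DIMENSION ProductID, Region
    MEASURE SalesAmount
FROM Sales;
```

This way the writeback table stays in SQL but its current contents are visible in the app without waiting for the next reload.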
Writing data back to QlikView is not possible unless you reload the document. The reason is that the associative model is created during the load process and then loaded into memory. The only realistic options are dynamic update, which is very limited in how many rows you can update at once and how frequently, or Direct Discovery (I have little experience with this).
What you can also do is manually associate the data from SQL into QlikView through an extension object until the reload cycle has finished (by building dynamic queries depending on the selection). This can be very fast, but it's not very efficient and definitely not what QV was designed for. It can, however, be a viable solution if you want to append simple metadata to a selection or row, such as a user comment.