Qlik Community

sat_tok52
Contributor

Writeback Functionality on Qlik Sense

Hi All,

I have to implement writeback functionality in Qlik Sense. What is the best way to implement it, and is there an extension available for it?

Thanks,

Sateesh.T

peterkunhardt
New Contributor III

Re: Writeback Functionality on Qlik Sense

Hi @sat_tok52

Assuming you mean you need a way to write back to a source database over a data connection: this is not currently available natively in Qlik Sense, although you may find extensions on Qlik Branch. I was able to get it working in an extremely limited fashion by creating a function on my target database that inserts a row of data into a specified table, and then iterating over a table in Qlik Sense with a FOR loop and calling that function.

Function definition example in Postgres:

create or replace function public._amu_longstanding_support_issues(
    col1 character varying,
    col2 character varying,
    col3 character varying)
returns void as
$BODY$
begin
    insert into lobster_bisch._am_longstanding_support_issues
    values (col1, col2, col3);
end;
$BODY$
language plpgsql volatile;

Once you have the data you want to write back loaded into a table in Qlik, you can use a loop similar to the following to call your function:

// assumes a connection to the target database is already open (LIB CONNECT TO ...)
for a = 0 to NoOfRows('example_table') - 1
    Let vCol1 = Peek('col1', $(a), 'example_table');
    Let vCol2 = Peek('col2', $(a), 'example_table');
    Let vCol3 = Peek('col3', $(a), 'example_table');
    SQL SELECT * from public.example_function('$(vCol1)','$(vCol2)','$(vCol3)');
next a;

 Note that I've assumed the column names in your Qlik Sense table are the same as the columns in your database.

I found this to be, erm, not very efficient. You're introducing the mother of all N+1 problems: a separate query against the database for each row you want to insert. This means that large inserts take far more time to execute than they would as a single statement run on the db directly. (I found it to be about 100x faster to load a table of only about 1,000 rows directly than to use the method described above.)
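
For comparison, here is a minimal sketch of what the single-statement approach looks like on the database side, assuming a hypothetical public.example_table with the same three columns as above; the whole batch travels in one statement instead of one function call per row:

-- hypothetical batch insert: all rows in one statement, one round trip
insert into public.example_table (col1, col2, col3)
values
    ('a1', 'b1', 'c1'),
    ('a2', 'b2', 'c2'),
    ('a3', 'b3', 'c3');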

We have since migrated to a different system (Buildkite executing Python code which runs the necessary transformations and then upserts the data to the target db), and I would recommend attempting to find a non-Qlik-based workaround if at all possible.
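
For clarity, "upsert" here just means the standard Postgres insert-or-update pattern. A minimal sketch (not our actual pipeline, and assuming a hypothetical unique key column id on public.example_table) would look like:

-- hypothetical upsert: insert new rows, update existing ones, keyed on id
insert into public.example_table (id, col1, col2, col3)
values (42, 'a1', 'b1', 'c1')
on conflict (id) do update
    set col1 = excluded.col1,
        col2 = excluded.col2,
        col3 = excluded.col3;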

Alternatively, if your real goal is just to cache the result sets of "expensive" queries somewhere Qlik can see them, look into saving to QVD files. This obviously will not work if having the data in your source database is a hard requirement for your use case.
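
A minimal sketch of that approach in the load script, assuming a hypothetical lib://DataFiles connection and an already-open database connection:

// run the expensive query once and cache the result as a QVD
ExpensiveResults:
SQL SELECT col1, col2, col3 FROM public.example_table;

STORE ExpensiveResults INTO [lib://DataFiles/expensive_results.qvd] (qvd);

// later reloads (or other apps) can read the cached copy instead of hitting the database:
// LOAD * FROM [lib://DataFiles/expensive_results.qvd] (qvd);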

Best of luck!