Not applicable

Insert all records from resident table to SQL db

Hi all,

Is it possible to insert all records from a resident table into a DB table from the QlikView script, in one shot?

I checked other posts; a loop over the records (for...next) was suggested, with an INSERT statement executed for every row.

But this method is too heavy and takes too much time.

Any alternative to achieve this?

Thanks

5 Replies
Colin-Albert

Save the table to csv using

     STORE tablename into filename.csv (txt) ;

Then import the data into your database.

Qlik is designed to load data from a database, not to insert data into a database.

Gysbert_Wassenaar

Is it possible to insert all records from a resident table into a DB table from the QlikView script, in one shot?

No. You'll need more shots: one to export the table to a csv file and one to import the csv file into your database. Perhaps your DBMS supports csv files as virtual tables; that could make it a bit easier.
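For example, if the target happens to be MySQL, the import shot could even be fired from the load script itself. This is only a rough sketch under a few assumptions: the csv has been copied to a directory the MySQL server can read, and the connector passes the statement through unchanged; the DSN, path and table name are made up.

    ODBC CONNECT TO MyMySqlDSN;
    SQL LOAD DATA INFILE '/var/lib/mysql-files/myresidenttable.csv'
        INTO TABLE my_target_table
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES;    // skip the header line that STORE ... (txt) writes

Whether your driver accepts LOAD DATA as a pass-through statement is something to test against your own setup.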


talk is cheap, supply exceeds demand
Not applicable
Author

Thanks for your quick replies, guys (gwassenaar, colin_albert).

I see. Based on your answers, I'm also checking how we could automate the import of the csv file...

Any suggestion ?

I searched a bit and found out about the BULK INSERT command.

The client will be testing it through the QlikView script (to automate the process: store the csv + write it back to the DB).

Fingers crossed.

But what do you think? Theoretically it should work, no?
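Roughly, what I have in mind is the sketch below. This is just an outline under a few assumptions: the target is SQL Server, the csv is written to a share the database server itself can read, and the driver passes the BULK INSERT statement through unchanged; the connection, path and table names are placeholders.

    // Step 1: export the resident table to csv (STORE writes a header line plus comma-separated values)
    STORE MyResidentTable INTO [\\fileserver\share\myresidenttable.csv] (txt);

    // Step 2: ask SQL Server to bulk-load that file
    // (the path is resolved by the SQL Server service, not by the QlikView machine)
    ODBC CONNECT TO MyTargetDSN;
    SQL BULK INSERT dbo.my_target_table
        FROM '\\fileserver\share\myresidenttable.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);    // FIRSTROW = 2 skips the header line

If text fields can contain embedded commas or quotes, the plain FIELDTERMINATOR option won't be enough and the WITH clause would need FORMAT = 'CSV' (SQL Server 2017+) or a format file.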

Not applicable
Author

Hi,

I had the same problem. By building a string of the values and inserting 1000 rows at a time, the load time dropped dramatically. Because of some of the values in the fields, I had to switch between ' as the quote character and Chr(39).

populateObj:
LOAD
    [client_id],
    [obj_id],
    [objval_id],
    [descr],
    [valid_from_isodate],
    [valid_to_isodate],
    [active],
    [created],
    [row_id],
    // Build one "(value, value, ...)" tuple per source row; text fields are wrapped in chr(39)
    // quotes and missing values become NULL, so each tuple is valid inside a SQL VALUES list
    '(' & if(isnull([client_id]), 'NULL', [client_id]) & ', ' &
    if(isnull([obj_id]), 'NULL', chr(39) & [obj_id] & chr(39)) & ', ' &
    if(isnull([objval_id]), 'NULL', chr(39) & [objval_id] & chr(39)) & ', ' &
    if(isnull([descr]), 'NULL', chr(39) & [descr] & chr(39)) & ', ' &
    if(isnull([valid_from_isodate]), 'NULL', chr(39) & [valid_from_isodate] & chr(39)) & ', ' &
    if(isnull([valid_to_isodate]), 'NULL', chr(39) & [valid_to_isodate] & chr(39)) & ', ' &
    if(isnull([active]), 'NULL', [active]) & ', ' &
    if(isnull([created]), 'NULL', chr(39) & [created] & chr(39)) & ', ' &
    if(isnull([row_id]), 'NULL', [row_id]) & ')'  as SQL_queryRowString,
    RowNo()                                       as RowNo,
    Round(RowNo() - 500, 1000)                    as RowNo1000   // buckets of roughly 1,000 rows per INSERT
FROM [lib://xxxxxx (xxxx)/xxxxxx.qvd] (qvd);

// Concatenate the row tuples of each bucket into one VALUES list
NoConcatenate
tempTable:
LOAD
    RowNo1000,
    Concat(SQL_queryRowString, ',') as SQLQueryValues1000
RESIDENT populateObj
GROUP BY RowNo1000;

DROP TABLE populateObj;

LIB CONNECT TO 'BizView (ghh_qliktech)';

// Empty the target table before writing the new rows
SQL DELETE FROM "bxxxxxxx".xxxxxxx;

// One INSERT statement per bucket of ~1,000 rows
For vRowNo = 1 to NoOfRows('tempTable')

    LET vSQLInsertValues = Peek('SQLQueryValues1000', vRowNo - 1, 'tempTable');
    LET vInsertedRows = vRowNo * 1000; // used to see the loop progress

    SQL INSERT INTO "xxxxxx".xxxxxxxx (
        client_id,
        obj_id,
        objval_id,
        descr,
        valid_from_isodate,
        valid_to_isodate,
        active,
        created,
        row_id
    ) VALUES
    $(vSQLInsertValues);

    TRACE Inserting $(vInsertedRows) rows; // used to see the loop progress

Next vRowNo

DROP TABLE tempTable;

Hope it helps!

cheenu_janakira
Creator III

Hi Jean Pierre,

As Gysbert put it, the answer is "no"; there is no one-shot option. However, check this thread: Using QlikView for (bulk) INSERT into SQL.

I created a script that does bulk inserts in batches of 1,000 rows per dump. So, for a table/QVD of 12,345 rows, the script will do 12 dumps of 1,000 rows each and then a final dump of the remaining 345 rows. A number of other people have also thought about this, such as David G.

Since this script is based on using a QVD as the source, I am sure you can reuse the latter part to write from a RESIDENT table; the process is the same. You would have to be careful about the 'data types' and what is actually in the source table.

My script was built so that it first creates the SQL table and then loads the data into it, which ensures the two stay consistent and no errors are produced.
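In outline, that part looks something like the sketch below; the connection, table and column names are invented here, null handling is omitted for brevity, and whether CREATE TABLE can be passed through depends on the driver and your permissions.

    LIB CONNECT TO 'MyConnection';

    // Build the SQL table first, so its columns match what the script is about to insert
    SQL CREATE TABLE dbo.obj_staging (
        client_id int,
        obj_id    varchar(50),
        descr     varchar(255)
    );

    // Then bucket the source rows into groups of 1,000 for the insert loop
    // (Ceil() is an alternative to the Round() trick above: 12,345 rows give chunks 1..13)
    Buckets:
    LOAD
        Ceil(RowNo() / 1000) as ChunkNo,
        '(' & client_id & ', ' & chr(39) & obj_id & chr(39) & ', '
            & chr(39) & descr & chr(39) & ')' as ValueRow
    RESIDENT SourceTable;    // SourceTable stands for whatever resident table you want to write back

From there, the Concat() per chunk and the INSERT loop are the same as in the script posted above.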

Video for this on YouTube: Qlik Bulk INSERT into SQL table - YouTube

Hope this helps.

Kr,

Cheenu