Is it possible to insert all records from a resident table into a database table from the QlikView script, in one shot?
No. You'll need more shots: one to export the table to a CSV file and one to import the CSV file into your database. Your DBMS may support CSV files as virtual tables; that could make it a bit easier.
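For the export half, QlikView's STORE statement writes a resident table to a delimited file. A minimal sketch, where the table name and target path are illustrative assumptions, not from the original post:

```
// Export the resident table MyTable to a CSV file.
// 'MyTable' and the lib:// path are assumed names for illustration.
STORE MyTable INTO [lib://DataFiles/MyTable.csv] (txt);
```

The import half then happens on the database side, with whatever CSV-loading mechanism your DBMS offers.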
I see. Based on your answers, I'm also checking how we could automate the import of the CSV file.
Any suggestions?
I searched a bit and found the BULK INSERT command.
The client will be testing it through QlikView script (to automate the process: store the CSV + write back to the DB).
But what do you think? In theory it should work, no?
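In principle it should, provided the database account has bulk-load rights and the file is readable from the database server. A rough sketch of the idea, assuming SQL Server, an existing ODBC connection, and made-up file, connection, and table names:

```
// Hedged sketch: store the data to CSV, then ask the server to bulk-load it.
// 'MyConnection', MyTable, dbo.MyTable and all paths are assumptions.
STORE MyTable INTO [C:\Temp\MyTable.csv] (txt);

LIB CONNECT TO 'MyConnection';

// Note: BULK INSERT resolves the file path on the database server,
// not on the machine running the QlikView script.
SQL BULK INSERT dbo.MyTable
    FROM 'C:\Temp\MyTable.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```

FIRSTROW = 2 skips the header line that STORE writes by default.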
I had the same problem, but by building a string of the values and inserting 1,000 rows at a time I cut the load time dramatically. Because of some of the values in the fields, I had to switch between a literal ' as the delimiter and Chr(39).
populateObj:
LOAD
    '(' & If(IsNull([client_id]), 'NULL', [client_id]) & ', ' &
    If(IsNull([obj_id]), 'NULL', Chr(39) & [obj_id] & Chr(39)) & ', ' &
    If(IsNull([objval_id]), 'NULL', Chr(39) & [objval_id] & Chr(39)) & ', ' &
    If(IsNull([descr]), 'NULL', Chr(39) & [descr] & Chr(39)) & ', ' &
    If(IsNull([valid_from_isodate]), 'NULL', Chr(39) & [valid_from_isodate] & Chr(39)) & ', ' &
    If(IsNull([valid_to_isodate]), 'NULL', Chr(39) & [valid_to_isodate] & Chr(39)) & ', ' &
    If(IsNull([created]), 'NULL', Chr(39) & [created] & Chr(39)) & ', ' &
    If(IsNull([row_id]), 'NULL', [row_id]) & ')' as SQL_queryRowString,
    RowNo() as RowNo,
    Round(RowNo() - 500, 1000) as RowNo1000 // buckets rows into groups of 1,000
FROM [lib://xxxxxx (xxxx)/xxxxxx.qvd] (qvd);

tempTable:
LOAD
    Concat(SQL_queryRowString, ',') as SQLQueryValues1000
Resident populateObj
Group by RowNo1000;

Drop Table populateObj;

LIB CONNECT TO 'BizView (ghh_qliktech)';

SQL DELETE FROM "bxxxxxxx".xxxxxxx;

For vRowNo = 1 to NoOfRows('tempTable')
    Let vSQLInsertValues = Peek('SQLQueryValues1000', vRowNo - 1, 'tempTable');
    Let vInsertedRows = vRowNo * 1000; // Used to see the loop progress
    SQL INSERT INTO "xxxxxx".xxxxxxxx
        (client_id, obj_id, objval_id, descr, valid_from_isodate, valid_to_isodate, created, row_id)
        VALUES $(vSQLInsertValues);
    TRACE Inserting $(vInsertedRows) rows; // Used to see the loop progress
Next vRowNo

Drop Table tempTable;
Hope it helps!
Hi Jean Pierre,
As Gysbert put it, the answer is "no", there is no one shot. However, check this thread: Using QlikView for (bulk) INSERT into SQL.
I created a script that does bulk inserts in series of 1,000 rows per dump. Hence, say for a table/QVD of 12,345 rows, the script will do 12 dumps of 1,000 each and then a dump of the last 345 rows. I think a number of individuals have also thought about this, such as David G.
Since this script uses a QVD as its source, I am sure you can reuse the latter part to write from a RESIDENT table; the process is the same. You would have to be careful with your data types and with what is actually in the source table.
My script was built so that it first creates the SQL table and then loads the data into it, which ensures the two stay consistent and won't produce errors.
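That create-then-load step can also be driven from the load script. A sketch under the same connection; the schema, table name, and column types here are illustrative assumptions (the column names follow the fields used earlier in the thread):

```
// Hedged sketch: create the target table before inserting into it.
// dbo.obj_export and all column types are assumptions, not the original DDL.
LIB CONNECT TO 'BizView (ghh_qliktech)';

SQL CREATE TABLE dbo.obj_export (
    client_id          INT,
    obj_id             VARCHAR(100),
    objval_id          VARCHAR(100),
    descr              VARCHAR(255),
    valid_from_isodate DATE,
    valid_to_isodate   DATE,
    created            DATETIME,
    row_id             INT
);
```

Keeping the CREATE TABLE in the same script as the INSERT loop is what guarantees the target schema matches what you generate.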
There is a video for this on YouTube: Qlik Bulk INSERT into SQL table.
Hope this helps.