I (a Qlik Sense beginner) have created an incremental load as described in the Qlik help (Inkrementelles Laden mit QVD-Dateien ‒ Qlik Sense) for inserted, modified, and deleted records. I have to use an internet connection to fetch several million records from a table on a remote server, which takes about 2 hours.
I noticed a data loss.
I assume this is caused by the fact that, after the SELECT for the initial load starts and before the first timestamp for LastExecTime is set, many new records are inserted into the table.
Is there a known best practice for solving this problem?
I can't modify the connection to the database server (Oracle).
It works fine up to a few million records.
Hello all!
Using the third variable fixed my problem. Now, after the initial and incremental loads, the number of records in the app matches the number of records in the table of the database.
Thanks for your assistance!
What is your Incremental load script?
What WHERE conditions are you using in the load?
The load script is (field and table names are replaced by descriptions in square brackets):
Let ThisExecTime = Now();
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)', 'DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)', 'DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
  Let LastExecTime = Now();
  Store TABLE_QS INTO '[QVD]';
  DROP TABLE TABLE_QS;
EndIf;
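The data loss can be reproduced without a database at all. The following Python sketch (dates, intervals, and the 2-hour load duration are invented for illustration) simulates the schedule the script above produces: ThisExecTime is captured when the run starts, but LastExecTime is set to Now() only after the long load finishes, so every run leaves a hole the size of the load duration that no later window covers.

```python
from datetime import datetime, timedelta

def simulate(first_last_exec, run_starts, load_duration):
    """Return the (LastExecTime, ThisExecTime) windows the script queries."""
    windows, last_exec = [], first_last_exec
    for run_start in run_starts:
        this_exec = run_start                   # Let ThisExecTime = Now();
        windows.append((last_exec, this_exec))  # WHERE [DATE] >= Last AND < This
        last_exec = run_start + load_duration   # Let LastExecTime = NOW();  <-- the gap
    return windows

day = timedelta(days=1)
t0 = datetime(2024, 1, 1)
wins = simulate(t0, [t0 + day, t0 + 2 * day], timedelta(hours=2))
for lo, hi in wins:
    print(lo, "->", hi)
# Window 1 ends at Jan 2 00:00, window 2 starts at Jan 2 02:00:
# rows inserted while the first load ran fall in neither window.
```

The fix discussed later in the thread amounts to advancing LastExecTime to ThisExecTime (captured before the SELECT) rather than to Now(), so consecutive windows always touch.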
In this script I don't use any conditions.
I think I've found a solution for this problem. I added a third timestamp variable called SecondLastExecTime, set to LastExecTime when the incremental load script starts.
If ScriptErrorCount = 0, I then set LastExecTime = ThisExecTime; otherwise LastExecTime = SecondLastExecTime.
Now it looks as if all records make it from the database into the QVD file.
Dear Ulrich,
Try to run the same query closer to the database, to determine whether the problem is the connection or the query results.
I have already had similar problems with data latency between QlikView and remote databases over the internet.
Hi Ulrich, could you post your code for better understanding?
But if I understand correctly, you want to do an incremental load; if that's the case, you could try:
//New records
DB_AUX:
LOAD *;
SELECT *
FROM yourdb;

STORE DB_AUX INTO [lib://something_aux.qvd] (qvd);
DROP TABLE DB_AUX;

Records:
LOAD *
FROM [lib://your_full_qvd_record.qvd] (qvd);

Concatenate(Records)
LOAD *
FROM [lib://something_aux.qvd] (qvd);

STORE Records INTO [lib://your_full_qvd_record.qvd] (qvd);
DROP TABLE Records;
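The concatenation step in this pattern (and the WHERE NOT EXISTS(TABLE_PRIMARY_KEY) clause in the scripts earlier in the thread) boils down to: keep every row of the fresh extract, and keep a historical QVD row only if its primary key was not freshly loaded. A rough Python analogue, with invented table and field names:

```python
def concatenate_incremental(fresh_rows, historical_rows, key="PK"):
    """fresh_rows / historical_rows: lists of dicts sharing a primary-key field."""
    seen = {row[key] for row in fresh_rows}  # keys already loaded from the DB
    merged = list(fresh_rows)
    for row in historical_rows:              # Concatenate LOAD ... WHERE NOT EXISTS(PK)
        if row[key] not in seen:
            merged.append(row)
    return merged

historical = [{"PK": 1, "val": "old"}, {"PK": 2, "val": "old"}]
fresh = [{"PK": 2, "val": "new"}, {"PK": 3, "val": "new"}]
print(concatenate_incremental(fresh, historical))
# PK 2 comes from the fresh extract; PK 1 survives from the QVD; PK 3 is new.
```

This is why modified records don't duplicate: the updated version wins because its key suppresses the stale QVD copy.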
Hello Eduardo,
this is my modified incremental load script:
Let ThisExecTime = Now();
Let SecondLastExecTime = LastExecTime;
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)', 'DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)', 'DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
  Let LastExecTime = ThisExecTime;
  Store TABLE_QS INTO '[QVD]';
  DROP TABLE TABLE_QS;
Else
  Let LastExecTime = SecondLastExecTime;
EndIf;
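The variable bookkeeping in this modified script can be condensed to a few lines. The sketch below (timestamps are illustrative strings) shows the two properties that fix the data loss: on success LastExecTime advances to ThisExecTime, which was captured before the SELECT, so consecutive windows share a boundary and nothing inserted during the load is skipped; on failure it rolls back to SecondLastExecTime, so the failed window is retried rather than lost.

```python
def next_last_exec(last_exec, this_exec, script_error_count):
    """Compute the LastExecTime to store for the next incremental run."""
    second_last_exec = last_exec      # Let SecondLastExecTime = LastExecTime;
    if script_error_count == 0:
        return this_exec              # Let LastExecTime = ThisExecTime;
    return second_last_exec           # Let LastExecTime = SecondLastExecTime;

print(next_last_exec("01.01.2024 00:00:00", "02.01.2024 00:00:00", 0))
print(next_last_exec("01.01.2024 00:00:00", "02.01.2024 00:00:00", 3))
```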
Hello Ricardo,
unfortunately I can't get closer to the database server.