ugoedert
Contributor II

Data loss during initial load on big data

I (a Qlik Sense beginner) have created an incremental load as described in the Qlik help (Incremental load with QVD files ‒ Qlik Sense) for inserted, updated, and deleted records. I have to use an internet connection to fetch several million records from a table on a remote server, which takes about two hours.

I noticed a data loss.

I assume this is caused by the fact that lots of new records are inserted into the table between the start of the SELECT for the initial load and the moment the first LastExecTime timestamp is set.

Is there a known best practice for solving this problem?

I can't modify the connection to the database server (Oracle).

It works fine up to a few million records.

1 Solution

Accepted Solutions
ugoedert
Contributor II
Author

Hello all!

Using the third variable fixed my problem. Now, after the initial and incremental loads, the number of records in the app matches the number of records in the database table.

Thanks for your assistance!


8 Replies
MK9885
Master II

What is your Incremental load script?

Which WHERE conditions are you using in the load?

ugoedert
Contributor II
Author

The load script is as follows (field and table names are replaced by descriptions in square brackets):

Let ThisExecTime = Now();
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)','DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)','DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
Let LastExecTime = NOW();
Store TABLE_QS INTO '[QVD]';
DROP TABLE TABLE_QS;
EndIf;

Apart from the WHERE clause shown above, I don't use any other conditions in this script.
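One detail worth noting in the script above: LastExecTime is advanced to NOW() only after the load (roughly two hours) has finished, so rows inserted while the load was running can fall into a window that no run ever selects. A minimal sketch of the safer pattern, using the same placeholder names, is to advance LastExecTime only to the ThisExecTime value captured before the SELECT started:

```
Let ThisExecTime = Now();   // captured BEFORE the long-running SELECT

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)','DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)','DD.MM.YYYY HH24:MI:SS');

// Advance the boundary only to the value captured before the load,
// never to Now() after it: rows inserted during the load have
// timestamps >= ThisExecTime and are picked up by the next run.
IF ScriptErrorCount = 0 Then
  Let LastExecTime = ThisExecTime;
EndIf;
```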

ugoedert
Contributor II
Author

I think I've found a solution to this problem. I added a third timestamp variable, SecondLastExecTime = LastExecTime, set when the incremental load script starts.

If ScriptErrorCount = 0, I then set LastExecTime = ThisExecTime; otherwise LastExecTime = SecondLastExecTime.

Now it looks as if all records are taken from the database into the QVD file.
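The variable handling described above can be sketched as follows (placeholder names as in the earlier script; the load and store steps are unchanged):

```
Let ThisExecTime = Now();
Let SecondLastExecTime = LastExecTime;   // remember the old boundary

// ... incremental load and store as before ...

IF ScriptErrorCount = 0 Then
  Let LastExecTime = ThisExecTime;        // commit the new boundary
Else
  Let LastExecTime = SecondLastExecTime;  // roll back on failure
EndIf;
```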

Ricardo_Gerhard
Employee

Dear Ulrich,

Try running the same query closer to the database, in order to determine whether the problem is the connection or the query results.

I have already had similar problems with data latency between QlikView and remote databases over the internet.

Ricardo Gerhard
OEM Solution Architect
LATAM
eduardo_dimperio
Specialist II

Hi Ulrich, could you post your code for better understanding?

But if I understand correctly, you want to do an incremental load; if that is the case, you could try:

// New records
DB_AUX:
LOAD *;
SELECT *
FROM yourdb;

STORE DB_AUX INTO [lib://something_aux.qvd] (qvd);
DROP TABLE DB_AUX;

Records:
LOAD *
FROM [lib://your_full_qvd_record.qvd] (qvd);

Concatenate(Records)
LOAD *
FROM [lib://something_aux.qvd] (qvd);

STORE Records INTO [lib://your_full_qvd_record.qvd] (qvd);

DROP TABLE Records;

ugoedert
Contributor II
Author

Hello Eduardo,

this is my modified incremental load script:

Let ThisExecTime = Now();
Let SecondLastExecTime = LastExecTime;
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)','DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)','DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
Let LastExecTime = ThisExecTime;
Store TABLE_QS INTO '[QVD]';
DROP TABLE TABLE_QS;
Else
Let LastExecTime = SecondLastExecTime;
EndIf;
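One assumption worth making explicit: this script relies on LastExecTime surviving between reloads. Qlik Sense keeps script-created variables with the app, but if the state should be robust against app replacement or visible to other apps, the boundary can be persisted explicitly. A sketch of that pattern, with an illustrative file and field name not taken from the original post:

```
// Read the last committed boundary back from a small state QVD, if present.
// 'lib://DataFiles/LoadState.qvd' and the field name are illustrative.
If Not IsNull(FileTime('lib://DataFiles/LoadState.qvd')) Then
  STATE:
  LOAD LastExecTime FROM [lib://DataFiles/LoadState.qvd] (qvd);
  Let LastExecTime = Peek('LastExecTime', 0, 'STATE');
  DROP TABLE STATE;
Else
  Let LastExecTime = '01.01.1900 00:00:00';  // forces a full initial load
EndIf;

// ... incremental load as in the script above ...

// Persist the new boundary only after a successful run.
If ScriptErrorCount = 0 Then
  STATE:
  LOAD '$(LastExecTime)' as LastExecTime AutoGenerate 1;
  STORE STATE INTO [lib://DataFiles/LoadState.qvd] (qvd);
  DROP TABLE STATE;
EndIf;
```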

ugoedert
Contributor II
Author

Hello Ricardo,

unfortunately I can't get closer to the database server.
