Qlik Community

New to Qlik Sense

Discussion board where members can get started with Qlik Sense.

ugoedert
New Contributor II

Data loss while initial load on big data

I (a Qlik Sense beginner) have created an incremental load as described in the Qlik help ("Incremental load using QVD files ‒ Qlik Sense") for inserted, modified, and deleted records. I have to use an internet connection to fetch several million records from a table on a remote server, which takes about 2 hours.

I noticed a data loss.

I assume this is caused by the fact that many new records are inserted into the table between the start of the SELECT for the initial load and the moment the first LastExecTime timestamp is set.

Is there a known best practice for solving this problem?

I can't modify the connection to the database server (Oracle).

It works fine for up to a few million records.
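(For reference, a common way to close that gap is to capture the window timestamp before the initial SELECT starts, so rows inserted during the long transfer fall into the next incremental window. This is only a sketch, using the same placeholder names in square brackets as the scripts below, not the poster's actual script:)

```
// Sketch only: take the window timestamp BEFORE the initial SELECT runs,
// so records inserted during the ~2 h transfer are re-read by the next
// incremental run instead of falling into a gap.
Let ThisExecTime = Now();          // captured before the long-running SELECT
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST] FROM [TABLE];   // initial full load (no WHERE clause)

If ScriptErrorCount = 0 Then
    Store TABLE_QS Into '[QVD]';
    Let LastExecTime = ThisExecTime;   // not Now(): Now() would skip rows
                                       // inserted while the load was running
    Drop Table TABLE_QS;
EndIf;
```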

1 Solution

Accepted Solutions
ugoedert
New Contributor II

Re: Data loss while initial load on big data

Hello all!

Using the third variable fixed my problem. Now, after the initial and incremental loads, the number of records in the app matches the number of records in the database table.

Thanks for your assistance!

8 Replies
arvind654
Honored Contributor II

Re: Data loss while initial load on big data

What is your Incremental load script?

Which WHERE conditions are you using in the load?

ugoedert
New Contributor II

Re: Data loss while initial load on big data

The load script is (field and table names are replaced by descriptions in square brackets):

Let ThisExecTime = Now();
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)', 'DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)', 'DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
    Let LastExecTime = Now();
    Store TABLE_QS INTO '[QVD]';
    DROP TABLE TABLE_QS;
EndIf;

In this script I don't use any conditions.

ugoedert
New Contributor II

Re: Data loss while initial load on big data

I think I've found a solution for this problem. I added a third timestamp variable, SecondLastExecTime = LastExecTime, set when the script starts the incremental load.

If ScriptErrorCount = 0, I set LastExecTime = ThisExecTime. Otherwise, LastExecTime = SecondLastExecTime.

Now it looks as if all records make it from the database into the QVD file.

dcj
Valued Contributor II

Re: Data loss while initial load on big data

Dear Ulrich,

Try running the same query closer to the database, to determine whether the problem lies in the connection or in the query results.

I have had similar problems with data latency between QlikView and remote databases accessed through the internet.

eduardo_dimperi
Valued Contributor II

Re: Data loss while initial load on big data

Hi Ulrich, could you post your code for better understanding?

But if I understand correctly, you want to do an incremental load. If that is the case, you could try:

// New records
DB_AUX:
LOAD *;
SELECT *
FROM yourdb;

Store DB_AUX into [lib://something_aux.qvd];
Drop Table DB_AUX;

Records:
LOAD *
FROM your_full_qvd_record.qvd (qvd);

Concatenate(Records)
LOAD *
FROM something_aux.qvd (qvd);

Store Records into [lib://your_full_qvd_record.qvd];
Drop Table Records;

ugoedert
New Contributor II

Re: Data loss while initial load on big data

Hello Eduardo,

this is my modified incremental load script:

Let ThisExecTime = Now();
Let SecondLastExecTime = LastExecTime;
Let ScriptErrorCount = 0;

TABLE_QS:
SELECT [FIELDLIST]
FROM [TABLE]
WHERE [DATE] >= to_date('$(LastExecTime)', 'DD.MM.YYYY HH24:MI:SS')
  AND [DATE] < to_date('$(ThisExecTime)', 'DD.MM.YYYY HH24:MI:SS');

Concatenate LOAD *
FROM [QVD] (qvd)
WHERE NOT EXISTS(TABLE_PRIMARY_KEY);

INNER JOIN SQL SELECT [TABLE_PRIMARY_KEY] FROM [TABLE];

IF ScriptErrorCount = 0 Then
    Let LastExecTime = ThisExecTime;
    Store TABLE_QS INTO '[QVD]';
    DROP TABLE TABLE_QS;
Else
    Let LastExecTime = SecondLastExecTime;
EndIf;

ugoedert
New Contributor II

Re: Data loss while initial load on big data

Hello Ricardo,

unfortunately, I can't get any closer to the database server.
