tabletuner
Creator III

Incremental load challenge

Hi all,

I have a question regarding a large fact table which contains positions of financial documents. My situation is as follows:

I'd like to shorten the load time of this large fact table. However, the fact table doesn't have a timestamp (except for the valuta date). The header table contains the header data, including the important 'created on' field.

What method would you prefer to load this large fact incrementally?

Thanks for sharing your thoughts!

Regards,

Tjeerd

7 Replies
Not applicable

What is the relation between the header data and the detail? For an incremental load, a record-level audit field (a timestamp) covering inserts and revisions is normally a must. If you don't have one, another option is to separate the fact into history and transaction blocks (a logical division based on date periods, e.g. closed periods as history and open periods as transaction). During the incremental load, refresh only the transaction records and always concatenate the history from the target QVD itself.
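As a hedged sketch of that split (table, field, and variable names below are placeholders, not from this thread), the load script could look roughly like this:

```qlik
// Open (transaction) periods: reloaded from the database every run.
// vOpenPeriodStart marks the first still-open period -- an assumption.
FactCurrent:
LOAD DocumentNo, ValutaDate, Amount;
SQL SELECT DocumentNo, ValutaDate, Amount
FROM FactTable
WHERE ValutaDate >= '$(vOpenPeriodStart)';

// Closed (history) periods: always taken from the target QVD itself,
// never re-read from the database.
Concatenate (FactCurrent)
LOAD DocumentNo, ValutaDate, Amount
FROM Fact_History.qvd (qvd)
WHERE ValutaDate < '$(vOpenPeriodStart)';

// Store the combined table back so the next run can again read the
// history from the QVD.
STORE FactCurrent INTO Fact_History.qvd (qvd);
```

The WHERE filters on both loads keep the two blocks from overlapping even though they end up in the same QVD.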

--Arun

rwunderlich
Partner Ambassador/MVP

What type of data source is this fact table? Is it a database table or a text file?

-Rob

tabletuner
Creator III
Author

Arun, I think I could define the closed periods based on the valuta date. However, I am still open to new ideas.

Rob, it is an SAP database table. I think you are thinking of a buffer reload? Is that appropriate here, and which statements would apply in this situation (without creation timestamps)?

Thanks all for sharing your thoughts!

Not applicable

This is very interesting, because this problem occurs on a lot of SAP tables where there is no date. And even with dates, running an incremental update is complicated, because you need to query on date fields that are not keys.

In some cases an incremental update in SAP takes more time than a complete reload!

What table do you want to work on ?

Sébastien

tabletuner
Creator III
Author

Hi Sebastien,

It is true that SAP has many header/position table combinations. In my case I am working on the BSEG (financial position data) and BKPF (financial header data) tables.

I think it might be a good idea to do a complete load of both tables once a week (during the quiet weekends). After completion, join the BKPF date field onto the BSEG table and store the BSEG table in monthly QVD slices.

During the weekdays, load the BSEG month parts (from the QVD source) and concatenate them. Next, load the document numbers (from SAP) that don't exist in the concatenated table and concatenate those as well. Would this be a sound, fast-performing solution?
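The weekday part of that scheme might be sketched like this (field and file names such as BELNR and Fact_*.qvd are assumptions for illustration):

```qlik
// 1. Rebuild the fact table from the monthly QVD slices stored at
//    the weekend. The wildcard picks up all month files.
BSEG_Fact:
LOAD BELNR, BUZEI, HeaderDate, Amount
FROM Fact_*.qvd (qvd);

// 2. Pull from SAP only the documents not yet staged. Note that the
//    WHERE clause belongs to the preceding LOAD, so the filtering
//    happens on the Qlik side after the rows leave SAP.
Concatenate (BSEG_Fact)
LOAD BELNR, BUZEI, HeaderDate, Amount
WHERE NOT Exists(BELNR);
SQL SELECT BELNR, BUZEI, HeaderDate, Amount FROM BSEG;
```

One caveat of this sketch: because BSEG has no usable date field, step 2 still reads the whole table from SAP and discards already-known documents on the Qlik side, which is exactly the cost Sébastien mentions.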

Regards,

Tjeerd

Not applicable

Spastor: I had a similar case of a FIN table with more than 25 GB of data. Incremental load works well because we have DATE and TIME as an audit trail for each financial entry. To improve query performance, we created an index on DATE at the database level.

Tj: If the document number is the key, your logic will only take care of new insertions, right? What about updates made to already staged documents, and possible deletions in the SAP database? Would those only get refreshed at the weekend?
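For completeness, a common insert/update/delete pattern looks roughly like the sketch below (field names are assumed, and the changed-rows filter presumes an audit date, which BSEG itself lacks):

```qlik
// New and changed rows since the last reload (requires an audit date).
Fact:
LOAD BELNR, CreatedOn, Amount;
SQL SELECT BELNR, CreatedOn, Amount
FROM SourceTable
WHERE CreatedOn >= '$(vLastReload)';

// Unchanged history from the previous QVD; keys already loaded above
// are skipped, so the fresh version of a changed document wins.
Concatenate (Fact)
LOAD BELNR, CreatedOn, Amount
FROM Fact.qvd (qvd)
WHERE NOT Exists(BELNR);

// Deletions: keep only keys that still exist in the source.
Inner Join (Fact)
LOAD BELNR;
SQL SELECT BELNR FROM SourceTable;

STORE Fact INTO Fact.qvd (qvd);
```

The inner join against the current key list is what handles deletions, at the price of one extra (key-only) pass over the source table.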

--Arun

Not applicable

Arun: Using a date field is OK, but you'll still have the same deletion issue, no?

Anyway, in SAP, when you want to handle deletions and updates you need to read the table more than once, and in the end it is no faster than doing a complete reload...

Am I wrong somewhere?