QlikView Scripting

Incremental load performance low on big table

Hi,

I have a QlikView application that uses incremental load (based on a primary key and a datetime field on each table).
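
For context, the reload pattern is roughly like the sketch below (table, field and QVD names are only placeholders, and it assumes the QVD already exists from an initial full load):

// Sketch of the current datetime-based incremental load (placeholder names).

// 1) Get the highest ModifiedDate already stored in the QVD.
MaxDate:
LOAD Max(ModifiedDate) AS MaxModifiedDate
FROM MyTable.qvd (qvd);

LET vMaxDate = Timestamp(Peek('MaxModifiedDate', 0, 'MaxDate'), 'YYYY-MM-DD hh:mm:ss');
DROP TABLE MaxDate;

// 2) Fetch only the rows changed since then
//    (the date-literal syntax depends on the source database).
MyTable:
SQL SELECT PrimaryKey, ModifiedDate, OtherFields
FROM dbo.MyTable
WHERE ModifiedDate > '$(vMaxDate)';

// 3) Add back the historical rows whose keys were not in the fresh extract.
CONCATENATE (MyTable)
LOAD * FROM MyTable.qvd (qvd)
WHERE NOT Exists(PrimaryKey);

// 4) Store the merged result back to the QVD.
STORE MyTable INTO MyTable.qvd (qvd);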

The data volume has become significant: nearly 350,000,000 rows in some of the tables that are loaded incrementally.

The application must keep one year of history and loads nearly 4 million rows per day through the incremental load.

Now the incremental load takes about as long as a full load, which defeats its purpose.

Is it possible to optimize this?

I have thought about using a BIT field instead of the datetime field for the incremental load, and putting an index on that field in the database table I load from.

The process would be something like this, for a BIT field ToLoad on a table loaded incrementally (a script sketch follows the steps):

1- All updated/inserted rows in the table get the field ToLoad set to 1.

2- The QlikView application reloads with the incremental condition ToLoad = 1.

3- The field ToLoad is set back to 0 on the whole table.
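
Roughly, the reload script would then look like this (again only a sketch with placeholder names; the flag reset assumes the QlikView connection, or a job on the database side, is allowed to update the source table):

// Sketch of the flag-based incremental load (placeholder names).

// Step 2: fetch only the rows flagged by the database.
MyTable:
SQL SELECT PrimaryKey, OtherFields
FROM dbo.MyTable
WHERE ToLoad = 1;

// Merge with the history already stored in the QVD.
CONCATENATE (MyTable)
LOAD * FROM MyTable.qvd (qvd)
WHERE NOT Exists(PrimaryKey);

STORE MyTable INTO MyTable.qvd (qvd);

// Step 3: reset the flag after a successful reload. Depending on driver and
// permissions this may have to run as a job on the database side instead of
// through the QlikView connection.
SQL UPDATE dbo.MyTable SET ToLoad = 0 WHERE ToLoad = 1;

(I would probably reset only the rows that were actually extracted, so that rows flagged during the reload are not cleared without ever being loaded.)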

Do you think this would work better? Or do you have a better idea?

The datetime method performs really poorly on a huge data volume: even with an index on the datetime field, performance hardly improves, since every value is different and the index has as many distinct values as the table has rows.

Could you help me?

Thanks.
