anjali0108
Partner - Creator III

Large data insert and update

Hi All,

I am working on an account where the data volume is large. The scheduled task that pulls data for this module runs every 10 minutes.

Also, whenever I have to pull older records, I have to split the date range into smaller intervals so that I don't hit the error "Field length: 100494 exceeds maximum limit of 65535. [10232]", and even then it takes a long time to fetch a month (or even a few days) of data.

I am using the attached qvw (for reference) to pull this data.
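The manual interval-splitting described above can be sketched as a loop in the load script. This is only an illustration, not the contents of the attached qvw: the start date, the 7-day chunk size, and the variable names are hypothetical, while the source table is the one from my incremental script below.

```
// Sketch: pull history in small date chunks so each result set stays
// below the field-length limit. Chunk size and dates are illustrative.
LET vFrom = Num(MakeDate(2023, 1, 1));   // hypothetical start of history
LET vEnd  = Num(Today());

DO WHILE vFrom < vEnd
    LET vTo = RangeMin(vFrom + 7, vEnd); // 7-day chunk (illustrative)

    // Successive loads into the same table auto-concatenate,
    // because the field set is identical for every chunk.
    History:
    SQL SELECT *
    FROM "SCHEMA".OAUSER."incident"
    WHERE sys_updated_on >= '$(=Timestamp(vFrom, 'YYYY-MM-DD hh:mm:ss'))'
      AND sys_updated_on <  '$(=Timestamp(vTo,   'YYYY-MM-DD hh:mm:ss'))';

    LET vFrom = vTo;
LOOP
```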

  • Also, I use the following script for the incremental load:

===========================================================================================

// Read the high-water mark (latest modification timestamp) from the existing QVD
MaxModifiedDate:
LOAD
    Timestamp(Max(sys_updated_on), 'YYYY-MM-DD hh:mm:ss') as Max_ModifiedDateTime
FROM [..\Raw Data\Incident.qvd] (qvd);

LET vMaxModifiedDate = Peek('Max_ModifiedDateTime', -1, 'MaxModifiedDate');
DROP TABLE MaxModifiedDate;

// Pull only the records modified since the last load
INCREMENTAL:
SQL SELECT *
FROM "SCHEMA".OAUSER."incident"
WHERE sys_updated_on >= '$(vMaxModifiedDate)';

// Append the historical rows that were not re-pulled ("number" is the key)
Concatenate (INCREMENTAL)
LOAD *
FROM [..\Raw Data\Incident.qvd] (qvd)
WHERE Not Exists(number);

STORE INCREMENTAL INTO [..\Raw Data\Incident.qvd] (qvd);

==================================================================================================


I have 2 requirements:


First:

Quite often the task fails, even for a 10-minute pull, because of the field-length error.

Can anybody tell me whether there is a way to merge the data-pull logic (as in the attached qvw) into the script above, so as to avoid the field-length error?


Second:

Is there a script I can use to insert and update records while merging the new QVD with the older one? Mine runs for a very long time.


Please suggest.


I hope this makes clear what I am looking for.


Thanks in advance.

2 Replies
prieper
Master II

I believe the error is produced by your ODBC connector. Have you been able to identify where this error occurs (field name, content, length, special characters, ...)?
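One way to narrow down which column is oversized is to ask the source database directly, before the ODBC fetch fails. This is only a sketch: it assumes your driver supports a `LENGTH()` function, and the column names ("description", "work_notes") are hypothetical examples of long text fields, not confirmed names from your table.

```
// Sketch: check the maximum stored length of suspect text columns.
// LENGTH() support and the column names are assumptions.
LengthCheck:
SQL SELECT
    MAX(LENGTH("description")) AS max_len_description,
    MAX(LENGTH("work_notes"))  AS max_len_work_notes
FROM "SCHEMA".OAUSER."incident";
```

Any column whose maximum length comes back above 65535 is a candidate for the error; you could then truncate or exclude it in the SELECT.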

Incremental load might be done with a script like:

ExistingData:
LOAD key, ... FROM MyData.QVD (QVD);

CONCATENATE (ExistingData)
LOAD key, ..... FROM ..... WHERE NOT EXISTS(key);

// the concatenated rows live in ExistingData, so store that table
STORE ExistingData INTO MyData.QVD (QVD);
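Note that if records can be updated as well as inserted, the load order matters: pull the new and changed rows from the source first, then append the old QVD rows that were not replaced. Done the other way around, WHERE NOT EXISTS(key) discards the updated versions, because the key is already present from the QVD. A sketch with the same placeholders as above (`vLastLoad`, `modified`, and the source name are illustrative):

```
// 1. New and changed rows come from the source first
NewData:
SQL SELECT key, ... FROM source WHERE modified >= '$(vLastLoad)';

// 2. Then keep only the old rows whose key was NOT re-pulled,
//    so updated records survive in their new version
CONCATENATE (NewData)
LOAD key, ... FROM MyData.QVD (QVD) WHERE NOT EXISTS(key);

STORE NewData INTO MyData.QVD (QVD);
```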

Peter

sonysree88
Creator II

Please see the URL below for how to do an incremental load:

https://www.analyticsvidhya.com/blog/2014/09/qlikview-incremental-load/