
    Large data insert and update

    Anjali Ahlawat

      Hi All,

       

      I am working on an account where the data volume is large. The task scheduled to pull data for the module runs every 10 minutes.

       

      Also, whenever I have to pull older records, I have to split the date range into smaller intervals so that I do not get the error "Field length:100494 exceeds maximum limit of 65535.[10232]", and even then it takes a long time to fetch data for a month or even a few days.

      I am using the attached QVW for this pull (attached for reference).
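
      From the message, the failure seems to be caused by a single field value longer than 65,535 characters (here 100,494), usually a long free-text column, rather than by the number of rows. One workaround I have been wondering about is truncating that column in the SQL itself. This is only a sketch: "comments" is a placeholder for whichever column overflows, and whether SUBSTR or LEFT works depends on the source database.

      ===========================================================================================

      // Sketch only: "comments" is a placeholder column name;
      // use SUBSTR or LEFT depending on what the driver supports.
      SQL SELECT
          "number",
          sys_updated_on,
          SUBSTR(comments, 1, 60000) as comments
      FROM "SCHEMA".OAUSER."incident";

      ===========================================================================================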

       

      Also, I use the following script for the incremental load:

      ===========================================================================================

      MaxModifiedDate:
      LOAD
          Timestamp(Max(sys_updated_on), 'YYYY-MM-DD hh:mm:ss') as Max_ModifiedDateTime
      FROM [..\Raw Data\Incident.qvd] (qvd);

      LET vMaxModifiedDate = Peek('Max_ModifiedDateTime', -1, 'MaxModifiedDate');

      DROP Table MaxModifiedDate;   // helper table is no longer needed once the variable is set

      // Pull everything modified since the last stored record
      INCREMENTAL:
      SQL SELECT *
      FROM "SCHEMA".OAUSER."incident"
      WHERE sys_updated_on >= '$(vMaxModifiedDate)';

      // Append the historical records whose key is not in the fresh pull,
      // so updated rows are replaced and new rows are inserted
      Concatenate (INCREMENTAL)
      LOAD *
      FROM [..\Raw Data\Incident.qvd] (qvd)
      WHERE Not Exists(number);

      STORE INCREMENTAL into [..\Raw Data\Incident.qvd] (qvd);

      ==================================================================================================


      I have 2 requirements:


      First:

      Many times I see the "task failed" error because of the field-length limit, even when the pull covers only 10 minutes.

      Can anybody tell me if there is a way to merge the interval-splitting logic of the data pull (as in the attached QVW) into the script above, so that the field-length error is avoided? A rough sketch of the direction I mean follows.
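
      This is untested and only shows the idea: the single big SELECT is replaced by a loop over one-day windows, seeded with an empty table so every chunk can concatenate. The one-day step and the names vChunkStart / vChunkEnd are made up.

      ===========================================================================================

      // Rough sketch only: loop over one-day windows instead of one big pull.
      // Chunk size and variable names are assumptions.

      LET vChunkStart = '$(vMaxModifiedDate)';

      INCREMENTAL:
      LOAD * Inline [number];    // empty seed table so each chunk can concatenate

      DO WHILE Num(Timestamp#('$(vChunkStart)', 'YYYY-MM-DD hh:mm:ss')) < Num(Now())

          // End of this window: one day after its start
          LET vChunkEnd = Timestamp(Timestamp#('$(vChunkStart)', 'YYYY-MM-DD hh:mm:ss') + 1, 'YYYY-MM-DD hh:mm:ss');

          Concatenate (INCREMENTAL)
          SQL SELECT *
          FROM "SCHEMA".OAUSER."incident"
          WHERE sys_updated_on >= '$(vChunkStart)'
            AND sys_updated_on < '$(vChunkEnd)';

          LET vChunkStart = '$(vChunkEnd)';
      LOOP

      ===========================================================================================

      After the loop, the existing Concatenate / Not Exists(number) / STORE part would stay as it is.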


      Second:

      Is there a script I can use for inserting and updating records while merging the new QVD with the older one? Mine runs for a long time. One idea I have been playing with is sketched below.
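
      Again only a rough sketch: the file layout and the Months/vMonth names are made up, and the reload side would then read the Incident_*.qvd files instead of one big file. The idea is to partition the history by month so that each run rewrites only the months actually touched by new records.

      ===========================================================================================

      // Rough sketch only: partition the history by month so each run
      // rewrites only the affected monthly files, not one huge QVD.
      // Assumes sys_updated_on arrives as a real timestamp; if it comes
      // back as text, parse it with Timestamp#() first.

      Months:
      LOAD Distinct Date(MonthStart(sys_updated_on), 'YYYY-MM') as Month
      Resident INCREMENTAL;

      FOR i = 0 to NoOfRows('Months') - 1
          LET vMonth = Peek('Month', $(i), 'Months');

          MonthData:
          NoConcatenate
          LOAD * Resident INCREMENTAL
          Where Date(MonthStart(sys_updated_on), 'YYYY-MM') = '$(vMonth)';

          // Merge with that month's existing file, if there is one.
          // Note: a ticket updated across a month boundary ends up only
          // in the file for its newest month.
          IF Not IsNull(FileTime('..\Raw Data\Incident_$(vMonth).qvd')) THEN
              Concatenate (MonthData)
              LOAD *
              FROM [..\Raw Data\Incident_$(vMonth).qvd] (qvd)
              Where Not Exists(number);
          END IF

          STORE MonthData into [..\Raw Data\Incident_$(vMonth).qvd] (qvd);
          DROP Table MonthData;
      NEXT

      DROP Table Months;

      ===========================================================================================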


      Please suggest.


      I hope it is clear what I am looking for.


      Thanks in advance.