Qlik Community



How to deal with Out of Memory errors in an automation



In this article, we will look at how to recognize out-of-memory errors in an automation, their root cause, how to debug them, and best practices to avoid or resolve memory issues.

How to recognize

An out-of-memory error is likely if either of the following occurs:

  • The automation fails with the 'Execution has failed' error message
  • The automation run takes much longer than its normal execution time and exceeds the maximum run duration limit, causing the automation to fail

Root Cause

An automation runs out of memory when one or more of its blocks process a very large amount of data, such as large lists.

How to debug

  • The maximum memory usage allowed during an automation run is 256 MB
  • Download the job history by clicking the 'Export this run' button and check whether the cumulative memory usage exceeds this limit.
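As a quick way to check an exported run against the 256 MB limit, a script like the following can sum per-block memory usage. This is only a sketch: the field names (`blocks`, `memory_usage`) are assumptions for illustration, so adapt them to the actual structure of your exported file.

```python
import json

MEMORY_LIMIT_BYTES = 256 * 1024 * 1024  # 256 MB automation memory limit

def total_memory_usage(export_path):
    """Sum the per-block memory usage recorded in an exported run file.

    The keys 'blocks' and 'memory_usage' are hypothetical; check the
    real structure of the file produced by 'Export this run'.
    """
    with open(export_path) as f:
        run = json.load(f)
    return sum(block.get("memory_usage", 0) for block in run.get("blocks", []))

def exceeds_limit(export_path):
    """Return True if the cumulative memory usage is over the limit."""
    return total_memory_usage(export_path) > MEMORY_LIMIT_BYTES
```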


Best Practices

Let's walk through different ways of resolving memory issues by restructuring the automation or changing its behaviour.

  • In contact sync or other data sync use cases, where an automation syncs data between two CRM systems and the account contains a huge amount of data (>1 million records), the best approach is to use backfill patterns. Refer: https://community.qlik.com/t5/Knowledge/Backfill-patterns/ta-p/1806691 
  • Split automation
    • If multiple syncs happen in a single automation (for example, syncing contacts, accounts, and leads within the same automation), it is always recommended to split it into three separate automations following the same pattern, as described below
      • Sync contacts from the source to the destination CRM in the 1st automation
      • Sync accounts from the source to the destination CRM in the 2nd automation
      • Sync leads from the source to the destination CRM in the 3rd automation, and so on
    • If the merge lists block is used to merge two given lists (arrays) into one new list and either or both lists contain a large amount of data, remove the merge lists block and perform the data sync for each list in a separate automation, as described in the step above
  • If the compare lists block is used to delete records from the destination that are missing from the source, the automation may fail with an out-of-memory error for very large lists. To make the deletion process work, change the behaviour of the automation: instead of deleting records based on the comparison, implement it in one of the ways described below

    • Check whether the source platform already provides an On delete webhook event that can be used to delete records in the destination. Please get in touch with our support team if a connector is missing webhooks that should be available according to the API documentation. Refer: https://help.qlik.com/en-US/blendr/Content/design-patterns/design-patterns-to-delete-records.htm
    • Use the CDP method. Follow the steps below to build the automation
      • Initialize an empty variable called 'lastCheckDatetime', then set the current DateTime as its value.
      • List records from the source and loop all of them through the upsert contact CDP block, with last_check_datetime as a custom field mapped to the lastCheckDatetime variable.
      • Use the list contacts CDP endpoint together with the filter list block to filter the contacts from the CDP based on the source.
      • Use a condition block to find CDP records whose last_check_datetime is not equal to the DateTime stored in the lastCheckDatetime variable.
      • If the last_check_datetime field from the CDP is not equal to the DateTime stored in the lastCheckDatetime variable, delete the contact from the CDP and the destination platform
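The CDP delete flow above can be sketched in pseudologic as follows. This is a conceptual illustration only, not actual automation blocks: the CDP is modeled as a simple dict of id → record, and `delete_from_destination` is a hypothetical callback standing in for the destination connector's delete block.

```python
from datetime import datetime, timezone

def sync_deletes(source_records, cdp_store, delete_from_destination):
    """Conceptual sketch of the CDP delete flow described above."""
    # Step 1: capture the current DateTime in lastCheckDatetime
    last_check_datetime = datetime.now(timezone.utc).isoformat()

    # Step 2: upsert every source record into the CDP with the
    # last_check_datetime custom field set to the captured value
    for record in source_records:
        cdp_store[record["id"]] = dict(record, last_check_datetime=last_check_datetime)

    # Steps 3-5: any CDP record whose stamp was NOT refreshed no longer
    # exists in the source, so delete it from the CDP and the destination
    for record_id, record in list(cdp_store.items()):
        if record.get("last_check_datetime") != last_check_datetime:
            del cdp_store[record_id]
            delete_from_destination(record_id)
```

The key idea is that a full source-vs-destination list comparison is never held in memory: deleted records are detected solely by their stale timestamp in the CDP.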

Please check out the attached JSON file, which contains an example automation workspace demonstrating the CDP method for handling the delete flow from source to destination. Note that this method only works up to 400k records in total, because CDP capacity is limited. If the same customer uses the CDP multiple times, the number of records that can be stored in the CDP is even more limited.

  • Keep the points below in mind while using variables in an automation
    • Do not create a copy of a variable that holds a large dataset unless it is required.
    • Do not use variables to store the last output of a block. The last output of a block is always kept in memory and can be used in the input field of other blocks.
    • Do not store the output of list endpoints in a variable. To process all the data, loop through it using the same list endpoint (or the Loop block) and use it as an input in subsequent blocks.
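The last point can be illustrated with a paging pattern: instead of storing a full list in a variable, process one page at a time so only a single page is ever in memory. The functions `list_page` and `upsert_contact` below are hypothetical stand-ins for a connector's list and upsert endpoints.

```python
def sync_contacts_paged(list_page, upsert_contact, page_size=100):
    """Process records page by page rather than loading the full list.

    'list_page' and 'upsert_contact' are placeholder callables for a
    connector's list and upsert endpoints (assumed signatures).
    """
    page = 1
    while True:
        records = list_page(page=page, limit=page_size)
        if not records:  # an empty page means all data has been read
            break
        for record in records:
            # use the block output directly; no intermediate variable
            # ever holds more than one page of records
            upsert_contact(record)
        page += 1
```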

Follow the steps provided in the article Upload Automation Workspace to import the automation from the shared JSON file.

