Hi Qlik Community,
I'm encountering a critical issue while running a Qlik Replicate task from Oracle to MySQL. The task fails during the data load phase with the following error:
Stream component 'st_1_agm-mysql-dest' terminated
Stream component failed at subtask 1, component st_1_agm-mysql-dest
Error executing data handler
Failed to get statement, func: insert row handler
Failed to prepare statement 'INSERT INTO `agmdb`.`EMPLOYEES`(`EMP_ID`,`EMP_NAME`,`DOCUMENTS`) values (?,?,?)'
Failed to allocate array for parameter 'Param#003' in statement 'INSERT INTO `agmdb`.`EMPLOYEES`(`EMP_ID`,`EMP_NAME`,`DOCUMENTS`) values (?,?,?)' (size: 41943041000 bytes)
Not enough storage is available to process this command. (apr status = 720008)
Context:
The error reports "(size: 41943041000 bytes)", which is approximately 40 GB... but why would Replicate try to allocate that much?
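A quick back-of-the-envelope check (assuming the default Full Load commit rate of 10,000 rows, which is an assumption on my part, not something confirmed in the error itself): dividing the reported allocation by the batch size suggests Replicate reserved roughly one fixed-size slot per row for the LOB parameter.

```python
# Sanity check: if Replicate pre-allocates one array slot per row in the
# commit batch, the reported size divided by the (assumed) default Full Load
# commit rate of 10,000 rows gives the per-row allocation for 'Param#003'.
reported_bytes = 41_943_041_000      # from the task error message
assumed_commit_rate = 10_000         # default commit rate (assumption)

per_row_bytes = reported_bytes / assumed_commit_rate
print(f"~{per_row_bytes / 1024**2:.1f} MiB per row")
```

Under that assumption the per-row slot comes out to roughly 4 MiB, which would make the total scale directly with the commit rate.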
Hello @AbdullahMastan ,
In certain scenarios, tasks configured to replicate tables using Batch Optimized Apply mode may consume a significant amount of memory. This is because Qlik Replicate allocates memory based on the number of rows in the batch multiplied by the estimated size of each row. For example, if the LOB size is limited to 20 MB and the batch contains 200 records, the task could consume up to 4000 MB (roughly 4 GB) of memory for that LOB column alone. Other factors, such as the source/target database type and version, may also influence memory usage.
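That sizing logic can be sketched as a small helper. Note this is illustrative arithmetic only (the function name and signature are mine, not Replicate's actual code):

```python
def estimate_batch_memory_mb(lob_limit_mb: float, batch_rows: int,
                             lob_columns: int = 1) -> float:
    """Worst-case memory if one slot of lob_limit_mb is reserved
    per LOB column for every row in the commit batch."""
    return lob_limit_mb * batch_rows * lob_columns

# Example from the reply above: 20 MB LOB limit, 200-row batch, one LOB column
mb = estimate_batch_memory_mb(20, 200)
print(f"{mb} MB ≈ {mb / 1024:.2f} GB")  # 4000 MB ≈ 3.91 GB
```

This also shows why lowering the commit rate shrinks the allocation linearly: halving the rows per batch halves the reserved memory.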
If you require a deeper investigation, please open a support ticket and provide the following information:
1. CREATE TABLE DDLs for both the source and target tables.
2. Set the SOURCE_CAPTURE and TARGET_APPLY logging levels to Verbose, reproduce the behavior, and collect the log files. Please do not forget to decrypt the task log files before submission.
Our support team may request additional details based on the initial analysis.
Regards,
John.
Your explanation led me to tune the Commit Rate in Full Load settings. Adjusting the default to around 300 allowed the replication to complete successfully—thank you!
Now, I'd like to better understand how memory allocation works during the Full Load phase. Assuming the Batch Commit memory allocation logic applies here: does Qlik Replicate allocate memory using the formula [max column size × default commit rate (i.e., 10,000 rows)]?
Also,
20 MB × 200 Rows = 4000 MB
Since 1024 MB = 1 GB, that equals approximately 3.91 GB, right?
Hello @AbdullahMastan ,
Yes, you are correct! Thank you so much for your support!
BTW, if you have multiple LOBs then the expression looks like:
20 MB × 200 rows × 2 LOB columns = 8000 MB ≈ 7.81 GB
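That multi-LOB expression is just the single-LOB estimate scaled by the column count (again, illustrative arithmetic, not Replicate internals):

```python
# Same worst-case estimate with two LOB columns: memory scales linearly
# with the number of LOB columns in the table.
lob_limit_mb, batch_rows, lob_columns = 20, 200, 2
total_mb = lob_limit_mb * batch_rows * lob_columns
print(f"{total_mb} MB ≈ {total_mb / 1024:.2f} GB")  # 8000 MB ≈ 7.81 GB
```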
Regards,
John.