Our DW target db (MS SQL Server) transaction log is filling up quickly, causing some tasks to fail. Recovery mode is set to Simple and the transaction log is currently at 25 GB. Is there any option in Qlik Compose to have the transactions commit in batches? Trying not to keep increasing the transaction log file size.
@Al_gar We don't have an option in Compose to commit the transactions in batches. We do have a few best practices to control transaction log growth:
1. Make sure you have a good policy in place for cleaning up the logs.
2. In Compose, under the 'Data Warehouse' settings, you can reduce the 'Maximum number of database connections'. Check whether this value was increased in the past to make tasks run faster; our default value is 10.
3. You can split your tasks into smaller groups of tables, run them separately in a workflow, and verify that all tables load.
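Since Compose can't batch commits itself, any large single-statement operation will still be logged as one transaction. For maintenance or cleanup steps you run manually against the DW, a generic batched-delete pattern keeps each transaction small so the log can truncate at checkpoints under SIMPLE recovery. This is an illustrative T-SQL sketch, not a Compose feature; the table, column, and retention values are placeholders:

```sql
-- Check current log usage across databases.
DBCC SQLPERF (LOGSPACE);

-- Generic batched-delete pattern: each loop iteration is its own
-- small implicit transaction, so under SIMPLE recovery the log
-- space can be reused at each checkpoint instead of one huge
-- transaction holding the log open.
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000)                 -- batch size: tune to your log capacity
    FROM dbo.StagingTable              -- placeholder table name
    WHERE LoadDate < DATEADD(DAY, -30, GETDATE());  -- placeholder retention rule

    SET @rows = @@ROWCOUNT;
    CHECKPOINT;                        -- encourage log truncation under SIMPLE
END;
```

The same loop shape works for batched UPDATEs; the key point is that `TOP (n)` bounds how much log space any single transaction can consume.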
Hope this helps!
Thanks,
Nanda