Hi, we have a data load process from a source (SAP) to an Azure SQL table.
After the full data load completes, we would like to run an UPDATE statement on the target table.
For this we created a SQL trigger (an INSERT trigger) on the target table, but it isn't getting fired.
Does Replicate allow triggers to work on the target SQL table?
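For context, the setup described is roughly the following. This is only a sketch; the table, column, and trigger names are hypothetical, not taken from the actual task:

```sql
-- Hypothetical target table with an AFTER INSERT trigger on Azure SQL.
-- All names here are illustrative.
CREATE TRIGGER trg_orders_after_insert
ON dbo.orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Post-load update: flag the rows that were just inserted.
    UPDATE o
    SET o.load_status = 'LOADED'
    FROM dbo.orders AS o
    INNER JOIN inserted AS i
        ON o.order_id = i.order_id;
END;
```

The question is why a trigger like this does not fire when Replicate performs the full load.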
Good catch @replicate
There is an option on the Advanced tab of the target endpoint, "Use BCP for loading tables", which, if enabled, prevents triggers from firing. Please test with this option disabled. As you might expect, load performance may be impacted.
For reference:
Limitations | Qlik Replicate Help
Thanks,
Dana
>> Does Qlik use bulk insert under the hood, or a normal insert?
Check under "Manage Endpoint Connections" -> endpoint -> "Advanced" -> "Use BCP for loading tables".
The default is to use BCP (bulk load), in which case triggers are NOT executed. Try again with BCP disabled? Or set the FIRE_TRIGGERS option, but be aware of the costs: bulk copy operations that would usually be minimally logged will be fully logged.
https://www.sqlservercentral.com/forums/topic/bcp-and-triggers --> perhaps specify FIRE_TRIGGERS under "Additional ODBC connection properties"?
Instead of a trigger for every row, might it not be better to run whatever needs to be done once, after the full load?
Hein.
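For reference, the server-side behavior Hein describes can be illustrated with the T-SQL BULK INSERT statement, which by default does not fire INSERT triggers and only runs them when FIRE_TRIGGERS is specified. This is a syntax sketch with illustrative table and file names, not Replicate's internal mechanism:

```sql
-- Default bulk load: INSERT triggers on the target table do NOT fire.
BULK INSERT dbo.orders
FROM '/data/orders.csv'
WITH (FORMAT = 'CSV');

-- With FIRE_TRIGGERS the triggers run, but the operation is fully
-- logged instead of minimally logged, so the load can be much slower.
BULK INSERT dbo.orders
FROM '/data/orders.csv'
WITH (FORMAT = 'CSV', FIRE_TRIGGERS);
```

The same trade-off applies whichever bulk path is used: firing triggers per row forfeits the minimal-logging fast path.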
Hi @replicate
Since the trigger is defined on the target database, it should fire on every inserted record. I am not aware of a way that Replicate could interfere with this. You might check with Azure Support on why it is not working.
Thanks,
Dana
It's a small table, but you are right: having a trigger fire for each insert is not the best approach. On the other hand, do we have the ability to execute a stored procedure after the full data load to do the update?
Hi @replicate
Some target endpoints, such as S3, allow running a script after uploading a file, but the Azure SQL target does not.
Thanks,
Dana
Hello @replicate ,
In addition to @Dana_Baldwin's comment: you can set the task to stop automatically after Full Load completes. This is useful if you need to perform DBA maintenance operations on the target side before the task's Apply Changes phase begins.
Regards,
John.
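Combining the two suggestions above, one workaround is to let the task stop after Full Load and then run the post-load update yourself, either manually or from an external scheduler, before resuming the task. A minimal sketch, with hypothetical procedure, table, and column names:

```sql
-- Hypothetical post-load procedure; run it once the task has stopped
-- after Full Load and before Apply Changes resumes.
CREATE OR ALTER PROCEDURE dbo.post_full_load_update
AS
BEGIN
    SET NOCOUNT ON;
    -- Same work the trigger would have done, applied once in bulk.
    UPDATE dbo.orders
    SET load_status = 'LOADED'
    WHERE load_status IS NULL;
END;
GO

EXEC dbo.post_full_load_update;
```

Doing the update once as a set-based statement is also typically cheaper than a per-row trigger.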
That's unfortunate. Is it possible to request this basic functionality, calling a stored procedure/function on the target SQL database, as a feature, or is it a design limitation?
Hi @replicate
You can submit feature requests directly to our Product Management team here:
https://community.qlik.com/t5/Ideas/idb-p/qlik-ideas
Thanks,
Dana