When using Replicate to replicate data to a Google BigQuery target, the attrep_change table is dropped and recreated for each batch sent to BigQuery. The DML this generates for the table can exceed BigQuery's default per-table quota limit of 1500.
Environment
Replicate 7.0
Replicate 2021.5
Resolution
For the BigQuery target in Replicate, you can add the internal parameter below so that the attrep_change table is truncated instead of dropped and recreated, reducing the quota cost for the tables being replicated in the task.
On the Advanced tab of the BigQuery target endpoint, under Internal Parameters, add:
$info.query_syntax.truncate_table (the parameter will not appear in the list; type it and press Enter)
With a value of: TRUNCATE TABLE ${QO}${TABLE_OWNER}${QC}.${QO}${TABLE_NAME}${QC}
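As an illustration only, the sketch below shows how the parameter value's placeholders might expand into the final statement. The placeholder names come from the parameter string above; the assumption (not confirmed by this article) is that ${QO}/${QC} are the identifier quote-open/quote-close characters (backticks for BigQuery) and that ${TABLE_OWNER}/${TABLE_NAME} resolve to the dataset and table names. The dataset name is hypothetical.

```python
from string import Template

# The internal parameter value, copied verbatim from the resolution above.
template = Template("TRUNCATE TABLE ${QO}${TABLE_OWNER}${QC}.${QO}${TABLE_NAME}${QC}")

sql = template.substitute(
    QO="`",                      # assumed quote-open character for BigQuery
    QC="`",                      # assumed quote-close character
    TABLE_OWNER="my_dataset",    # hypothetical dataset name
    TABLE_NAME="attrep_change",  # Replicate's change-processing table
)
print(sql)  # TRUNCATE TABLE `my_dataset`.`attrep_change`
```

This is only to visualize the substitution; Replicate performs the expansion internally, and no manual substitution is required.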
Cause
GCP (Google Cloud Platform) enforces a per-table quota limit in BigQuery, with a default of 1500. During CDC processing, Replicate uses the attrep_change table (a temporary table) to track the batches to send to the target, and drops and recreates it as part of each bulk apply. These drop and recreate statements are DML calls to BigQuery, and each one counts against the table's quota.
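A back-of-the-envelope sketch of why the truncate method helps, using hypothetical batch counts (not Replicate internals): drop-and-recreate issues two statements against the change table per batch, while truncate issues one, halving the calls that count toward the 1500 default limit.

```python
# Hypothetical illustration: quota cost of drop/recreate vs. truncate.
def statements_per_day(batches_per_day: int, statements_per_batch: int) -> int:
    """Statements issued against the change table in one day."""
    return batches_per_day * statements_per_batch

DEFAULT_QUOTA = 1500  # BigQuery's default per-table limit cited above

# Assumed 1000 batches/day, purely for illustration.
drop_recreate = statements_per_day(batches_per_day=1000, statements_per_batch=2)
truncate_only = statements_per_day(batches_per_day=1000, statements_per_batch=1)

print(drop_recreate > DEFAULT_QUOTA)  # True: 2000 calls exceed the quota
print(truncate_only > DEFAULT_QUOTA)  # False: 1000 calls stay under it
```

The actual number of batches depends on the task's change volume and batch tuning, so real quota consumption will vary.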