Google BigQuery limits cached query results to approximately 10 GB compressed. Queries that return larger results exceed this limit and frequently fail with the following error:
Response too large to return. Consider setting allowLargeResults to true in your job configuration
Environment:
Qlik Google BigQuery Connector (Qlik Cloud)
Qlik Google BigQuery Connector (Qlik Sense Enterprise on Windows August 2021 and later)
Resolution
To mitigate the "Response too large" error, configure queries to run with a destination dataset specified:
In Standard SQL, specify a destination table for the query results.
In Legacy SQL, specify a destination table and set the allowLargeResults option.
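The two options above can be sketched with the bq command-line tool. This is an illustration, not the Qlik connector configuration itself; the dataset and table names (`mydataset`, `large_results`) are placeholders you would replace with your own.

```shell
# Standard SQL: write results to a destination table so the
# 10 GB cached-result limit does not apply.
bq query \
  --use_legacy_sql=false \
  --destination_table=mydataset.large_results \
  'SELECT * FROM `bigquery-public-data.samples.shakespeare`'

# Legacy SQL: a destination table alone is not enough; the
# allowLargeResults flag must also be set.
bq query \
  --use_legacy_sql=true \
  --allow_large_results \
  --destination_table=mydataset.large_results \
  'SELECT * FROM [bigquery-public-data:samples.shakespeare]'
```

The Qlik connector settings described below apply the same job configuration on your behalf for every generated query.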
Step-by-step guide:
Create a destination dataset under your Google BigQuery project.
Make sure to set "Table expiration settings" for this dataset, as the temporary tables created by the driver will inherit the settings from the parent dataset.
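Creating the dataset with a default table expiration can be sketched with the bq command-line tool (the same setting is available in the BigQuery console UI). The project and dataset names (`my-project`, `qlik_destination`) and the one-hour expiration are placeholder assumptions.

```shell
# Create the destination dataset with a default table expiration
# (in seconds; 3600 = 1 hour). Temporary tables created by the
# driver inherit this expiration, so they are cleaned up
# automatically instead of accumulating storage costs.
bq mk \
  --dataset \
  --default_table_expiration=3600 \
  my-project:qlik_destination
```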
Configure the Qlik Google BigQuery Connector with the following settings:
For Standard SQL, provide the dataset name in the connector settings.
For Legacy SQL, select Legacy SQL from the Language Dialect dropdown, enable the Allow Large Result Sets checkbox, and provide the dataset name.
Performance considerations:
As Google's documentation notes, "Be aware that writing very large result sets to destination tables impacts query performance (I/O)". Enabling the large result sets settings will therefore impact query performance and may slow down reloads.
To improve performance when using the destination table strategy, it is advisable to enable the High-Throughput API.
Performance can be further improved by enabling the Bulk Reader checkbox, which uses an alternate processing method to optimize reloads.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.