Qlik Cloud Catalog is a native connector that lets you automate tasks related to the Data Products Catalog and Data Quality workflows. With this connector, you can schedule quality computations for datasets, streamlining data validation processes across your organization. Aligning Data Quality execution with your ELT or ETL processes helps you assess the trustworthiness of your data, especially when it is consumed downstream by analytics, AI models, or other consumers.
This connector does not require additional configuration to authenticate; it automatically connects with the automation owner's Qlik account. Whenever blocks from this connector are executed, they use that account. Additional blocks (such as retrieving data products or getting quality indicators) will be released over time, and you can request new ones through Ideation.
For the initial release, the connector introduces two main capabilities:
Schedule quality computation for selected datasets
Send notifications based on the computation result
Once your datasets are registered in Qlik Cloud Catalog, you can use Qlik Automate to schedule their quality computation with custom parameters. This ensures data quality stays aligned with your freshness and operational needs.
You can configure the computation mode:
Pushdown (for Snowflake and Databricks datasets): computation runs on the cloud data warehouse side (note: it consumes data warehouse credits).
Pullup (Qlik Cloud): computation runs in Qlik Cloud.
Both modes allow you to define a sample size. Pullup uses a head sample; pushdown uses a random sample.
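To make these options concrete, here is a minimal sketch of the kind of values involved. The field names below are assumptions chosen for illustration, not the connector's exact input labels.

```python
# Illustrative only: the kind of inputs a quality computation needs.
# Field names are assumptions for this sketch, not the connector's exact labels.
computation_params = {
    "dataset_id": "c1f9e3d2-0000-0000-0000-000000000000",  # placeholder; taken from the dataset's details panel
    "mode": "pullup",       # "pullup" runs in Qlik Cloud; "pushdown" runs in Snowflake/Databricks
    "sample_size": 100000,  # pullup uses a head sample, pushdown uses a random sample
}
```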
To set this up:
Use the trigger data quality computation block.
Specify the dataset id (found in the dataset's details panel in Qlik Cloud Catalog).
Configure the mode (pushdown or pullup) and sampling options.
Add the trigger mode to the start block of your automation; this is where you can schedule it.
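If you prefer to script the same action outside the no-code flow, the shape of the call would look roughly like the sketch below. This is a hedged illustration only: the endpoint path and payload fields are assumptions, not a documented Qlik Cloud API, and in the automation itself the trigger data quality computation block handles this for you. Check qlik.dev before relying on any public API equivalent.

```python
import os
import requests

# Assumptions for this sketch: tenant URL and API key come from the environment,
# and the endpoint path/payload mirror the connector block's inputs.
TENANT = os.environ["QLIK_TENANT_URL"]   # e.g. "https://your-tenant.us.qlikcloud.com"
API_KEY = os.environ["QLIK_API_KEY"]

def trigger_quality_computation(dataset_id: str, mode: str = "pullup", sample_size: int = 100000):
    """Ask the tenant to compute quality indicators for one dataset (hypothetical endpoint)."""
    resp = requests.post(
        f"{TENANT}/api/v1/data-quality/computations",  # hypothetical path, for illustration only
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"datasetId": dataset_id, "mode": mode, "sampleSize": sample_size},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```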
To know whether your automation ran successfully, or to take action when it fails, you can add blocks that push an alert to the system of your choice. In the template, we send a notification to a Slack channel.
To monitor your automation results, you can then trigger alerts based on the outcome of each run.
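As one illustration of what such an alert can look like outside the built-in Slack block, the sketch below posts a message through a Slack incoming webhook. It assumes you have already created a webhook URL for the target channel; the dataset id and status values are placeholders.

```python
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/your/webhook/url"  # placeholder

def notify_slack(dataset_id: str, status: str) -> None:
    """Post a short status message to a Slack channel via an incoming webhook."""
    text = f"Data quality computation for dataset {dataset_id} finished with status: {status}"
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
    resp.raise_for_status()

# Example usage after the computation finishes:
notify_slack("c1f9e3d2-0000-0000-0000-000000000000", "FAILED")
```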
Future updates will allow threshold-based alerts, letting you trigger actions based on data quality indicator results.
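As a purely conceptual preview of what threshold-based alerting could look like, the sketch below checks indicator values against configured thresholds. The indicator names and threshold values are assumptions for illustration, not connector features.

```python
# Conceptual sketch only: threshold-based alerting on quality indicators.
# Indicator names and thresholds below are illustrative assumptions.
QUALITY_THRESHOLDS = {
    "completeness": 0.95,  # alert if fewer than 95% of values are populated
    "validity": 0.90,      # alert if fewer than 90% of values pass validation rules
}

def breached_indicators(indicators: dict[str, float]) -> list[str]:
    """Return the indicators that fall below their configured threshold."""
    return [
        name
        for name, threshold in QUALITY_THRESHOLDS.items()
        if indicators.get(name, 0.0) < threshold
    ]

# Example: completeness is fine, validity breaches its threshold.
print(breached_indicators({"completeness": 0.98, "validity": 0.85}))  # ['validity']
```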