
How to deploy Qlik Talend Cloud Pipelines across spaces using Qlik Automate

Last Update: Sep 29, 2025 10:50:41 AM
Updated By: VincentM
Created date: Sep 29, 2025 10:48:14 AM

Qlik Automate is a no-code automation and integration platform that lets you visually create automated workflows. It allows you to connect Qlik capabilities with other systems without writing code. Powered by Qlik Talend Cloud APIs, Qlik Automate enables users to create powerful automation workflows for their data pipelines.

Learn more about Qlik Automate.

 

In this article, you will learn how to set up Qlik Automate to deploy a Qlik Talend Cloud pipeline project across spaces or tenants.

To ease your implementation, Qlik Automate provides a template that you can customize to fit your needs.
You will find it in the template picker: navigate to Add new > New automation > Search templates, search for 'Deploying a Data Integration pipeline project from development to production' in the search bar, and click Use template.

ℹ️ This template will be generally available on October 1, 2025.

Use case and prerequisites

In this deployment use case, the development team made changes to an existing Qlik Talend Cloud (QTC) pipeline.

As the deployment owner, you will redeploy the updated pipeline project from a development space to a production space where an existing pipeline is already running.

To reproduce this workflow, you'll first need to create:

  • Two data spaces:
    • A DEV space
    • A PROD space
  • Two source databases:
    • A DEV source database
    • A PROD source database
  • Two target databases:
    • A DEV target database
    • A PROD target database

Using separate spaces and databases ensures a clear separation of concerns and responsibilities within an organization and reduces the risk to production pipelines while the development team works on feature changes.

 

Workflow steps:

  1. Export the updated pipeline project from the DEV space
  2. Get the project variables from DEV
  3. Update the project variables for PROD before import
  4. Stop the pipeline currently running in production
  5. Import your project to the PROD space
  6. Prepare your project and check the status
  7. Restart the project in PROD

ℹ️ Note: This is a re-deployment workflow. For initial deployments, create a new project prior to proceeding with the import.

 

Step-by-step workflow

1. Export the updated pipeline project from the DEV space


Use the 'Export Project' block to call the corresponding API, using the ProjectID.

This will download your DEV project as a ZIP file. In Qlik Automate, you can use various cloud storage options, e.g., OneDrive. Configure the 'Copy File on Microsoft OneDrive' block to store the file at the desired location.

To avoid duplicate file names (which may cause the automation to fail) and to easily differentiate your project exports, use the 'Variable' block to define a unique prefix (such as a dateTime value).
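If you prefer to script this step outside the automation, the Python sketch below illustrates the same idea: export the project over the REST API and save the ZIP with a timestamp prefix. The tenant URL, environment variable names, and the export endpoint path are illustrative assumptions, not the documented API; in the template, the 'Export Project' block handles the actual call for you.

```python
import os
from datetime import datetime, timezone

import requests

# Illustrative setup only: tenant URL, API key, project ID, and the endpoint
# path below are assumptions, not the documented Qlik Talend Cloud API.
TENANT = os.environ["QLIK_TENANT_URL"]        # e.g. https://your-tenant.region.qlikcloud.com
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
DEV_PROJECT_ID = os.environ["DEV_PROJECT_ID"]

# Export the DEV project as a ZIP archive (assumed endpoint path).
resp = requests.post(
    f"{TENANT}/api/v1/di-projects/{DEV_PROJECT_ID}/export",
    headers=HEADERS,
    timeout=120,
)
resp.raise_for_status()

# Prefix the file name with a timestamp so repeated exports never collide,
# mirroring the dateTime variable defined in the 'Variable' block.
prefix = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
file_name = f"{prefix}_dev_project_export.zip"
with open(file_name, "wb") as f:
    f.write(resp.content)
print(f"Saved export to {file_name}")
```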

 

2. Get the project variables from DEV

From the 'Qlik Talend Data Integration' connector, use the 'Get Project Binding' block to call the API endpoint.

The 'bindings' are project variables that are tied to the project and can be customized for reuse in another project. Once you test-run the block, copy the text response from the 'History' tab in the block configuration pane on the right side of the automation canvas and store it for later use:

[Screenshot: text response shown in the 'History' tab of the 'Get Project Binding' block]
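As a rough equivalent outside the automation, a minimal Python sketch of this step could look like the following. The endpoint path and response shape are assumptions; the saved JSON file simply plays the role of the text response you copy from the 'History' tab.

```python
import json
import os

import requests

# Assumed setup and endpoint path, for illustration only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
DEV_PROJECT_ID = os.environ["DEV_PROJECT_ID"]

# Fetch the project 'bindings' (project variables) from the DEV project.
resp = requests.get(
    f"{TENANT}/api/v1/di-projects/{DEV_PROJECT_ID}/bindings",
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()

# Keep the raw response as a template for the PROD values in the next step.
with open("dev_bindings.json", "w") as f:
    json.dump(resp.json(), f, indent=2)
```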

 

3. Update the project variables for PROD before import

We will now use the 'bindings' from the previous step as a template to adjust the values for your PROD pipeline project, before proceeding with the import.

From the automation, use the 'Update Project Bindings' block. Copy the response from the 'Get Project Binding' block into a text editor and update the DEV values with the appropriate PROD variables (such as the source and target databases). Then, paste the updated text into the Variables input parameter of the 'Update Project Bindings' block.

ℹ️ Note: these project variables are not applied dynamically when you run the 'Update Project Bindings' block in Qlik Automate. They are appended to the project and only take effect when you import it.
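The sketch below shows the same idea in Python: take the DEV bindings as a template, swap in PROD values, and push them to the target project before the import. The binding names, payload shape, and endpoint path are illustrative assumptions; use the names that actually appear in your 'Get Project Binding' response.

```python
import json
import os

import requests

# Assumed setup; endpoint path and payload shape are illustrative only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]

# Start from the DEV bindings saved in the previous step.
with open("dev_bindings.json") as f:
    payload = json.load(f)

# Hypothetical binding names -- replace with the ones from your own response.
prod_values = {"sourceDatabase": "PROD_SOURCE_DB", "targetDatabase": "PROD_TARGET_DB"}
for binding in payload.get("bindings", []):
    if binding.get("name") in prod_values:
        binding["value"] = prod_values[binding["name"]]

# Push the updated bindings; they only take effect once the project is imported.
resp = requests.put(
    f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/bindings",
    headers=HEADERS,
    json=payload,
    timeout=30,
)
resp.raise_for_status()
```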

 

4. Stop the pipeline currently running in production

For a Change Data Capture (CDC) project, you must stop the project before proceeding with the import.

Use the 'Stop Data Task' block from the 'Qlik Talend Data Integration' connector. You will find the connectors in the Block Library pane on the left side of the automation canvas.

Fill in the ProjectID and TaskID:

[Screenshot: 'Stop Data Task' block configuration with ProjectID and TaskID]

ℹ️ We recommend using variable-based logic to handle task stopping in the automation. Refer to the template configuration and customize it to your needs.
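For reference, stopping a task through the REST API could be sketched as below; the endpoint path is an assumption, and in the template the same call is made by the 'Stop Data Task' block using the ProjectID and TaskID variables.

```python
import os

import requests

# Assumed setup and endpoint path, for illustration only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]
PROD_TASK_ID = os.environ["PROD_TASK_ID"]

# Stop the running CDC task before overwriting the project with the import.
resp = requests.post(
    f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/di-tasks/{PROD_TASK_ID}/actions/stop",
    headers=HEADERS,
    timeout=60,
)
resp.raise_for_status()
print("Stop request accepted:", resp.status_code)
```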

 

5. Import your project to the PROD space

You’re now ready to import the DEV project contents into the existing PROD project.

⚠️ Warning: Importing the new project will overwrite any existing content in the PROD project.

Using the OneDrive and 'Import Project' blocks, we will import the previously saved ZIP file.

ℹ️ In this template, the project ID is handled dynamically using the variable block. Review and customize this built-in logic to match your environment and requirements.
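Scripted directly, the import could look roughly like the sketch below. The endpoint path, multipart field name, and ZIP file name are illustrative assumptions; in the template, the OneDrive and 'Import Project' blocks take care of this for you.

```python
import os

import requests

# Assumed setup; endpoint path and upload field name are illustrative only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]

# Hypothetical file name produced by the export step.
zip_path = "20250929T120000Z_dev_project_export.zip"

# Upload the DEV export into the existing PROD project.
# Warning: this overwrites the current PROD project content.
with open(zip_path, "rb") as f:
    resp = requests.post(
        f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/import",
        headers=HEADERS,
        files={"file": (os.path.basename(zip_path), f, "application/zip")},
        timeout=300,
    )
resp.raise_for_status()
```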
 


 

Once this step completes, your project is deployed to production.

 

6. Prepare your project and check the status

You must prepare your project before restarting it in production. Preparing ensures it is ready to run by creating or recreating the required artifacts (such as tables).

The 'Prepare Project' block uses the ProjectID to prepare the project tasks according to the built-in project logic. You can also specify one or more specific tasks to prepare using the 'Data Task ID' field. In our example, we reuse the previously set variable to prepare the same PROD project we just imported.

If your pipeline is damaged and you need to recreate artifacts from scratch, enable the 'Allow recreate' option. Caution: this may result in data loss.
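A minimal Python sketch of the prepare call follows; the endpoint path, the 'allowRecreate' field, and the response field holding the action ID are assumptions based on the block's options and the 'actionID' mentioned below.

```python
import os

import requests

# Assumed setup; endpoint path and payload fields are illustrative only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]

# Trigger a prepare on the imported PROD project. Leave allowRecreate False
# unless you accept the risk of data loss described above.
resp = requests.post(
    f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/actions/prepare",
    headers=HEADERS,
    json={"allowRecreate": False},
    timeout=60,
)
resp.raise_for_status()

# The response is assumed to carry the action ID used for status polling.
action_id = resp.json().get("actionId")
print("Prepare triggered, action id:", action_id)
```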

 


Triggering a 'Prepare' results in a new 'actionID'. This ID is used to query the action status via the 'Get Action Status' API block in Qlik Automate. We use an API polling strategy to check the status at a preset frequency. 

Depending on the number of tables, the preparation can take up to several minutes.

Once we get confirmation that the preparation action is 'COMPLETED', we can move on to restarting the project tasks.

If the preparation fails, you can define an adequate course of action, such as creating a ServiceNow ticket or sending a message on a Teams channel.

ℹ️ Tip: Review the template's conditional blocks configuration to handle different preparation statuses and customize the logic to fit your needs.
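The polling logic the template implements with conditional blocks can be sketched in Python as below; the endpoint path, the status values other than 'COMPLETED', and the polling interval are assumptions you should adapt to your environment.

```python
import os
import time

import requests

# Assumed setup and endpoint path, for illustration only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]
ACTION_ID = os.environ["PREPARE_ACTION_ID"]

# Poll the prepare action at a preset frequency until it completes or fails,
# mirroring the loop built into the automation template.
for _ in range(60):  # give up after roughly 30 minutes
    resp = requests.get(
        f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/actions/{ACTION_ID}",
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    status = resp.json().get("status")
    if status == "COMPLETED":
        print("Preparation completed; tasks can be restarted.")
        break
    if status in ("FAILED", "CANCELED"):  # assumed failure statuses
        # Plug in your escalation here: ServiceNow ticket, Teams message, etc.
        raise RuntimeError(f"Preparation ended with status {status}")
    time.sleep(30)  # preset polling frequency
else:
    raise TimeoutError("Preparation did not finish within the polling window")
```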

 

7. Restart the project in PROD

Now that your project is successfully prepared, you can restart it in production.

In this workflow, we use the 'List Data Tasks' block to filter on the 'landing' and 'storage' tasks of the production project and restart them automatically.
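As a rough script equivalent, the sketch below lists the project's data tasks, keeps the landing and storage ones, and restarts each. Endpoint paths and the 'type' field name are assumptions; match them against the output of your 'List Data Tasks' block.

```python
import os

import requests

# Assumed setup; endpoint paths and field names are illustrative only.
TENANT = os.environ["QLIK_TENANT_URL"]
HEADERS = {"Authorization": f"Bearer {os.environ['QLIK_API_KEY']}"}
PROD_PROJECT_ID = os.environ["PROD_PROJECT_ID"]

# List the PROD project's data tasks.
resp = requests.get(
    f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/di-tasks",
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()

# Restart only the landing and storage tasks.
for task in resp.json().get("data", []):
    if task.get("type") in ("landing", "storage"):
        start = requests.post(
            f"{TENANT}/api/v1/di-projects/{PROD_PROJECT_ID}/di-tasks/{task['id']}/actions/start",
            headers=HEADERS,
            timeout=60,
        )
        start.raise_for_status()
        print("Restarted task:", task.get("name", task["id"]))
```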

 

All done: your production pipeline has been updated, prepared, and restarted automatically!

Now it’s your turn: fetch the Qlik Automate template from the template library and start automating your pipeline deployments.

 

Related Content

About Qlik Automate

Start a Qlik Talend Cloud® trial

How to get started with the Qlik Talend Data Integration blocks in Qlik Automate

How to use the Qlik Automate templates
