
Promotion process for task between environments

KellyHobson
Support

Hey there,

I worked with a customer who was trying to come up with a best practice for promoting tasks across environments (DEV, TEST, PROD). Each environment has its own source and target endpoint definitions. When you migrate a task and swap the source endpoint, you lose the table selection list, which was a real problem for the customer.

The workarounds we came up with for migrating a PROD task to DEV were:

1: Import the PROD task into DEV, change the endpoints to DEV in the UI, export the task as JSON, update that JSON with the table list from the original PROD JSON file, and re-import it

2: Copy a version of the PROD task JSON and edit it to point at the DEV endpoint definitions

Both options are fairly manual and open the door to human error.
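The JSON surgery in workaround 1 can at least be scripted to cut down the human-error risk. A minimal sketch in Python, assuming the table selection lives at a nested path like the one below (the exact key path varies by Replicate version and is an assumption here, so inspect your own task export first):

```python
import copy

# ASSUMPTION: where the included-tables list lives inside an exported task.
# Open your own PROD export and adjust this path to match its layout.
TABLE_PATH = ("cmd.replication_definition", "tasks", 0, "source", "source_tables")

def get_nested(doc, path):
    """Walk a mixed dict/list structure down the given key path."""
    for key in path:
        doc = doc[key]
    return doc

def set_nested(doc, path, value):
    """Set the value at the end of the key path, leaving the rest untouched."""
    for key in path[:-1]:
        doc = doc[key]
    doc[path[-1]] = value

def restore_table_list(prod_json: dict, dev_json: dict) -> dict:
    """Return a copy of the DEV task JSON with PROD's table list merged in."""
    merged = copy.deepcopy(dev_json)
    set_nested(merged, TABLE_PATH, copy.deepcopy(get_nested(prod_json, TABLE_PATH)))
    return merged
```

Load both exports with `json.load`, run `restore_table_list`, and write the result out for re-import; the DEV endpoints stay as set in the UI while the PROD table list comes along.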

I'm proposing a feature request to retain the table list in a task when you need to swap the source endpoint definition.

Thank you,

Kelly  

5 Comments
pettitsd
Partner - Contributor III

That would be a great starting point. But why not build on it by enabling a full CI/CD process integrated with a Git repository?

KellyHobson
Support

Great suggestion @pettitsd !! 

slewis
Contributor II

We all know that manually making changes in production is bad bad bad...and error prone. My approach to the exact same problem was to:

1) Name source and target endpoints the same in all environments
2) Test the task in Dev (assuming the Dev, Test, Prod environments stated above)
3) Once satisfied, use the Export without Endpoints option to write out the task JSON
4) Copy the task JSON to the Test or Prod Replicate server and run Import Task

Exporting without endpoints leaves a relative reference to the endpoints in the JSON but does not include any of the connection string metadata. As long as the next environment already has an endpoint with a matching name, the task imports without overwriting the existing endpoint metadata. If you export JSON with endpoints and import it into another environment, you will either create a new endpoint (good only the first time) or overwrite an existing endpoint's connection string metadata...which is no bueno.
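That name-matching contract can be checked mechanically before importing. A minimal sketch, assuming the export stores its endpoint references under a `databases` array with `name` fields (an assumption about the export layout; verify against your own file):

```python
# ASSUMPTION: endpoint references in an endpoint-less export sit under
# a "databases" array with "name" fields. Adjust to your export's layout.
def referenced_endpoints(task_json: dict) -> set:
    """Collect the endpoint names a task export refers to."""
    databases = task_json.get("cmd.replication_definition", {}).get("databases", [])
    return {db["name"] for db in databases}

def missing_endpoints(task_json: dict, existing_names: set) -> set:
    """Names the task needs that the target server does not yet define."""
    return referenced_endpoints(task_json) - existing_names
```

Run this against the target server's known endpoint names before step 4; an empty result means the import should bind cleanly to the existing endpoints.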

There are still manual tasks here, but at least you are guaranteed to have the "same" task in all environments and you don't have to worry about your tables disappearing when endpoints are changed.

I have found no benefit to naming tasks with environment indicators as Replicate and AEM both indicate the environment for you already and it just gets in the way of no-touch deployments/migrations.

At some point I'll integrate this same approach into a CI/CD pipeline that uses the Replicate API to export and import task JSON into the appropriate environment. Once this is set up, a truly no-touch deployment process will be available. I suppose creating/verifying source and target endpoints could also be added to the CI/CD pipeline to eliminate this manual step as well.
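A sketch of what that pipeline step might look like, using only Python's standard library. The base URL, path shapes, and session header below are placeholders modeled on the Enterprise Manager REST API and must be confirmed against your AEM version's API documentation:

```python
import urllib.request

# PLACEHOLDER: real host, API paths, and auth come from your AEM docs.
AEM_BASE = "https://aem.example.com/attunityenterprisemanager/api/v1"

def task_action_url(server: str, task: str, action: str) -> str:
    """Build the export/import URL for a task on a managed server."""
    return f"{AEM_BASE}/servers/{server}/tasks/{task}?action={action}"

def promote_task(src_server: str, tgt_server: str, task: str, token: str) -> None:
    """Export a task from one managed server and import it into another."""
    headers = {"EnterpriseManager.APISessionID": token}  # assumed header name
    export_req = urllib.request.Request(
        task_action_url(src_server, task, "export"), headers=headers
    )
    task_json = urllib.request.urlopen(export_req).read()
    import_req = urllib.request.Request(
        task_action_url(tgt_server, task, "import"),
        data=task_json, headers=headers, method="POST",
    )
    urllib.request.urlopen(import_req)
```

Wrapped in a pipeline job, this would move the exported JSON between "known" servers without anyone touching a UI.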

With all this said, I'd like to upvote a feature request for the system to provide deployment capabilities. Perhaps it can be tied into AEM, which can be aware of multiple Replicate servers, and could orchestrate the migration between "known" servers.

pettitsd
Partner - Contributor III

@slewis , sounds like you have a solid setup. I was not as wise and didn't manage to convince my team to use such a straightforward naming convention. As a result we have names that include environment information, so we have more editing to do. Still, it would be possible to automate those edits with the proper tool(s).
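Those environment-suffix edits are indeed automatable. A minimal sketch, assuming endpoint names carry a suffix like `_DEV`/`_PROD` (the naming pattern is an assumption; adjust the expression to whatever convention your names actually follow):

```python
import re

# ASSUMPTION: environment markers appear as an "_ENV" suffix on names,
# e.g. "Oracle_Src_DEV". Operating on the raw JSON text is blunt but
# catches the name wherever it appears in the export.
def retarget_environment(task_json_text: str, src_env: str, tgt_env: str) -> str:
    """Rewrite every _SRC-ENV suffix in the export text to the target env."""
    return re.sub(rf"_{re.escape(src_env)}\b", f"_{tgt_env}", task_json_text)
```

Running this over the exported text before import replaces the manual renames, after which the import behaves just like the name-matched approach.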

It would be so much more efficient for Qlik to make this part of the product than for each of us to build it on our own.

slewis
Contributor II

@pettitsd It took some pain to get there as well. Initially I was exporting JSON and manually renaming the endpoints for the next environment(s). That became too error prone and time consuming, so the next step became obvious.