Qlik Community
Multi-tier deployment in a single environment - let apps automatically know which datasources to use!



Major development is often split into several environments or stages, in order to separate development (where changes are made) from production (where the end users are). This is a phased approach to software testing and deployment, where each tier closer to the end user represents more critical infrastructure.

However, some setups must use a single environment for two different stages of development. This can be due to licensing, hardware costs, etc.

Say you have both DEV and TEST data sources accessible in the same Qlik Sense environment. How can you easily handle two streams, one for each purpose?

  • DEV-Finance
  • TEST-Finance

In the script you can easily define e.g.:

SET vG_Environment = 'dev';

which then results in:

lib://data-$(vG_Environment) Finance/

This allows the developer to manually switch between data sources depending on which development stage is being tested. But how do you do this automatically during the different deployment stages? And if there is a third, separate environment (which the developer perhaps isn't allowed to access), how can the administrator save time by not manually editing every single load script?

The solution is to let the script connect to the internal REST connector and read what type of Stream the app is published to!

The stream-type is defined through custom properties.

This enables apps to automatically sense which environment stage they currently reside in. It eases app transfers considerably and saves tons of time.

Put a copy of the developer code (e.g. Developer code to copy.txt) in a shared network folder, which can also contain the QVD files to read. The folder is read-only and accessible to all users within the Qlik Sense Data Load editor.

It works by reading which streams exist and whether they have a DEV or TEST StreamType custom property assigned. It then maps the streams to where the app is published and assumes that is the environment it should reload from. We use a REST connector to read from the repository itself, through the qrs_apphublist and qrs_stream connections. Other APIs can be used as well. (The repository is a PostgreSQL database.)
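Conceptually, the load-script flow looks like this. This is a simplified sketch, not the actual attached code: the REST connector wizard generates far more elaborate SELECT statements, and the table, field, and variable names below are assumptions.

```
// 1) qrs_stream: list all streams and their StreamType custom property.
// 2) qrs_apphublist: find which stream this app is published to.
// 3) Map that stream to 'dev' or 'test' and set the variable.

LIB CONNECT TO 'qrs_stream';
// ...wizard-generated SELECT, reduced to: stream_id, stream_type...

LIB CONNECT TO 'qrs_apphublist';
// ...wizard-generated SELECT, reduced to: app_id, stream_id...

// Map this app's stream to an environment, defaulting to 'dev':
LET vG_Environment = ApplyMap('Map_Stream_To_Env', '$(vStreamId)', 'dev');
```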

Here is how to accomplish this...

For the administrator:

In the below scenario we have two machines:

Machine 1 - acting as a combined DEV and TEST machine. This machine is assumed below unless otherwise stated, as in step 6.

Machine 2 - PROD

1. Adding REST connectors

Add the following internal REST connectors (qrs_stream and qrs_apphublist) and make sure they are added and accessible for administrators.

You can imitate the properties of the other built-in REST connectors, or just use the following:

  • Timeout: 900
  • Method: GET
  • Auto-detect response type: Yes
  • Key Generation strategy: Sequence ID
  • Use Windows Authentication: Yes
  • Skip Server Certificate Validation: Yes
  • Use Certificate: No
  • Pagination type: None
  • Query parameters: xrfkey: 0000000000000000
  • Query headers: X-Qlik-XrfKey: 0000000000000000; User-Agent: Windows

Have a look at the help site on how the APIs work, in case of questions. Yes, the built-in qrs_app could be used, but as we will use these connections extensively we want our very own connectors that we can configure and troubleshoot.
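For reference, the two connectors point at QRS endpoints roughly as follows. The app/hublist/full URL is stated later in this article; the stream endpoint is an assumption based on the standard QRS API.

```
// Connector name   -> URL
// qrs_apphublist   -> https://localhost/qrs/app/hublist/full
// qrs_stream       -> https://localhost/qrs/stream/full   (assumed)
```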

Remember to deny access for anyone else than administrators.

2. Add a new Custom Property

Add a new custom property called "StreamType".

Add two values: dev, test

3. Add the custom property values to the relevant streams

Apply these two values to the relevant streams, e.g.

  • DEV-Finance
    • StreamType - "dev"
  • TEST-Finance
    • StreamType - "test"

4. Create a new app, with the sole purpose of extracting Stream ID and corresponding custom properties.

Create a new app (code in "Streams to QVD.txt") which extracts which Stream IDs have the corresponding custom properties and saves this to a file. This uses our REST connector "qrs_stream".

As streams are not added very often, reloading once a day should be enough, so remember to schedule it.

The reload task should output a QVD file called "stream_id_to_env.qvd".
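A minimal sketch of what "Streams to QVD.txt" presumably does. The exact nested-JSON SELECT is generated by the REST connector wizard, and the field names below (stream_id, stream_type) are assumptions:

```
LIB CONNECT TO 'qrs_stream';

// The wizard generates the full SELECT, including the nested
// customProperties subtable; the net result needed is one row
// per stream with its StreamType value (assumed joined in here
// as stream_type).
Streams_Raw:
SQL SELECT "id" AS stream_id, "name" AS stream_name
FROM JSON (wrap on) "root";

// Keep only streams that actually have a StreamType assigned:
Stream_Id_To_Env:
NoConcatenate LOAD stream_id, Lower(stream_type) AS environment
RESIDENT Streams_Raw
WHERE Len(stream_type) > 0;

STORE Stream_Id_To_Env INTO [lib://Services/stream_id_to_env.qvd] (qvd);
DROP TABLE Streams_Raw;
```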

5. Make the developers code easily accessible

Now, put the attached code (whichenvironment.txt) in a text file which is accessible (read-only) to the developers, e.g. in the "Services" library.

This folder needs to be a shared network drive, so that the developer can easily copy the file content.

Remember to add the folder as a library as well. The library should be accessible to everybody, so that the developer can read "stream_id_to_env.qvd" from his or her script.

In the DEV/TEST environment our apps read this text file, connect to the internal QRS API to see which stream the app is published to, and set the corresponding variable. If the app title contains "_DEVELOPMENT", the rest of the script is skipped and "dev" is assumed; this allows the developer to use the include file despite not being able to access the qrs_stream and qrs_apphublist data sources.

The app then knows if it is to access "dev" or "test" data sources.

Developers can also manually define which connection to use, during testing.
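The logic in whichenvironment.txt can be sketched like this. All names here are assumptions, and the real file contains the wizard-generated qrs_apphublist SELECT rather than a placeholder comment:

```
// Developer escape hatch: "_DEVELOPMENT" in the title forces dev
// and skips the QRS lookups entirely.
IF Index(DocumentTitle(), '_DEVELOPMENT') > 0 THEN
    SET vG_Environment = 'dev';
ELSE
    // Stream id -> environment, produced daily by the admin app:
    Map_Stream_To_Env:
    MAPPING LOAD stream_id, environment
    FROM [lib://Services/stream_id_to_env.qvd] (qvd);

    // Which stream is this app published to?
    LIB CONNECT TO 'qrs_apphublist';
    // ...wizard-generated SELECT into a table Hub: app_id, stream_id...

    // DocumentName() returns this app's id; default to 'dev' if unmapped:
    LET vStreamId = Lookup('stream_id', 'app_id', DocumentName(), 'Hub');
    LET vG_Environment = ApplyMap('Map_Stream_To_Env', '$(vStreamId)', 'dev');
END IF
```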

6. Repeat the above steps in any other environment

In DEV/TEST, the file whichenvironment.txt contains a complex connection string to https://localhost/qrs/app/hublist/full, which only root admins and service accounts can access.

In another environment (e.g. PROD) where there is only a single tier, whichenvironment.txt can be very simple and just contain a SET statement for the app to read, so it knows which data source to use.
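In a single-tier PROD environment, for example, the entire whichenvironment.txt can collapse to a single statement (variable name taken from the examples above):

```
// whichenvironment.txt in PROD - no QRS lookup needed:
SET vG_Environment = 'prod';
```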

Remember to share the network folder in PROD as well.

For the developer:

Include the following in the script:

//Feature to sense which environment we are working in

//To manually override, e.g. during development, use:
//SET vG_Environment = 'dev';

//You can also just add "_DEVELOPMENT" to the app title if it doesn't work.

//Use with e.g. lib://data-$(vG_Environment) Finance/
//or LIB CONNECT TO 'data-$(vG_Environment)';
//If it doesn't work, try to include brackets [].

trace Your current environment is '$(vG_Environment)';
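The environment detection itself lives in the shared file and is pulled in with an include statement. The path below is an assumption, based on the "Services" library described above:

```
// Pull in the admin-maintained detection script; Must_Include
// fails the reload loudly if the file cannot be read:
$(Must_Include=lib://Services/whichenvironment.txt);

// Then use the variable when connecting:
// LIB CONNECT TO 'data-$(vG_Environment)';
```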


No matter what I try, I get an authentication error when setting up the first data connection.

I am using the same user/password as the one running the services.



The connection string changed with version 3.2 and higher. Ensure that the built-in internal connectors work first, and see if you can use them (e.g. qrs_app). You may compare your settings to those. Here is a guide: Qlik Support

No need to fill out the username and password.


You guided me in the right direction.

In the June 2017 update 2 server version there is no built-in qrs_app, but they are named as below.


By looking into monitor_apps_REST_app I found out that I was missing the trusted location. After adding that, it worked once I also added the User-Agent: Windows header.

Also, in the June 2017 release you have to add a user/password.



Ah I see. I have only used this in 3.2 SR3, but will evaluate in September release shortly. Did you get it to work?



Yes, it is working nicely now, which means nothing has to be changed in the apps when publishing from TEST to PROD 🙂

Thanks for sharing this!

It really is something that is lacking in the installation. Hopefully there will come a day when you can pass a parameter from the task to the load script, as can be done in QlikView.

There is a typo in the text above: https://localhost/qrs/apphublist/full should be https://localhost/qrs/app/hublist/full.

Also, when scheduling the scripts you need to set the rights on the new data connections so that the user running the services is allowed to use them. My installation was new, so no such rules existed.



I've updated the text for the typo. Regarding the rights on the data connections, yes, I will bring that up later.

Version history
Revision #: 1 of 1
Last update: 2017-05-31 02:38 PM