The purpose of this post is to help you install the database drivers necessary to allow your Qlik Data Gateway to communicate with your company's servers once you have completed the Qlik Data Gateway installation itself.
If you are anything like me, perhaps you panicked a bit at the thought of installing the Qlik Data Gateway in a Linux environment. I have a lot of experience with .EXE installations in Windows environments. You know: "Next – Next – Next – Finish." But an .RPM file? I had never even seen that extension type before. If you were a Linux connoisseur beforehand, you probably guessed that my image for this post is an homage to the Fedora flavor of Linux. Otherwise you just thought it was an advertisement for the new "Raiders of the Lost Data" movie.
In any event, by now you have created your first Data Gateway, applied the registration key, completed the setup instructions and thankfully the command to check your Data Gateway service shows that it is running.
When you go back to the Data Gateway section of the Management Console and do a refresh, your eyes fill with happiness because your brand spanking new Data Movement Gateway shows "Connected."
A lesser person would go celebrate right now. But you've decided to try to connect to a source before doing your happy dance. So, you create a new Data Integration project with the destination of your choice. While you will ultimately have many different data sources, let's imagine that you decide to start with a "SQL Server (Log Based)" connection as your first source test.
You input the server connection details, but your SQL Server doesn't use a standard port for security. Finally, you find information online that you should input your server IP followed by a comma and the port number. As an example, if your server's IP is 39.30.3.1 and your security port is 12345, you would input "39.30.3.1,12345". Next you input the user and password credentials. Your last step is to choose the database. Easy peasy, lemon squeezy. Right?
You press the "Load databases" button but suddenly a dialog comes up telling you that the Data Gateway can't connect because it can't find a SQL Server driver.
Your heart starts beating quickly, but naturally, as a pro, you remain calm on the outside. Eventually you realize that whether on Windows or Linux, applications have always required drivers to communicate with servers. This is nothing new; we just got excited when we saw that "Connected" message and thought we were done. Upon going back to the setup guide, you realize that there is in fact a link labeled "Setting up Data Gateway – Data Movement source connections."
So, you go ahead and click the link and it takes you to:
Wow, so many sources, and so many additional links to click to ensure the required drivers are in place for the sources your company will need. All the documentation is there, but I know firsthand that it can get a bit overwhelming, especially if Linux isn't your native language, which is the reason for this post.
Obviously, every one of you reading this works in an environment that may require different data source connections than the next reader. Thus, there is no way for me to predict and help with your exact configuration. However, odds are strong that most of you require at least SQL Server, Databricks, Snowflake, Postgres, or MySQL, in some combination, or perhaps all of them.
As tedious or imposing as it may be, I highly recommend you walk through the documentation for each data source you will need. But thanks to my buddy John Neal, I have attached a Linux shell script that can be executed to configure all five of those data sources for you. Given the many flavors, versions, and configurations of Linux, I can't ensure that it will work for everyone, but at least it is a start for those who may want to press an easy button, and for those who, like me, may be somewhat or brand new to Linux.
If you choose to take advantage of it, understand that it is only being offered as help, and is not meant to replace the documentation. To utilize it you will need to do the following (please note that in my examples I have changed to the root user; if you are logged in as a normal user account, you may need to use sudo, "superuser do"):
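If you have never run a shell script on Linux before, the general pattern looks like this sketch; the script name below is a placeholder for the actual file attached to this post:

# copy the attached script to your gateway server, then make it executable
chmod +x install_drivers.sh
# run it as root (as in my examples), or prefix with sudo as a normal user
sudo ./install_drivers.sh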
If all went well with the installation, your output should look similar to the following image from my run:
It's almost time to do our happy dance, but let's hold off until we test. In my starting example, I asked you to assume we wanted to test against a "SQL Server (Log Based)" connection. When we left off, it was because we got an error message saying we had no driver while trying to load the list of databases. I will try that again.
Oh no, the heart rate is going up again.
We have successfully installed the Qlik Data Gateway. We have successfully installed the required drivers. Yet, we are getting this new error message. Let's focus on our breathing and try to digest the situation. What could cause our attempt to connect to our data source to time out? I've got it.
It's likely network security. We know what we want to talk to. We know the location. We know the credentials. But our networks aren't always wide open to do the talking. Resolving your connectivity/firewall issues may or may not be within your abilities, and if you are like me, you may need to seek the help of your IT/networking team.
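Before (or while) engaging that team, one quick check you can run from the Data Gateway server itself is whether the database host and port are reachable at all; using the example address from earlier (substitute your own values):

# test TCP connectivity from the gateway server to the database server
nc -vz 39.30.3.1 12345
# if nc is not installed, curl can run a similar probe
curl -v telnet://39.30.3.1:12345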
When I reached out to my friendly IT guru, here within Qlik, he was able to help me get everything in place so that my Linux server could speak with my database servers, including all of the needed ports.
Once those changes were completed, I was able to test, and sure enough my data connection succeeded.
Whether or not you do a happy dance, as I did, I hope that this post has helped you get to that sweet smell of success. After all, someone has to be known as the amazing person who got your Qlik Data Gateway going so that others on the Data Engineering team could create all of those lights-out Qlik Cloud Data Integration projects that will be feeding data in near real time to all of those wonderful analytics use cases. Hopefully, with the help of the documentation and this post, that person is you, my friends.
Challenge
One of the things I've long admired about the Qlik Community is members' willingness to help each other through this Community site. If you are a Linux guru and are so inclined, I would love to see you share other versions of the shell script that I have started. Maybe your organization is using another flavor/version of Linux and you needed to make a few tweaks to my file. Maybe your organization needed Oracle added and you can tweak my file. Whatever the reason, I sure hope you will give back to the community by sharing all of those tweaks here. Who knows, your tweaks might help someone else do their happy dance. And we all know the world is a better place when more people do their happy dance.
Related Content
Qlik Data Gateway - Data Movement prerequisites and Limitations - https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Gateways/dm-gateway-prerequisites.htm
Setting up the Data Movement gateway - https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Gateways/dm-gateway-setting-up.htm
PS - I created both of the images here using a generative AI solution called MidJourney. I hope they've added to the fun of this post.
Is there any option to find out which browser, OS, device, and IP address our users are using to log in to the Qlik Sense production system? Can this data be found in the Qlik Sense monitoring apps or Qlik Sense logs?
Note that this article covers the historical analysis of data and does not cover how to identify access live and have Sense react accordingly.
Environments:
This method requires the logging level of the Proxy log files to be increased and the Extended Security Environment to be enabled. The Extended Security Environment has consequences for the environment, such as disabling the ability to share sessions across multiple devices. See the Qlik online help for details.
Settings to Enable:
The proxy will now log additional information in:
C:\ProgramData\Qlik\Sense\Log\Proxy\Trace\[Server_Name]_Audit_Proxy
Example Output:
Audit.Proxy.Proxy.Core.Connection.ConnectionData [X-Qlik-Security, OS=Windows; Device=Default; Browser=Chrome 67.0.3396.99; IP=::ffff:172.16.16.100; ClientOsVersion=10.0; SecureRequest=true; LicenseContext=UserAccess; Context=AppAccess; ] || [X-Qlik-User, UserDirectory=DOMAIN; UserId=administrator]
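Each ConnectionData entry packs its details into key=value pairs, so once you have a copy of the log on a machine with standard Unix text tools, you can summarize a field quickly (ServerName below is a placeholder, per the path above):

# count which browsers appear in the proxy audit log
grep -o 'Browser=[^;]*' ServerName_Audit_Proxy.txt | sort | uniq -c | sort -rn
# the same pattern works for the OS=, Device=, and IP= fields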
For more information on where to find the logs see
How To Collect Qlik Sense Log Files
A much lighter-weight approach than method 1 is to parse the HubService logs in C:\ProgramData\Qlik\Sense\Log\HubService. No additional settings are required.
Example Output:
::ffff:192.168.56.1 - - [31/May/2019:12:36:40 +0000] "GET /about HTTP/1.1" 304 - "https://SERVERNAME/hub/?qlikTicket=9mDlmVfE-E1Nc3RT" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36"
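These entries follow the familiar access-log layout: the client IP is the first field and the user agent is the last quoted field. Assuming you copy a HubService log to a machine with standard Unix tools (the file name below is a placeholder), a quick summary looks like:

# top client IPs in the HubService log
awk '{print $1}' hubservice.log | sort | uniq -c | sort -rn | head
# top user agents, taken from the sixth double-quote-delimited field
awk -F'"' '{print $6}' hubservice.log | sort | uniq -c | sort -rn | head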
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
In a Windows multi-node deployment, the App Distribution Service (ADS) distributes apps from Qlik Sense Enterprise on Windows to Qlik Sense Enterprise SaaS. The service is installed on every node. However, Qlik Sense does not load balance ADS, meaning that if not all nodes have access to the apps, distribution may fail. See App Distribution from Qlik Sense Enterprise to Qlik Cloud fails when distributed from RIM NODE.
If you wish to disable app distribution from certain nodes:
[appdistributionservice]
Disabled=true
Identity=Qlik.app-distribution-service
DisplayName=App Distribution Service
ExePath=dotnet\dotnet.exe
UseScript=false
[hybriddeploymentservice]
Disabled=true
Identity=Qlik.hybrid-deployment-service
DisplayName=Hybrid Deployment Service
ExePath=dotnet\dotnet.exe
UseScript=false
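A hedged note on applying this change: the section names and keys above follow the layout of the Service Dispatcher's services.conf file, so the assumption here is that you edit that file on each node where distribution should be disabled and then restart the Qlik Sense Service Dispatcher service for the change to take effect.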
This article explains how to extract changes from a Change Store and store them in a QVD by using a load script in Qlik Analytics.
The article also includes
This example will create an analytics app for Vendor Reviews. The idea is that you, as a company, are working with multiple vendors. Once a quarter, you want to review these vendors.
The example is simplified, but it can be extended with additional data for real-world examples or for other “review” use cases like employee reviews, budget reviews, and so on.
The app’s data model is a single table “Vendors” that contains a Vendor ID, Vendor Name, and City:
Vendors:
Load * inline [
"Vendor ID","Vendor Name","City"
1,Dunder Mifflin,Ghent
2,Nuka Cola,Leuven
3,Octan,Brussels
4,Kitchen Table International,Antwerp
];
The Write Table contains two data model fields: Vendor ID and Vendor Name. They are both configured as primary keys to demonstrate how this can work for composite keys.
The Write Table is then extended with three editable columns:
This article provides answers to the most frequent questions asked about Qlik Answers.
For the Qlik MCP FAQ, see Qlik Model Context Protocol (MCP) FAQ.
In February 2026, we launched our new agentic experience, which will enhance decision-making and improve productivity through a combination of assistants and agents running on a cutting-edge architecture. This initial release includes out-of-the-box agents for structured data analytics, unstructured knowledge, discovery of anomalies, and help and assistance. These agents take advantage of our foundational capabilities, including our data products and unique analytics engine, to execute complex, multi-step tasks in a trusted, scalable, and secure manner.
Qlik Answers is the primary AI assistant for people to interface with agentic AI. It will understand the intent of natural language questions and engage the underlying agentic framework to execute tasks, build responses, and take actions.
Qlik Answers now combines structured data analytics with unstructured content and general knowledge and reasoning from LLMs to deliver the most complete and relevant answers and insights, helping our customers improve decisions, productivity, and business outcomes in ways not possible before.
Looking ahead, as we build additional agents, such as prediction agents and pipeline agents, they will all be invoked through Qlik Answers. A broader set of agents is planned, all aimed at helping users get more value from their data and become more productive as Qlik continues to evolve.
With Qlik Answers now able to handle both structured and unstructured data, you can drive hundreds more informed decisions and actions each day. You can drive productivity through automation of a broad range of data and analytics tasks and workflows. And with plug-and-play simplicity, you can quickly deploy assistants in a matter of hours, reducing risk, speeding time-to-value, and future-proofing your investments in AI.
For now, Qlik Answers will continue to be priced based on current models for the number of questions asked. You get capacity at corresponding levels in Standard, Premium, and Enterprise editions, as well as Qlik Sense Enterprise SaaS, with additional capacity available for purchase as needed.
There is currently no additional cost for structured data questions or task automation requests; a question is a question.
For additional details, refer to Pricing.
Since launch, Qlik Answers has been rolled out across regions, and the process is still ongoing. If you have a Standard, Premium, or Enterprise edition, check whether your region already supports it (see Supported regions).
If it is not yet available to you, then:
Yes, you must be a Qlik Cloud customer to use Qlik Answers. Qlik Answers is built on cloud-native technologies, specifically large language models (LLMs) that require significant compute resources and specialized infrastructure, and there is no mechanism to deploy these technologies in an on-premises environment.
However, you don’t have to fully migrate their analytics environment or documents to the cloud to take advantage of Qlik Answers. Analytics apps can be pushed to the cloud as needed to support Qlik Answers. See Qlik Answers and applications distributed from Qlik Sense Enterprise on Windows for details.
No. You will use either Qlik Answers or Insight Advisor, not both at the same time.
Qlik Answers represents the AI-first experience going forward. When a tenant chooses Qlik Answers, that becomes the primary way users interact with analytics. Insight Advisor is not available in parallel within the same tenant.
This is a deliberate choice to avoid duplicated experiences, inconsistent results, and user confusion.
No. Qlik Answers is cloud only.
There are no plans to bring Qlik Answers to on-premises environments. The product relies on cloud native AI services, managed infrastructure, and continuous model evolution.
Insight Advisor is not being discontinued.
If you remain on Insight Advisor, you can continue using it. However, within a tenant, you must choose between Insight Advisor and Qlik Answers. You cannot run both experiences side by side.
The most important and relevant business logic is preserved when moving to Qlik Answers.
That said, Qlik Answers is built for a newer generation of AI-driven analytics. In many cases, customers will find they no longer need to manually build or maintain the same level of logic, because the system handles more of that automatically.
The value is not in recreating everything exactly as it was, but in moving to a simpler, more capable experience.
This is essentially a buy vs build decision:
Qlik Answers is built on AWS Bedrock and currently utilizes Anthropic Claude models. The specific model versions vary by agent function and are continuously evaluated and updated based on performance, accuracy, latency, and cost optimization.
Our Model Selection Philosophy:
Qlik maintains flexibility in model selection to continuously improve the user experience as AI technology evolves. Different agents within the Qlik Answers architecture may use different models optimized for their specific tasks (e.g., semantic understanding, code generation, reasoning).
No. Not at this stage.
Qlik Answers is a managed experience with curated models and configurations. Customers who want to use their own models or bring custom AI stacks should use MCP instead.
Yes. Qlik Answers works on top of existing Qlik Sense applications and uses the same data, logic, and security model.
But to get the best experience, apps should be prepared beforehand:
Yes. Master measures and dimensions are always prioritized. If business logic exists, Qlik Answers uses it rather than creating new calculations.
Yes. Qlik Answers generates appropriate visualizations such as KPIs, bar charts, or time-based charts depending on the question.
Qlik Answers inherits and enforces Qlik's established security model without exception. All existing security rules, section access configurations, and row-level security policies apply automatically.
Key security principles:
Field-level security (if implemented) is respected in all analyses.
No additional security configuration is required. Organizations with complex security requirements can continue using their existing Qlik security implementations with confidence.
Yes, if their access rights differ. Answers are always scoped to the user’s permissions.
While no special data preparation is required beyond standard Qlik Sense data modelling best practices, the apps themselves should be prepared beforehand to give you the best experience possible:
Yes. Qlik Answers understands conversational context, allowing users to refine or continue their analysis.
At its initial GA release, Qlik Answers is optimized and fully supported for English language queries and responses.
While the underlying large language models have multilingual capabilities and may be able to process queries in other languages with varying degrees of accuracy, non-English language support is not officially validated, documented, or supported by Qlik at this time.
Additional language support is planned for future releases based on demand and regional priorities.
No. It accelerates analysis and reduces repetitive work but does not replace human expertise or decision-making.
Yes. Only enabled and indexed applications are available.
Not in the current GA release. Qlik Answers operates within the context of a single Qlik Sense application per query. Multi-application query capabilities are planned for a future release.
If you want to ask questions in an app, you just need the ‘Data analysis’ scope. If you plan on asking questions to an assistant, you need the ‘Data analysis’ and ‘Search knowledge base’ scopes.
Cross-region inference has minimal risks as the data still stays within the AWS Virtual Private Network. The only difference here is that the LLM call gets processed in a different region due to GPU availability.
We have made a deliberate design decision to prioritize the quality of answers and insights over the speed of responses. In general, Qlik provides a far richer reasoning process and answer than competing products, and this results in a longer response time. We are planning to improve and optimize this, as well as introduce a faster mode for simpler questions in the future.
Qlik Answers always references its sources in detail. To begin troubleshooting, check the citations, which will show:
In a case where you do not get the response you expect based on the sources, or you receive an error:
Has your app been prepared for Qlik Answers?
Your Qlik Cloud subscription determines the quota of questions asked by users. If you are licensed for Qlik Answers, both MCP and Qlik Answers will use your monthly question capacity. See Administering Qlik MCP server.
Question capacity quotas are per month and reset every month. When you hit your limit, users can no longer ask questions until the next month. Overage may be allowed, depending on your subscription. For more information, see Qlik MCP server product description.
For more information on overage, see Overage.
Features can be turned off for individual users through user scopes.
See Control access to AI features.
If you have previously enabled the feature, the entirety of Qlik’s Agentic Analytics can easily be turned off again by configuring AI features in Qlik:
See Enable cross-region inference.
Error codes
These error codes identify expected errors. If you receive any of them, retry the request.
Retry and Processing Errors
App and Document Errors
Chart and Sheet Errors
Expression and Hypercube Errors
Semantic Search Errors
Access Verification Errors
This article overviews the available blocks in the Snowflake connector in Qlik Application Automation. It also covers some basic examples of retrieving data from a Snowflake database and creating a record in a database.
The Snowflake connector has the following blocks:
To create a new connection to Snowflake, the following parameters are required:
Warning
Account names that include underscores can cause issues for certain features. For this reason, Snowflake also supports a version of the account name that substitutes the hyphen character (-) in place of the underscore character. For example, both of the following URLs are supported:
URL with underscores:
https://acme-marketing_test_account.snowflakecomputing.com
URL with dashes:
https://acme-marketing-test-account.snowflakecomputing.com
More details about account names can be found in the Snowflake documentation linked below.
The password field is a required field when configuring a Snowflake connection in Qlik Automate. While Snowflake does permit the use of unencrypted private keys, which do not require a passphrase, Qlik's connector mandates the password field for both username and password authentication and key pair authentication with a password-protected keyfile.
If you have a keyfile without a passphrase, tools such as OpenSSL can be used to add one.
Example OpenSSL command to add a passphrase:
openssl pkcs8 -topk8 -in existing_key.p8 -out encrypted_key.p8
-in existing_key.p8: your current private key file
-out encrypted_key.p8: the output file for the encrypted key
You will be prompted to enter and confirm a new passphrase after running the command.
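To confirm that the passphrase was applied, you can ask OpenSSL to parse the encrypted key back; it should prompt for the new passphrase and exit silently on success:

# verify the encrypted private key can be read with the new passphrase
openssl pkey -in encrypted_key.p8 -noout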
The Do Query block can be used to perform actions in Snowflake that aren't supported by the other blocks.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
NPrinting Initial Installation Fails with Error 0x080070643 (or similar)
If any other version of NPrinting (i.e., NPrinting 16 and the earlier client/server track) has previously been installed on this machine, or if other unrelated software is installed, it is recommended to reinstall the Windows Server OS to ensure a clean start before installing NPrinting Server. The same applies if the target Windows Server has been repurposed for NPrinting from some other function; it could already be damaged (e.g., damaged registry files).
In short, it is best to have a clean slate when installing Qlik NPrinting Server for the first time, if you suspect that the underlying Windows Server has become corrupted or damaged, or when Error 0x080070643 appears during an upgrade.
This article explains how to extract changes from a Change Store by using the Qlik Cloud Services connector in Qlik Automate and how to sync them to a database.
The example uses a MySQL database but can easily be modified to use other database connectors supported in Qlik Automate, such as MSSQL, Postgres, AWS DynamoDB, AWS Redshift, Google BigQuery, or Snowflake.
The article also includes:
Content
Here is an example of an empty database table for a change store with:
Run the automation manually by clicking the Run button in the automation editor and review that you have records showing in the MySQL table:
There is no incremental version yet for the Get Change Store History block. While this is on our roadmap, the automation from this article can be extended to do incremental loads by first retrieving the highest updatedAt value from the MySQL table. The steps below explain how the automation can be extended:
SELECT MAX(updatedAt) FROM <your database table>
The solution documented in the previous section will execute the Upsert Record block once for each cell with changes in the change store. This may create too much traffic for some use cases. To address this, the automation can be extended to support bulk operations and insert multiple records in a single database operation.
The approach is to transform the output of the List Change Store History block from a nested list of changes into a list of records that contains the changes grouped by primary key, userId, and updatedAt timestamp.
See the attached automation example: Automation Example to Bulk Extract Change Store History to MySQL Incremental.json.
The provided automations will require additional configuration after being imported, such as changing the store, database, and primary key setup.
Automation Example to Extract Change Store History to MySQL Incremental.json
Automation Example to Bulk Extract Change Store History to MySQL Incremental.json
If field names in the change store don't match the database (or another destination), the Replace Field Names In List block can be used to translate the field names from one system to another.
To add a more readable parameter to track the user who made changes, the Get User block from the Qlik Cloud Services connector can be used to map User IDs into email addresses or names.
A user's name might not be sufficient as a unique identifier. Instead, combine it with a user ID or user email.
Add a button chart object to the sheet that contains the Write Table, allowing users to start the automation from within the Qlik app. See How to run an automation with custom parameters through the Qlik Sense button for more information.
Environment
When authenticating, the Windows authentication prompt keeps asking for login credentials.
Check if you also have the issue when setting the QMC --> Virtual Proxy --> Windows authentication pattern to "Form" (instead of "Windows").
If you do not have the issue with the Form mode, the issue is most likely caused by a Browser policy or a Windows local policy.
When you try to authenticate with Qlik Sense, you can check and see the authentication in the audit logs:
C:\ProgramData\Qlik\Sense\Log\Proxy\Trace\SERVER_Audit_Proxy.txt
2319 20210908T153023.213+0200 INFO QlikServer1 Audit.Proxy.Proxy.SessionEstablishment.RedirectionHandler 38 dbc7b6a4-9314-41fb-b683-c70803402fe7 DOMAIN\qvservice Authentication required, redirecting client@http://[::1]:56417/ to https://localhost/internal_windows_authentication/ 0 be39c60e-e41a-4c82-98ee-470d5d685ec2 ::1 1861e8f3aacd2a0e7b02696ffe70d447fe88e659
In the above log, we do see that authentication is required: Authentication required
After entering the login and password, you should see the line below:
2325 20210908T153023.468+0200 INFO QlikServer1 Audit.Proxy.Proxy.SessionEstablishment.Authentication.TicketValidator 56 ec875129-8767-4313-9c40-6b38d19f535d DOMAIN\qvservice Issued ticket 'SN30j55B6QToyf8F' for user, valid for 1 minute(s) 0 DOMAIN administrator SN30j55B6QToyf8F b99ddb05183cd238af0af82c843141fd80855f90
If you have nothing after the authentication required line, it means something is blocked outside Qlik, as nothing comes to the logs.
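To scan the audit log for just these two milestones, a simple sketch (assuming the log file is available to a shell with grep) is:

# show only the "Authentication required" and "Issued ticket" events
grep -E 'Authentication required|Issued ticket' SERVER_Audit_Proxy.txt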
You can also check the Windows security event logs; if you do not see anything logged there, you might want to check the local policy.
In this example we suspect security policies, so you might want to compare the security policy (secpol) of the non-working server with that of another server where authentication works. In the case of this article, we found that the issue was caused by one specific security option (start secpol.msc --> Local Policies --> Security Options):
Network security: Restrict NTLM: Incoming NTLM traffic
If "Deny all accounts" is selected, it will not be possible to authenticate with a Windows prompt. The solution is to select "Allow All".
Note: We had some cases where this incorrect setting (Deny all accounts) provoked an HTTP ERROR 500, and in the Chrome .har file the error was net::ERR_HTTP_RESPONSE_CODE_FAILURE, caused by a local security policy prohibiting NTLM authentication.
This article explains how the Qlik Sense app button component can be used to send custom parameters directly to the automation without requiring a temporary bookmark. This can be useful when creating a writeback solution on a big app, as creating and applying bookmarks can take longer on big apps, which adds delays to the solution. More information on the native writeback solution can be found here: How to build a native write back solution.
Contents
If you want to limit this to a specific group of users, you can leave the automation in Manual run mode and place it in a shared space that this group of users can access. More information about this is available here: Introducing Automation Sharing and Collaboration. Make sure to disable the Run mode: triggered option in the button configuration.
Environment
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
After downloading a Qlik product installer or patch, you may want to verify the checksum of the file. This article explains how to compute the checksum with the MD5 hash.
This applies to products such as Qlik Talend Studio, Qlik Sense Enterprise on Windows, and similar.
Command: Get-FileHash -Algorithm
Example:
Get-FileHash .\Patch_20201218_R2020-12_v1-7.3.1.zip -Algorithm md5
Output:
Algorithm Hash Path
--------- ---- ----
MD5 A18D537FE8F466643FF2B36DC0713D9F C:\tmp\Patch_20201218_R2020-12_v1-7.3.1.zip
Command: CertUtil
Example:
CertUtil -hashfile Patch_20201218_R2020-12_v1-7.3.1.zip MD5
Output:
MD5 hash of Patch_20201218_R2020-12_v1-7.3.1.zip:
a18d537fe8f466643ff2b36dc0713d9f
CertUtil: -hashfile command completed successfully.
Use either cksum or md5sum.
Command: cksum
Command: md5sum
Example:
# cksum Patch_20201218_R2020-12_v1-7.3.1.zip
Output:
2689783428 702238968 Patch_20201218_R2020-12_v1-7.3.1.zip
Example:
# md5sum Patch_20201218_R2020-12_v1-7.3.1.zip
Output:
a18d537fe8f466643ff2b36dc0713d9f Patch_20201218_R2020-12_v1-7.3.1.zip
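If you already have the expected MD5 value, for example from the download page, md5sum can do the comparison for you instead of relying on eyeballing; note the two spaces between the hash and the file name:

# verify a file against a known checksum; prints "OK" on a match
echo "a18d537fe8f466643ff2b36dc0713d9f  Patch_20201218_R2020-12_v1-7.3.1.zip" | md5sum -c -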
This article gives an overview of the available blocks in the GitHub connector in Qlik Application Automation. It also goes over some basic examples of retrieving file/blob contents from your repositories, as well as other functionality within a GitHub account.
As with most connectors provided for automations, authentication for this connector is based on the OAuth2 protocol: when connecting, you enter the account's username and password directly on the GitHub platform to grant access, so the connection is established in the most secure manner available.
Let's now go over a few basic examples of how to use the Github connector:
How to list owned repositories and check their contents from your GitHub account:
The "List my repositories" block offers a couple of filtering options depending on what result you want (all repositories, or just the private or public ones, and whether you want the result sorted by some rule), but they are mostly optional. Not filling them in will return all repositories by default.
For the "List repository contents" block, you will need to fill in the username of your GitHub account as well as the repository name, which can be filled in from the results of the first block. You can leave the path parameter empty to get the contents of the root folder, or specify a path and the contents of that path will be returned.
As stated, if you expect to retrieve only one record, the "Get repository content" block is better suited. Also, you might want to switch the "List repository contents" block's On Error status to either warning or ignore, since the GitHub API returns a 404 error if one of the queried repositories is empty.
If you are planning to use the "Get repository content" block, one more warning should be mentioned: this block only works for files or blobs up to a maximum of 1 MB in size, per GitHub's platform limitations. The response of this block should look like:
As you can see, we get several pieces of information about the file, but the most important is the SHA property: it is a required input parameter if you later plan to use the "Create or update file contents" block to update a file/blob.
If you plan on updating files that are bigger than 1 MB and you need the SHA of such a file, we suggest using the "List repository contents" block and searching for the required file and its SHA in that result.
As for other functionality of the GitHub connector, we also support getting and listing commits or issues in a repository, listing users, and many other requests. If you need a request that isn't present, we also offer the ability to create your own requests to the GitHub API by making use of the RAW API blocks. These API blocks and their uses are explained in a separate article.
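For orientation, the repository-contents requests map to GitHub's contents API, which you can also exercise directly; a minimal curl sketch (token, owner, repository, and path are placeholders) looks like this:

# fetch metadata, including the sha property, for a single file in a repository
curl -H "Authorization: Bearer $GITHUB_TOKEN" \
  "https://api.github.com/repos/OWNER/REPO/contents/path/to/file.txt"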
Attached to this article is a simple JSON example which you can upload to your workspace. If you want to see a quick example of how to use version control to back up your QCS apps, I suggest visiting the related article.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
How to: Qlik Application Automation for backing up and versioning Qlik Cloud apps on Github
Understanding and managing your Qlik Cloud subscription consumption is essential for maintaining predictable costs, ensuring uninterrupted service, and optimizing resource allocation across your organization. This guide provides you with the tools, strategies, and best practices to gain complete visibility into how your subscription is being consumed and implement proactive controls to stay within your capacity limits.
While Qlik Cloud measures consumption at the tenant level, you can achieve effective governance through strategic monitoring, automated alerting, and space-based management practices. This guide will walk you through the monitoring tools available, how to automate their deployment and refresh, and practical approaches to tracking consumption patterns and implementing controls that align with your organizational needs.
Content:
The Administration activity center Home page provides your first line of visibility into capacity consumption. Understanding what this view offers and how it complements the detailed monitoring apps will help you build an effective monitoring strategy.
Navigate to the Administration activity center → Home to see a real-time dashboard summarizing capacity consumption. This view displays visual bar charts for consumption metrics relevant to your subscription:
Common metrics displayed:
Additional metrics may include Data Moved, Large App consumption, Qlik Predict deployed models, and others, depending on your subscription.
Metrics appear dynamically as features are adopted. If no one has asked an assistant question yet, that metric won't display until first use, keeping the dashboard focused on what you're consuming.
The Data for Analysis chart shows a current snapshot with the last update timestamp. Most metrics update multiple times per hour, providing near real-time visibility into your consumption position.
The Administration activity center provides high-level consumption visibility designed for rapid assessment. For detailed analysis, investigation, and proactive monitoring, you'll complement this view with a set of monitoring apps.
Daily quick check (2 minutes):
When you need more detail, the Home page tells you what is being consumed. The monitoring apps tell you who, where, when, and why. Capacity subscriptions should use the Data Capacity Reporting App as the source of truth, while the Qlik Cloud Monitoring apps can be treated as estimated consumption reports, for example:
Use the Home page for daily checks and status awareness. When consumption requires attention or you need to understand trends, drill into the appropriate monitoring app for detailed analysis.
For more information, see Monitoring resource consumption.
The Data Capacity Reporting App is your official, billable record of consumption for capacity-based subscriptions. This Qlik-supported application is generated once per day (morning Central European Time) and provides the definitive view of your consumption against your entitlement.
The app tracks eight key value meters across the current and previous two months:
This app represents your billable consumption record. The data in this app is what Qlik uses for official capacity reporting and billing purposes. When there's any discrepancy between this app and other monitoring sources, the Data Capacity Reporting App is the authoritative source. This app refreshes only once daily, meaning you see yesterday's official position, not real-time consumption. For more frequent monitoring and estimated usage, you'll complement this with the Qlik Cloud Monitoring Apps.
For detailed information, see Monitoring detailed consumption for capacity-based subscriptions.
Rather than manually distributing the consumption app from the Administration activity center each day, automate this process using the Capacity consumption app deployer template in Qlik Automate.
Setup steps:
This automation creates or uses designated spaces, imports the latest version, publishes it to a managed space, and maintains version history according to your configuration. You now have a single source of truth that updates automatically each day. Create automations or alerts on the published app for automated insights.
For complete details, see the Qlik Community article: Automate deployment of the Capacity consumption app with Qlik Automate.
While the official consumption report updates once daily, the Qlik Cloud Monitoring Apps (community-supported) can be reloaded multiple times per day up to your contractual reload limits, giving you more timely estimated usage insights.
The Qlik Cloud Monitoring Apps provide estimated consumption data that may differ slightly from the official Data Capacity Reporting App. Use these apps for trend monitoring, troubleshooting, and proactive management, but always refer to the Data Capacity Reporting App for official billable consumption figures.
Particularly valuable monitoring apps include:
App Analyzer: Provides comprehensive application usage and operational analytics, including:
Automation Analyzer: Provides detailed analysis of automation runs, including:
Reload Analyzer: Tracks data refresh activity, including:
Access Evaluator: Analyzes user roles, access, and permissions across your tenant
Report Analyzer: Tracks report generation, including:
Entitlement Analyzer: For user-based subscriptions, provides insights into:
For a complete list of available monitoring apps, see the Qlik Community article: The Qlik Sense Monitoring Applications for Cloud and On-Premise.
The Qlik Cloud Monitoring Apps deployer template simplifies installation and maintenance of these community apps.
What it handles:
Reload frequency considerations: You can reload these monitoring apps multiple times per day to get more current estimated usage data. However, each reload counts against your tenant's reload capacity limits. Consider your contractual limits when scheduling. For most organizations, reloading 2-4 times per day provides a good balance between timely insights and consumption.
For complete implementation details, see the Qlik Community guide: Qlik Cloud Monitoring Apps Workflow Guide.
The monitoring apps are also available on GitHub: qlik-oss/qlik-cloud-monitoring-apps.
Effective governance comes from monitoring consumption at multiple levels and implementing proactive interventions. Here's how to approach monitoring for key consumption metrics.
Automation runs are counted across all automations in your tenant, regardless of owner or run mode (manual, scheduled, triggered, webhook, API). Test runs within the automation editor also count toward your limit.
What to monitor:
Tenant level:
Space level:
Automation level:
User level:
Example alert scenario: Using the Automation Analyzer, create alerts when:
Data for Analysis is measured by monthly peak usage. A single day's spike can impact your entire month's consumption.
This data is only available via the Data Consumption report; it is a lagging metric and currently lacks customer data such as app names, user names, and space names. As such, using an automation template to provide notifications may be preferable to standard alerts, and some app size metrics may be better analyzed in the Reload Analyzer.
What to monitor:
Tenant level:
App level:
Space level:
Example alert scenario: Using the Data Capacity Reporting App and Reload Analyzer:
Each subscription tier has limits on maximum concurrent reloads, and capacity subscriptions have daily reload counts. Exceeding concurrent limits causes queuing; exceeding daily limits can block operations.
What to monitor:
Tenant level:
Space level:
App level:
Example alert scenario: Using the Reload Analyzer:
Report generation counts vary by subscription tier, with add-on packs available for purchase. Across all reporting capabilities, tenants have a maximum of 30,000 reporting-related requests per day.
What to monitor:
Tenant level:
Report task level:
Example alert scenario: Using consumption reporting and monitoring apps:
For detailed information on report limits, see Qlik Reporting Service specifications and limitations.
While Qlik Cloud measures consumption at the tenant level, you can implement effective governance practices that provide meaningful control over resource usage.
Make users aware of the impacts of their consumption and empower them to monitor their own usage.
Implementation:
Create early warning systems that trigger well before official capacity notifications.
Implementation:
Alert tier 1 (60-70% of capacity):
Alert tier 2 (75-85% of capacity):
Alert tier 3 (90%+ of capacity):
Use strict space controls to prevent development consumption from impacting production limits, or procure a development subscription from Qlik to fully isolate capacity.
Implementation:
For information on subscription types and capacity planning, see Qlik Cloud capacity-based subscriptions.
Now that you have the monitoring apps deployed and refreshed regularly, you can leverage Qlik Cloud's built-in alerting and distribution capabilities to create a proactive monitoring system. These tools transform static consumption data into actionable intelligence that reaches the right people at the right time.
Data Alerts: Create threshold-based alerts that evaluate conditions on a schedule and notify recipients when conditions are met. Alerts can be created on any chart or measure in your monitoring apps and can be shared with users or groups. Included in all plans.
Subscriptions: Schedule automatic distribution of charts, sheets, or entire apps to users via email or Microsoft Teams. Subscriptions ensure stakeholders receive regular consumption reports without needing to log into Qlik Cloud. Included in all plans.
In-app monitoring: Create bookmarks and sheets in the monitoring apps that focus on specific consumption areas. Share these bookmarks with space owners or functional teams so they can self-service their consumption monitoring. Included in all plans.
Automations: Build custom workflows that trigger actions based on consumption thresholds, such as sending notifications through Slack, creating tickets in ServiceNow, or disabling specific automations when limits are approached. Value-add feature, if third-party connectors are used.
Creating Data Alerts:
Sum(AutomationRuns) > 4000
Creating Subscriptions:
Creating In-App Bookmarks:
Creating Automations:
All of these tools support distribution to groups, making it easy to ensure the right teams have visibility into the consumption metrics relevant to them. Space administrators can receive alerts about their space consumption, development teams can get daily subscription reports, and executive stakeholders can receive monthly summary reports.
The following examples demonstrate how to set up comprehensive monitoring for different consumption metrics. These examples assume you have deployed the Capacity consumption app deployer (running daily around midday UTC) and the Qlik Cloud Monitoring Apps deployer (running overnight) with default settings.
Explore the apps to discover a wide range of operational metrics you can monitor, alert, automate, and subscribe to.
Scenario: Your organization uses third-party automation blocks (such as Slack, ServiceNow, or Salesforce connectors), which incur additional costs based on consumption. You need to monitor third-party automation runs to prevent unexpected charges and identify which automations are driving costs.
Navigate to the Automation Analyzer and create the following alerts:
Alert 1: Third-party runs approaching limit
Alert 2: Individual user excessive third-party runs
Automation - Automation usage notifier: Automation or user email notifications
This approach allows you to send email notifications or take action directly on executing users or owners, while sending a fully customised template to notify them that they are approaching limits.
See Automation Usage Notifier | GitHub for details.
Scenario: Your Data for Analysis consumption is measured by monthly peak usage. You need early warning when daily peaks are trending upward and visibility into which apps are driving consumption.
Step 1: Create peak usage alerts in the Data Capacity Reporting App
Alert 1: Warning capacity threshold
Alert 2: Critical capacity threshold
Step 2: Create a weekly trend subscription
In the Data Capacity Reporting App:
Scenario: You want to create a comprehensive monthly review package that combines official billable data with estimated usage trends to facilitate informed capacity planning discussions.
Create a Qlik Automate automation that runs on the first business day of each month:
The key to managing Qlik Cloud consumption effectively is shifting from reactive (waiting for 80%/90%/100% notifications) to proactive (continuous monitoring with early intervention).
This week:
This month:
Ongoing:
By combining automated monitoring through the official Data Capacity Reporting App and community monitoring apps, tiered alerts, clear governance policies, and proactive intervention workflows, you can effectively manage your subscription costs and maintain predictable, controlled consumption across your organization.
Qlik Help documentation:
Qlik Community Official Support Articles:
Developer resources:
The Qlik Cloud Monitoring Apps are community-supported and provided as-is. They are not officially supported by Qlik, though they are maintained through Qlik's Open-Source Software GitHub. The Capacity consumption app deployer and Qlik Cloud Monitoring Apps deployer are supported automation templates found in the template picker catalog.
This article documents how to schedule automations between specific hours and days of the week. This is intended as a workaround until a native solution is delivered.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
Qlik Sense uses HTTP, HTTPS, and WebSockets to transfer information to and from Qlik Sense.
The attached WebSocket Connectivity tester can be used to verify protocol compliance, indicating whether a network policy, firewall, or other perimeter device is blocking any of the required connections.
If the tests return as unsuccessful, please engage your network team.
The QlikSenseWebsocketConnectivityTester is not an officially supported application and is provided as-is. It is intended to assist in troubleshooting, but further investigation of an unsuccessful test will require your network team's involvement. To run this tool, the Qlik Sense server must have a working internet connection.
Qlik Sense Enterprise on Windows
Since the introduction of extended WebSocket CSRF protection, using the WebSocket Connectivity tester on any version later than November 2024 requires a temporary configuration change.
The check is controlled by the WebSocketCSWSHCheckEnabled setting. By default, the protection is enabled:
<add key="WebSocketCSWSHCheckEnabled" value="true"/>
To run the tester, temporarily set the value to false, and revert it to true once testing is complete:
<add key="WebSocketCSWSHCheckEnabled" value="false"/>
Verify that WebSocket is enabled in the network infrastructure, such as firewalls, browsers, reverse proxies, etc.
See the article below under Related Content for additional steps.