Updated February 4, 2026: the Developer role and API keys toggle have been removed, as announced.
The following two items were deprecated in June 2025 and removed in February 2026:
This can lead to Error: 401 Authorization Required when executing third party API calls.
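As an illustration, a client-side check can distinguish a disabled-key 401 from other failures. The helper below is a hypothetical sketch (the function name and message text are not part of Qlik's tooling); it assumes only that a disabled API key surfaces as HTTP 401:

```python
def explain_status(status_code: int) -> str:
    """Map an HTTP status from a Qlik Cloud REST call to a likely cause.

    Hypothetical helper: a 401 after the February 2026 removal often means
    the calling user's API key was disabled because the user lacks the
    "Manage API keys" permission.
    """
    if status_code == 401:
        return ("401 Authorization Required: the API key may be disabled. "
                "Ask a Tenant Administrator to assign a role that allows "
                "'Manage API keys', which re-enables the key.")
    if status_code == 403:
        return "403 Forbidden: the key is valid but lacks the required scope."
    return f"HTTP {status_code}: not an authorization problem."

print(explain_status(401))
```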
To replace the deprecated built-in role, migrate your users away from the Developer role to a Custom Role with the required permissions (Manage API Keys).
To create and assign a replacement custom role:
For additional reading on the Manage API keys permission (set to Not allowed by default), see Permissions in User Default and custom roles | Permission settings — Features and actions.
The Developer role and Enable API keys toggle were removed in February 2026.
Once the Developer role has been removed, users who have not been updated to use the “Manage API keys” = Allow permission will:
API keys are not deleted from Qlik Cloud and will automatically be re-enabled once a user has been assigned the required Manage API Keys permissions.
To resolve this, a Tenant Administrator needs to act as outlined in What action do I need to take?
The deprecation notice was communicated in an Administration announcement and documented on our What's New in Qlik Cloud feed. See Developer role and API key toggle deprecated | 6/16/2025 for details.
The following products were affected:
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
We're happy to help! Here's a breakdown of resources for each type of need.
| Support | Professional Services (*) |
| --- | --- |
| Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement. |
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about Qlik products and solutions, covering scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
Log in to manage and track your active cases in the Case Portal.
Before you can access the Support Portal, please complete your Community account setup. See First time access to the Qlik Customer Support Portal fails with: Unauthorized Access Please try signing out and sign in again.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter a problem type and issue level. Definitions are shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical in the daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
By default, Qlik Replicate reads primary keys from source tables and creates target tables using those same keys. If you want to use an existing view that doesn’t share the same key columns, you can modify the replication process to define matching key columns and adjust the task settings to prevent it from reloading the target table.
In table transformations, use Set Key Columns > Use transformation definition to ensure the key columns match the target view.
However, using a view as the target (instead of a table) will result in the following error, as indexes cannot be applied to views:
[TARGET_LOAD ]E: RetCode: SQL_ERROR SqlState: 42000 NativeError: 1939 Message: [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Cannot create index on view 'PPHTRAN' because the view is not schema bound. Line: 1 Column: -1 [1022502] (ar_odbc_stmt.c:5083)
Target views behave differently from tables, but an internal parameter can be used to trigger a manual query. To achieve this, add the $info.query_syntax.create_index internal parameter and value to the SQL Server target endpoint.
SUPPORT-9276
Qlik Replicate 2025.11.0.285 could not read transaction logs properly for the SQL Server source endpoint, causing the following error:
[SOURCE_CAPTURE ]E: Bad Envelope : Lsn=00695591:01394baa:0009,operation=5,TxnId=0006:91821bb4,Tablename=COMMIT,PageId=0000:00000000,slotId=3,timeStamp=2026-02-25T06:20:03.890,dataLen=0, LCX=99, >Invalid data context / LCX Code encountered for TXN operation. [1020203] (sqlserver_log_processor.c:350) 00001580: 2026-02-25T07:35:09 [SOURCE_CAPTURE ]E: Internal error (specific information not available) [20014]
Upgrade to Qlik Replicate 2025.11.0.437 to resolve the read issue for the transaction logs.
SUPPORT-8946
This article provides answers to the most frequent questions asked about Qlik MCP.
For the more general Qlik Answers FAQ, see Qlik Answers Agentic Analytics FAQ.
The Qlik Model Context Protocol (MCP) server integrates Qlik Cloud into your LLM workflow, allowing you to work with Qlik Cloud from your LLM without leaving it. Connection issues are often tied to misconfiguration.
Qlik MCP does not support clients with Client Secrets.
In a case where you do not get the response you expect based on the sources, or you receive an error:
Has your app been prepared for Qlik Answers?
For now, Qlik MCP will continue to be priced based on current models for the number of questions asked. You get capacity at corresponding levels in Standard, Premium, and Enterprise editions, as well as Qlik Sense Enterprise SaaS. There is currently no additional cost for structured data questions or task automation requests; a question is a question.
Use of the MCP server consumes questions when Qlik is accessed using Tool Calls. A Tool Call is a request made by the LLM to interact with Qlik's capabilities, such as, but not limited to, querying databases, calling APIs, or performing computations. These are typically visible in the LLM's log.
For Qlik's MCP server, 5 Tool calls consume 1 question. More questions may be purchased for expanded use cases.
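The stated ratio can be expressed as a small calculation. The sketch below assumes that partial groups of Tool Calls round up to a whole question, which is our assumption for illustration, not documented behavior:

```python
import math

TOOL_CALLS_PER_QUESTION = 5  # stated ratio: 5 Tool Calls consume 1 question

def questions_consumed(tool_calls: int) -> int:
    """Estimate question consumption from a count of Tool Calls.

    Assumption (not confirmed by the docs): partial groups of 5 are
    rounded up to a whole question.
    """
    return math.ceil(tool_calls / TOOL_CALLS_PER_QUESTION)

print(questions_consumed(12))  # 12 Tool Calls -> 3 questions under this assumption
```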
See Pricing and the Qlik MCP server product description for details.
Qlik’s pricing does not include your chosen LLM subscription or usage, which will need to be paid separately.
Yes. Qlik MCP works on top of existing Qlik Sense applications and uses the same data, logic, and security model.
But to get the best experience, apps should be prepared beforehand:
Your Qlik Cloud subscription determines the quota of questions asked by users. If you are licensed for Qlik Answers, both MCP and Qlik Answers will use your monthly question capacity. See Administering Qlik MCP server.
Question capacity quotas are per month and reset every month. When you hit your limit, users can no longer ask questions until the next month. Overage may be allowed, depending on your subscription. For more information, see Qlik MCP server product description.
For more information on overage, see Overage.
Features can be turned off for individual users through user scopes.
A Loop and Reduce task created in the Qlik Sense Application Management Console (AMC) will only display its settings and parameters to the original task creator. Other users are unable to view any details.
For more information about AMC, see: AMC - Application Management Console, an alternative to the QMC for large Enterprise environments.
Example:
User A, who created the task, sees:
User B only sees the following:
Create a new Security Rule in the Qlik Sense Management Console to allow the desired user(s) to see all content. For more information about Security Rules, see Security rules.
Example Security Rule:
This article provides answers to the most frequent questions asked about Qlik Answers.
For the Qlik MCP FAQ, see Qlik Model Context Protocol (MCP) FAQ.
In February 2026, we launched our new agentic experience, which will enhance decision-making and improve productivity through a combination of assistants and agents running on a cutting-edge architecture. This initial release includes out-of-the-box agents for structured data analytics, unstructured knowledge, discovery of anomalies, and help and assistance. These agents take advantage of our foundational capabilities, including our data products and unique analytics engine, to execute complex, multi-step tasks in a trusted, scalable, and secure manner.
Qlik Answers is the primary AI assistant for people to interface with agentic AI. It will understand the intent of natural language questions and engage the underlying agentic framework to execute tasks, build responses, and take actions.
Qlik Answers now combines structured data analytics with unstructured content and general knowledge and reasoning from LLMs to deliver the most complete and relevant answers and insights, helping our customers improve decisions, productivity, and business outcomes in ways not possible before.
Looking ahead, as we build additional agents, such as prediction agents and pipeline agents, they will all be invoked through Qlik Answers. A broader set of agents is planned, all aimed at helping users get more value from their data and become more productive as Qlik continues to evolve.
With Qlik Answers now able to handle both structured and unstructured data, you can drive hundreds more informed decisions and actions each day. You can drive productivity through automation of a broad range of data and analytics tasks and workflows. And with plug-and-play simplicity, you can quickly deploy assistants in a matter of hours, reducing risk, speeding time-to-value, and future-proofing your investments in AI.
For now, Qlik Answers will continue to be priced based on current models for the number of questions asked. You get capacity at corresponding levels in Standard, Premium, and Enterprise editions, as well as Qlik Sense Enterprise SaaS, with additional capacity available for purchase as needed.
There is currently no additional cost for structured data questions or task automation requests; a question is a question.
For additional details, refer to Pricing.
Since launch, Qlik Answers has been rolled out across regions, and the process is still ongoing. If you have a Standard, Premium, or Enterprise edition, check whether your region already supports it (see Supported regions).
If it is not yet available to you, then:
Yes, you must be a Qlik Cloud customer to use Qlik Answers. Qlik Answers is built on cloud-native technologies, specifically large language models (LLMs) that require significant compute resources and specialized infrastructure, and there is no mechanism to deploy these technologies in an on-premises environment.
However, you don't have to fully migrate your analytics environment or documents to the cloud in order to take advantage of Qlik Answers. Qlik's roadmap aims to enable support for analytics apps that were pushed to the cloud, at which point unstructured documents can be indexed where they reside. The current target release date for this feature is Q2 2026.
No. You will use either Qlik Answers or Insight Advisor, not both at the same time.
Qlik Answers represents the AI-first experience going forward. When a tenant chooses Qlik Answers, that becomes the primary way users interact with analytics. Insight Advisor is not available in parallel within the same tenant.
This is a deliberate choice to avoid duplicated experiences, inconsistent results, and user confusion.
No. Qlik Answers is cloud only.
There are no plans to bring Qlik Answers to on-premises environments. The product relies on cloud native AI services, managed infrastructure, and continuous model evolution.
Insight Advisor is not being discontinued.
If you remain on Insight Advisor, you can continue using it. However, within a tenant, you must choose between Insight Advisor and Qlik Answers. You cannot run both experiences side by side.
The most important and relevant business logic is preserved when moving to Qlik Answers.
That said, Qlik Answers is built for a newer generation of AI-driven analytics. In many cases, customers will find they no longer need to manually build or maintain the same level of logic, because the system handles more of that automatically.
The value is not in recreating everything exactly as it was, but in moving to a simpler, more capable experience.
This is essentially a buy vs build decision:
Qlik Answers is built on AWS Bedrock and currently utilizes Anthropic Claude models. The specific model versions vary by agent function and are continuously evaluated and updated based on performance, accuracy, latency, and cost optimization.
Our Model Selection Philosophy:
Qlik maintains flexibility in model selection to continuously improve the user experience as AI technology evolves. Different agents within the Qlik Answers architecture may use different models optimized for their specific tasks (e.g., semantic understanding, code generation, reasoning).
No. Not at this stage.
Qlik Answers is a managed experience with curated models and configurations. Customers who want to use their own models or bring custom AI stacks should use MCP instead.
Yes. Qlik Answers works on top of existing Qlik Sense applications and uses the same data, logic, and security model.
But to get the best experience, apps should be prepared beforehand:
Yes. Master measures and dimensions are always prioritized. If business logic exists, Qlik Answers uses it rather than creating new calculations.
Yes. Qlik Answers generates appropriate visualizations such as KPIs, bar charts, or time-based charts depending on the question.
Qlik Answers inherits and enforces Qlik's established security model without exception. All existing security rules, section access configurations, and row-level security policies apply automatically.
Key security principles:
Field-level security (if implemented) is respected in all analyses.
No additional security configuration is required. Organizations with complex security requirements can continue using their existing Qlik security implementations with confidence.
Yes, if their access rights differ. Answers are always scoped to the user’s permissions.
While no special data preparation is required beyond standard Qlik Sense data modelling best practices, the apps themselves should be prepared beforehand to give you the best experience possible:
Yes. Qlik Answers understands conversational context, allowing users to refine or continue their analysis.
At its initial GA release, Qlik Answers is optimized and fully supported for English language queries and responses.
While the underlying large language models have multilingual capabilities and may be able to process queries in other languages with varying degrees of accuracy, non-English language support is not officially validated, documented, or supported by Qlik at this time.
Additional language support is planned for future releases based on demand and regional priorities.
No. It accelerates analysis and reduces repetitive work but does not replace human expertise or decision-making.
Yes. Only enabled and indexed applications are available.
Not in the current GA release. Qlik Answers operates within the context of a single Qlik Sense application per query. Multi-application query capabilities are planned for a future release.
If you want to ask questions in an app, you just need the ‘Data analysis’ scope. If you plan on asking questions to an assistant, you need the ‘Data analysis’ and ‘Search knowledge base’ scopes.
Cross-region inference has minimal risks as the data still stays within the AWS Virtual Private Network. The only difference here is that the LLM call gets processed in a different region due to GPU availability.
We have made a deliberate design decision to prioritize the quality of answers and insights over the speed of responses. In general, Qlik provides a far richer reasoning process and answer than competing products, and this results in a longer response time. We are planning to improve and optimize this, as well as introduce a faster mode for simpler questions in the future.
Qlik Answers always references its sources in detail. To begin troubleshooting, check the citations, which will show:
In a case where you do not get the response you expect based on the sources, or you receive an error:
Has your app been prepared for Qlik Answers?
Your Qlik Cloud subscription determines the quota of questions asked by users. If you are licensed for Qlik Answers, both MCP and Qlik Answers will use your monthly question capacity. See Administering Qlik MCP server.
Question capacity quotas are per month and reset every month. When you hit your limit, users can no longer ask questions until the next month. Overage may be allowed, depending on your subscription. For more information, see Qlik MCP server product description.
For more information on overage, see Overage.
Features can be turned off for individual users through user scopes.
See Control access to AI features.
If you have previously enabled the feature, the entirety of Qlik’s Agentic Analytics can easily be turned off again by configuring AI features in Qlik:
See Enable cross-region inference.
Error codes
Use these error codes only as a reference for which errors are expected. Retry if you receive any of them.
Retry and Processing Errors
App and Document Errors
Chart and Sheet Errors
Expression and Hypercube Errors
Semantic Search Errors
Access Verification Errors
You can set a sheet as the landing page of the app by setting a default bookmark.
When you open the app, the expected landing page may not be displayed.
Likewise, when you open a sheet with a sheet action, the action might not be triggered.
The session needs to be terminated. The default bookmark (with its selections and landing sheet) and any sheet actions are applied once per session, and Qlik Sense SaaS keeps the session alive for up to 30 minutes after the tab or tabs are closed.
Setting a default bookmark to create an app landing page
Content
The information in this article and video is provided as is. If you need assistance with Zabbix, please engage with Zabbix directly.
The environment being demonstrated in this article consists of one Central Node and two Worker Nodes. Worker 1 is a Consumption node where both Development and Production apps are allowed. Worker 2 is a dedicated Scheduler Worker node where all reloads will be directed. The Central Node is acting as a Scheduler Manager.
The Zabbix Monitoring appliance can be downloaded and configured in a number of ways, including direct install on a Linux server, OVF templates and self-hosting via Docker or Kubernetes. In this example we will be using Docker. We assume you have a working docker engine running on a server or your local machine. Docker Desktop is a great way to experiment with these images and evaluate whether Zabbix fits in your organisation.
This will include all necessary files to get started, including docker compose stack definitions supporting different base images, features and databases, such as MySQL or PostgreSQL. In our example, we will invoke one of the existing Docker compose files which will use PostgreSQL as our database engine.
Source: https://www.zabbix.com/documentation/current/en/manual/installation/containers#docker-compose
git clone https://github.com/zabbix/zabbix-docker.git
Here you can modify environment variables as needed, to change things like the Stack / Composition name, default ports and many other settings supported by Zabbix.
cd ./zabbix-docker/env_vars
ls -la #to list all hidden files (.dotfiles)
nano .env_web
In this file, we will change the value for ZBX_SERVER_NAME to something else, like "Qlik STT - Monitoring". Save the changes and we are ready to start up Zabbix Server.
The ./zabbix-docker folder contains many different Docker compose templates, using either public images or locally built ones (latest and local tags).
You can run your chosen base image and database version with:
docker compose -f compose-file.yaml up -d && docker compose logs -f --since 1m
Or unlink and re-create the symbolic link to compose.yaml, which enables managing the stack without specifying a compose file. Run the following commands inside the zabbix-docker folder to use the latest Ubuntu-based image with PostgreSQL database:
unlink compose.yaml
ln -s ./docker-compose_v3_ubuntu_pgsql_latest.yaml compose.yaml
docker compose up -d

If you skip the -d flag, the Docker stack will start and your command line will be connected to the log output for all containers. The stack will stop if you exit this mode with CTRL+C or by closing the terminal session. Detached mode will run the stack in the background. You can still connect to the live log output, pull logs from history, manage the stack state, or tear it down using docker compose down.
Pro tip: you will be using docker compose commands often when working with Docker. You can create an alias in most shells to a short-hand, such as "dc = docker compose". This will still accept all following verbs, such as start|stop|restart|up|down|logs and all following flags. docker compose up -d && docker compose logs -f --since 1m would become dc up -d && dc logs -f --since 1m.
Use the IP address of your Docker host: http://IPADDRESS or https://IPADDRESS.
The Zabbix server stack can be hosted behind a Reverse Proxy.
The default username is Admin and the default password is zabbix. They are case sensitive.
Download link: https://www.zabbix.com/download_agents, in this case download the Windows installer MSI.
After Agent is installed, in Zabbix go to Data Collection > Hosts and click on Create host in the top right-hand corner. Provide details like hostname and port to connect to the Agent, a display name and adjust any other parameters. You can join clusters with Host groups. This makes navigating Zabbix easier.
Note: Remember to change how Zabbix Server will connect to the Agent on this node, either with IP address or DNS. Note that the default IP address points to the Zabbix Server.
In the Zabbix Web GUI, navigate to Data Collection > Templates and click on the Import button in the top right-hand corner. You can find the templates file at the following download link:
LINK to zabbix templates
Once you have added all your hosts to the Data Collection section, we can link all Qlik Sense servers in a cluster using the same templates. Zabbix will automatically populate metrics where these performance counters are found. From Data Collection > Hosts, select all your Qlik Sense servers and click on "Mass update". In the dialog that comes up, select the "Link templates" checkbox. Here you can link/replace/unlink templates across many servers in bulk.
Select "Link" and click on the "Select" button. This new panel will let us search for Template groups and make linking a bit easier. The Template Group we provided contains 4 individual templates.
Fig 2: Mass update panel
Fig 3: Search for Template Group
Once you click Select and then Update on the main panel, all selected Hosts will receive all items contained in the templates and populate all graphs and Dashboards automatically.
To review your data, navigate to Monitoring > Hosts and click on the "Dashboards" or "Graphs" link for any node. Here is the default view when all Qlik Sense templates are linked to a node:
Fig 5: Repository Service metrics - Example
We will query the Engine Healthcheck end-point on QlikServer3 (our consumer node) and extract usage metrics by parsing the JSON output.
We will be using a new Anonymous Access Virtual Proxy set up on each node. This Virtual Proxy will only balance to the node it represents, ensuring we extract meaningful metrics from that node's Engine and are not load-balanced by the Proxy service across multiple nodes. Otherwise, there would be no way to determine which node is responding without looking at DevTools in your browser. You can also use Header or Certificate authentication in the HTTP Agent configuration.
Once the Virtual Proxy is configured with Anonymous Only access, we can use this new prefix to configure our HTTP Agent in Zabbix.
In the Zabbix web GUI, go to Data collection > Hosts. Click on any of your hosts. On the tabs at the top of the pop-up, click on Macros, then click the "Inherited and host macros" button. Once the list has loaded, search for the following macro: {$VP_PREFIX}. This is set to "anon" by default. Click "Change", set the macro value to your custom Virtual Proxy prefix for Engine diagnostics, and click Update. The Virtual Proxy prefix has to be changed on each node for the "Engine Performance via HTTP Agent" item to work. Alternatively, you can modify the macro value for the Template; this will replicate the change across all nodes associated with the Template.
Fig 6: Changing Host Macros from Inherited values
To make this change at the Template level, go to Data collection > Templates. Search for the "Engine Performance via HTTP Agent" and click on the Template. Navigate to the Macros tab in the pop-up and add your Virtual Proxy Prefix here to make this the new default for your environment. No further changes to Node configuration are required at this point.
Fig 7: Changing Macros at the Template level
The Zabbix templates provided in this article contain the following Engine metric JSONParsers:
These are the same performance counters that you can see in the Engine Health section in QMC.
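To see what such a JSON extraction does, here is a minimal sketch that pulls a few counters out of a healthcheck-shaped payload. The field names below (mem, cpu, session) reflect our understanding of the Engine healthcheck output; treat the exact shape as an assumption and verify it against your own endpoint's response:

```python
import json

# Sample payload shaped like the Engine healthcheck response (assumed
# structure; verify against https://<host>/<vp_prefix>/engine/healthcheck).
sample = json.loads("""
{
  "mem": {"committed": 1843.2, "allocated": 2048.0, "free": 6144.5},
  "cpu": {"total": 12},
  "session": {"active": 3, "total": 5}
}
""")

def extract_metrics(payload: dict) -> dict:
    """Flatten the counters a monitoring item would chart."""
    return {
        "mem_committed_mb": payload["mem"]["committed"],
        "cpu_total_pct": payload["cpu"]["total"],
        "sessions_active": payload["session"]["active"],
    }

print(extract_metrics(sample))
```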
Stay tuned to new releases of the Monitoring Templates. Feel free to customise these to your needs and share with the Community.
Environment
By default, Qlik Talend Data Catalog will not be able to trace the lineage of Qlik Talend Studio jobs that use dynamic components such as tDBJava. To extract lineage correctly, there are multiple steps that need to be followed. Below is an example where the JavaRow lineage fails to show because Talend Data Catalog is not able to parse the lineage of tJavaRow and instead creates duplicate columns, unintentionally splitting the lineage for each column:
The following versions are required to trace lineage for complex Qlik Talend Studio components:
-vm
C:\Talend\Studio-QTC\zulu17.48....
-vmargs
-Xms512m
-Xmx1536m
-Dfile.encoding=UTF-8
-Dtalend.lineage.enabled=true
-XX:+UseG1GC
-XX:+UseStringDeduplication
-XX:MaxMetaspaceSize=512m
--add-modules=ALL-SYSTEM
ErrorCode.11041 occurs when opening up an App
ErrorCode.11043 occurs when creating a Database connection in the data load editor.
Both symptoms correlate with the Qlik Sense system having restricted or no internet connectivity.
Qlik connectors are cryptographically signed for authenticity verification. The .NET Framework verification procedure used for this signing includes checking OCSP and Certificate Revocation List information, which is fetched from an online resource if the system doesn't have a cached local copy. In environments with a restricted, slow, or absent internet connection, these requests time out. Due to the authenticity check failure, the connector will not run, and the app reload fails.
Edit the .Net Framework's machine.config file
<runtime>
    <generatePublisherEvidence enabled="false"/>
</runtime>

If the <runtime> section looks different, keep the existing configuration and add the element:

<runtime>
    <OTHER CONFIGURATION="YOUR VALUES">
    <...>
    <generatePublisherEvidence enabled="false"/>
</runtime>
NOTE1: Changes to machine.config affect all software using the .NET framework feature.
NOTE2: 3rd party connectors might be compiled for 32-bit platforms. In that case, repeat the steps above for the 32-bit version of the machine.config file:
C:\Windows\Microsoft.NET\Framework\v4.0.30319\config\machine.config
A Qlik Sense app has been deleted from the Qlik Sense Management Console and needs to be restored.
! Deleting a Qlik Sense application from the Qlik Management Console (QMC) is generally an irreversible process. Restoring applications is only possible if a previous backup exists. The delete process removes all files from the configured file share. See Creating a file share (Help.com).
If a backup of the files exists, proceed with the documented steps.
Note that these steps can also be applied when restoring and importing from one Qlik Sense environment to the other.
Information on server migration has also been posted to Qlik Community: Qlik Sense Migration Part1: Migrating your Entire Qlik Sense Environment. If assistance is needed, Qlik Consulting would need to be engaged. Qlik Support cannot provide walk-through assistance with server migrations outside of a post-installation and migration-completion break/fix scenario.
To successfully restore a Qlik Sense application to the Qlik Sense environment, you must have a backup strategy (using your backup software of choice) for the shared folder where the Qlik Sense application files are stored.
! If no backups of files are available, no restoration will be possible.
You can find the filename (APP_ID) in the AuditActivity_Engine log.
This log is stored in \\<rootShare>\Log\Engine\Audit\ by default.
The following example shows the App ID followed by the App name:
492 20.4.2.0 20180521T180500.118+0200 QlikServer1 0ccd5e9f-e020-4b76-a84d-144bdf903765 20180521T180500.112+0200 12.145.3.0 Command=Reload app;Result=0;ResultText=Success 0 0 2290 INTERNAL sa_scheduler d296b870-da06-4311-bacc-038992b1c954 c047d8a7-148c-4ea6-97f2-10290e706cd7 License Monitor Engine Not available Doc::DoReloadEx Reload app 0 Success 0ccd5e9f-e020-4b76-a84d-144bdf903765
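When scanning many log lines, a GUID-shaped pattern can pull the App ID out programmatically. This is a hedged sketch: it assumes the App ID is the first GUID on the reload line, as in the sample above; the exact field layout may vary between Qlik Sense versions.

```python
import re

# GUIDs are 8-4-4-4-12 hex groups; the first one on a reload line is the
# App ID (which is also the app's filename on the share).
GUID_RE = re.compile(r"\b[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}\b")

# Truncated stand-in for a real AuditActivity_Engine log line
line = ("492 20.4.2.0 20180521T180500.118+0200 QlikServer1 "
        "0ccd5e9f-e020-4b76-a84d-144bdf903765 ...")
print(GUID_RE.findall(line)[0])  # → 0ccd5e9f-e020-4b76-a84d-144bdf903765
```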
This will make the file readable and importable.
You can now locate the app in the shared folder with a new App ID.
Click Publish
The Publishing dialogue:
Previous versions of Microsoft SQL Server do not support a dedicated JSON data type.
For later versions, Microsoft announced the introduction of a native JSON data type (along with JSON aggregate functions). This new data type is already available in Azure SQL Database and Azure SQL Managed Instance, and is included in SQL Server 2025 (17.x).
SQL Server 2025 (17.x) became Generally Available (GA) on November 18, 2025.
At this time, the current Qlik Replicate major releases (2025.05 and 2025.11) do not support SQL Server 2025 or its native JSON data type.
During the endpoint connection ping test, you may encounter:
SYS-E-HTTPFAIL, Unsupported server/database version: 0.
SYS,GENERAL_EXCEPTION,Unsupported server/database version: 0
Since the Azure SQL Database version is always reported as 14.x, the version check succeeds. However, because Azure SQL Database already uses the SQL Server 2025 kernel, the task later fails at runtime with:
[SOURCE_CAPTURE ]T: Failed to set ct table column ids for ct table with id '1021246693' (sqlserver_mscdc.c:2968)
[SOURCE_CAPTURE ]T: Failed to get change tables IDs for capture list [1000100] (sqlserver_mscdc.c:3672)
[SOURCE_CAPTURE ]E: Failed to get change tables IDs for capture list [1000100] (sqlserver_mscdc.c:3672)
No workaround can be provided until support has been introduced.
According to the current roadmap, support for SQL Server 2025 and the native JSON data type is planned for the upcoming major release: Qlik Replicate 2026.5.
No date or guaranteed timeframe can yet be given. The support planned for 2026.5 is an estimate.
00419519
This article explains how to implement SAML for NPrinting with Azure as the IdP.
To implement Azure SAML in NPrinting, the following needs to be done:
Generate a Metadata XML file
Federation Metadata XML: Download
<?xml version="1.0"?>
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" entityID="https://sts.windows.net/b26e23cf-787a-40e8-9d17-f0c9f9ad0821/">
<IDPSSODescriptor xmlns:ds="http://www.w3.org/2000/09/xmldsig#" protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
<KeyDescriptor use="signing">
<ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<ds:X509Data>
<ds:X509Certificate>MIIC8DCCAdigAwIBAgIQFUUu6ZQHg5FJ...Ud8tf9A/4A6+2SZm34gf8gcVPTXT/a</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</KeyDescriptor>
<SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location="https://login.microsoftonline.com/b26e23cf-787a-40e8-9d17-f0c9f9ad0821/saml2"/>
<SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Location="https://login.microsoftonline.com/b26e23cf-787a-40e8-9d17-f0c9f9ad0821/saml2"/>
</IDPSSODescriptor>
</EntityDescriptor>

Potential Troubleshooting Steps
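As a first troubleshooting check, you can verify that the downloaded federation metadata actually contains the SingleSignOnService endpoints NPrinting needs. The following is a hedged sketch using Python's standard library; the abbreviated metadata string is a stand-in for your downloaded Federation Metadata XML file.

```python
import xml.etree.ElementTree as ET

# Abbreviated stand-in for the downloaded Federation Metadata XML (assumption)
metadata = (
    '<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" '
    'entityID="https://sts.windows.net/example-tenant/">'
    '<IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">'
    '<SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" '
    'Location="https://login.microsoftonline.com/example-tenant/saml2"/>'
    '</IDPSSODescriptor></EntityDescriptor>'
)

# Elements live in the SAML metadata namespace, so findall needs a prefix map
NS = {"md": "urn:oasis:names:tc:SAML:2.0:metadata"}
root = ET.fromstring(metadata)
for sso in root.findall(".//md:SingleSignOnService", NS):
    print(sso.get("Binding"), "->", sso.get("Location"))
```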
To start replication from a specific point in time on a MongoDB source, you will need to identify the oplog stream position (BSON Timestamp) corresponding to your target time and configure it in your Qlik Replicate task.
This article outlines the options available to you.
Connect to the primary node via mongosh and run:
db.getSiblingDB('local').oplog.rs.find().sort({ $natural: -1 }).limit(1).pretty()
This returns the most recent oplog entry. Look for the ts field in the output:
{
"ts": Timestamp(1741600200, 1),
"op": "i",
...
}
The ts value is your stream position. The first number is Unix epoch seconds; the second is the ordinal increment.
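Decoding the epoch part makes it easy to sanity-check the position against wall-clock time. This is a small illustrative sketch, not part of Qlik Replicate; the timestamp values are taken from the sample output above.

```python
from datetime import datetime, timezone

# A BSON Timestamp is (epoch_seconds, ordinal); decode the epoch part to UTC
epoch_seconds, ordinal = 1741600200, 1
when = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(when.isoformat())  # → 2025-03-10T09:50:00+00:00
```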
If you know the specific time you want to start from, you can filter the oplog directly to find the closest entry:
var t = new Timestamp(Math.floor(new Date("2026-03-10T11:30:00Z").getTime() / 1000), 1);
db.getSiblingDB('local').oplog.rs.find({ ts: { $gte: t } }).limit(1).pretty()
Replace the date string with your target time. This returns the first oplog entry at or after that timestamp, giving you the exact ts value to use.
The rs.status() command returns the current replication position (optimeDate and optime.ts) for each replica set member:
rs.status()
This is useful for cross-referencing a wall-clock time to an approximate oplog position. Once you have an approximate position, use option two to pinpoint the exact ts value.
Start the task normally (without specifying a position) and allow Qlik Replicate to connect to the MongoDB oplog. Qlik Replicate will log the current stream position it reads from in the task log output, in the exact format it expects. You can then use that as a reference and template for entering positions manually in future tasks.
This is the safest way to confirm the correct position format for your version of Qlik Replicate before attempting a manual entry.
Once you have your stream position value:
Note: We recommend using option four first to confirm the exact position format your version of Qlik Replicate expects for MongoDB, as this can vary. Entering the value in an incorrect format will cause the task to start from an unintended position.
Connecting to an SFTP server using username and password authentication fails with the error:
Too many bad authentication attempts!
com.jcraft.jsch.JSchException: 11 Too many bad authentication attempts!
The issue is caused by the password containing a backslash (\). A backslash is treated as a control or escape character and must be escaped accordingly. If not handled correctly, a control or escape character in a password can cause connection or authentication errors.
Either remove the backslash (\) or replace it with a double backslash (\\) to escape it properly.
Example:
Previous, failing password: Test\123
Updated password: Test\\123
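The mechanics can be illustrated in any language where the backslash is an escape character. The sketch below (illustrative, not Qlik-specific) shows that the doubled form in source text still represents the same single-backslash password:

```python
# "\\" in a string literal denotes one literal backslash, so the doubled
# form written in the settings resolves to the intended password Test\123
written = "Test\\123"               # what you type in the connection settings
actual = "Test" + chr(92) + "123"   # the literal password: Test\123
print(written == actual)  # → True
```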
When attempting to export data to .csv or as a Straight Table from a Vizlib Pivot Table object in Qlik Cloud Analytics, the following error message is displayed:
An error occurred during export. Please try again later.
Even small datasets will not export.
This issue is not seen with this object in Qlik Sense Enterprise on-premise, but may occur with an app migrated to or published to Qlik Cloud from the on-premise install.
Upgrade the Vizlib Pivot Table to version 3.17.3 or higher in all environments, then reload the affected app.
This is not a Qlik issue; it is specific to complex apps with large data volumes and Vizlib Pivot Table version 3.14.0 or earlier. No Qlik limitation is involved.
It appears that only export to Excel (.xlsx) was fully supported for such apps in Qlik Cloud with earlier Vizlib object versions.
Note that downloading as an image or PDF from a Vizlib object remains unsupported.
Upgrading Qlik Compose across multiple versions requires a specific upgrade path. See Qlik Compose December 2024 Initial Release Notes for details.
After upgrading, following the path from 2022.5 to 2023.11 and finally to 2024.12, Qlik Compose returns a generic UI error during connection tests.
UI ERROR
Unable to connect to the remote server
Reviewing the Qlik Compose server log reveals the Java process is failing to start entirely:
[INFO ] Java Server: .
[ERROR] Java Server: Error: Could not create the Java Virtual Machine.
[ERROR] Java Server: Error: A fatal exception has occurred. Program will exit.
[INFO ] Java Server: <JAVA_HOME>/lib/ext exists, extensions mechanism no longer supported; Use -classpath instead.
[WARN ] The Compose java server was restarted.
[ERROR] Java Server: Error: Could not create the Java Virtual Machine.
[ERROR] Java Server: Error: A fatal exception has occurred. Program will exit.
While the Qlik Compose server is running, the Java agent is down.
The detailed upgrade path followed was:
When the second hop was performed, the installation recreated the /ext folder within Compose/java/lib/jre/lib with no files in it. Because Compose 2024.12 uses Java 17, it does not support the ext folder extension mechanism.
This article documents the basic steps to configure the SAML integration between Qlik Sense Enterprise on Windows (Client-Managed) and Microsoft Entra ID. By connecting these two platforms, administrators can control which users are allowed to access Qlik Sense directly from Entra ID, provide users with seamless single sign-on using their Microsoft accounts, and manage identities from a centralized location.
If you are looking for instructions for Qlik Cloud Analytics, see How To: Configure Qlik Sense Enterprise SaaS to use Azure AD as an IdP.
Content
To get started, you need the following items:
All the following steps are taken in Qlik Sense Enterprise on Windows.
You can test the single sign-on setup either from the Microsoft Entra ID portal by selecting Test, or by navigating directly to the Qlik Sense sign-on URL and starting the login process from there.
To start replication from a specific point in time on a DB2 LUW source, you will need to identify the LSN (Log Sequence Number) corresponding to your target timestamp and configure it in your Qlik Replicate task.
There are several ways to obtain the LSN depending on your environment and access level.
Ensure the DB2 archive logs covering your target LSN range are still retained and accessible on the server. If those logs have been pruned or moved off the system, Qlik Replicate will not be able to read from that position, and the task will error out.
Run the following on the DB2 server to list active log files with their LSN ranges and timestamps:
db2pd -db <DBNAME> -logs
Sample output:
Log File First LSN Last LSN Timestamp
S0001234.LOG 0x000123456789 0x000123ABCDEF 2026-03-10-11.30.00
Locate the log file whose timestamp range covers your desired start time and note the First LSN for that file. Convert the hex value to decimal before entering it into Replicate (e.g., 0x000123456789 = 4886718345).
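The hex-to-decimal conversion can be done with any base-16 parser. The following is a minimal illustrative sketch (the helper name is hypothetical, not a Qlik utility):

```python
# Convert a hex LSN as reported by db2pd/db2flsn into the decimal form
# expected when entering the position in Qlik Replicate.
def lsn_hex_to_decimal(lsn_hex: str) -> int:
    return int(lsn_hex, 16)  # int() accepts the "0x" prefix with base 16

print(lsn_hex_to_decimal("0x000123456789"))  # → 4886718345
```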
If you know the specific log file number and offset, you can translate it to an LSN using the db2flsn command-line utility:
db2flsn -db <DBNAME> -lsn <log_file_number>/<offset>
This is useful when you already know which log file corresponds to your target time. Convert the resulting hex LSN to decimal before entering it into Replicate.
To retrieve the current active LSN directly from the database:
SELECT CURRENT_LSN FROM SYSIBMADM.SNAPDB;
This returns the LSN at the moment the query is executed. Use this if you want to start replication from approximately "now" with a precise LSN anchor rather than relying on the task default. Convert the hex value to decimal before use.
By design, Qlik Replicate does not support starting CDC from a specific timestamp for a DB2 LUW source endpoint. This is a documented limitation in the Qlik Replicate User Guide.
However, when a DB2 LUW CDC task is first created and started, Replicate internally generates a file named DB2LUW_TIMESTAMP_MAP (a SQLite database) in the task's data folder. This file continuously maps processed LSN values to their corresponding timestamps each time the task runs. As a result, it provides a workaround to approximate a timestamp-based start position — by identifying the LSN that corresponds to the desired point in time and using that LSN to resume the task.
The only prerequisite for this approach is that the DB2 transaction logs covering the target time period must still be available and accessible on the source server.
Once you have your LSN value:
DB2 tools typically display LSN values in hexadecimal. Ensure you convert the value to decimal before entering it in Qlik Replicate; otherwise, the task will start from an incorrect log position.
When asking the Qlik Answers Documentation Assistant a question and checking the source, it throws the following error:
Cannot access the source
This issue occurs when the Assistant user accessing the Documentation assistant does not have permission to the source. For reference, when setting up a documentation assistant, there should be two spaces:
As the Knowledge base lives in the Assistant Data space, confirm that Can consume data and Can view permissions are set:
For more information, see Qlik Answers use case: Documentation assistant.
In addition, if the Documentation assistant is consuming data from a Direct Access Gateway, confirm that the Assistant users have the Can consume data permission for the space where the Direct Access Gateway is installed.