Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to ensure you get the best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) Reach out to your Account Manager or Customer Success Manager.
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical to daily operations.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Facebook Ads integration extraction fails with the following error:
SingerSyncError POST: 400 Message: (#3018) The start date of the time range cannot be beyond 37 months from the current date.
If you are suddenly seeing this error, it is likely due to resetting the integration or table while having an older Start Date.
To resolve, change the Start Date in your Facebook Ads integration settings to a value within the last 37 months. This aligns with Facebook's current policy and allows the integration to function properly.
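If it helps to sanity-check the new value, the short sketch below (a generic illustration, not part of Stitch or the Facebook API) computes a Start Date that is guaranteed to fall inside a 37-month window:

# Generic illustration: pick a Start Date safely inside the 37-month window.
from datetime import date

def earliest_safe_start(today: date, max_months: int = 37) -> date:
    # Conservatively use the first day of the month (max_months - 1) months
    # back, which is always within the limit regardless of the day of month.
    month_index = today.year * 12 + (today.month - 1) - (max_months - 1)
    return date(month_index // 12, month_index % 12 + 1, 1)

print(earliest_safe_start(date.today()))  # use this date or any later one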
If you have any questions about this limitation or to discuss potential options for accessing older data, please contact Facebook. They may have additional insights or alternative solutions for businesses needing to access older advertising data.
Going forward, it's important to be aware of this 37-month limitation when working with Facebook Ads data, especially when setting up or resetting integrations. Regular data backups or exports might be advisable to retain historical data beyond this window for long-term analysis and reporting needs.
Can we find out who changed the date?
Qlik Stitch does not track this type of user activity. You will need to check with other users in your organisation.
This is a Facebook Ads API limitation documented by Stitch.
To investigate a task failure, it is necessary to collect the Diagnostics Package from Qlik Cloud Data Integration.
Option Two: Monitor view within the task
Often, Support will request that specific logging components be increased to Verbose or Trace in order to troubleshoot effectively. To modify them, click Logging options in the right-hand corner of the logs view. The options presented in the UI do not use the same terminology as the logs themselves. For reference, use the following mapping:
UI | Logs
Source - full load | SOURCE_UNLOAD
Source - CDC | SOURCE_CAPTURE
Source - data | SOURCE_UNLOAD, SOURCE_CAPTURE, SOURCE_LOG_DUMP, DATA_RECORD
Target - full load | TARGET_LOAD
Target - CDC | TARGET_APPLY
Target - Upload | FILE_FACTORY
Extended CDC | SORTER, SORTER_STORAGE
Performance | PERFORMANCE
Metadata | SERVER, TABLES_MANAGER, METADATA_MANAGER, METADATA_CHANGES
Infrastructure | IO, INFRASTRUCTURE, STREAM, STREAM_COMPONENT, TASK_MANAGER
Transformation | TRANSFORMATION
Please note that if the View task logs option is not present in the dropdown menu, it indicates that the type of task you are working with does not have available task logs. In the current design, only Replication and Landing tasks have task logs.
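If you need a quick overview of which components are actually emitting messages in a downloaded task log, the sketch below (a generic helper, not an official Qlik utility) counts log lines per component based on the bracketed prefix format used in task logs, for example [SOURCE_CAPTURE ]E:

# Count task log lines per component and level, based on the bracketed
# prefix format used in task logs (e.g. "[SOURCE_CAPTURE ]E: ...").
import re
from collections import Counter

pattern = re.compile(r"\[(\w+)\s*\]([A-Z]):")
counts = Counter()

with open("task.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            counts[match.groups()] += 1

for (component, level), count in counts.most_common():
    print(f"{component:<20} {level}  {count}")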
This article gives an overview of the available blocks in the dbt Cloud connector in Qlik Application Automation.
The purpose of the dbt Cloud connector is to schedule or trigger your dbt jobs from an automation in Qlik Sense SaaS.
Authentication to dbt Cloud happens through an API key. The API key can be found in your user profile when logged in to dbt Cloud, under API Settings. Instructions are available in the dbt documentation: https://docs.getdbt.com/dbt-cloud/api-v2#section/Authentication
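For orientation, the connector's blocks are built on top of the dbt Cloud API. A minimal sketch of the same authentication pattern in Python, using the requests library, might look like the following; the account ID, job ID, and base URL are placeholders (the host can differ for regional dbt Cloud instances):

import requests

API_KEY = "<your dbt Cloud API key>"      # from your profile's API Settings
ACCOUNT_ID = 12345                        # placeholder account ID
JOB_ID = 67890                            # placeholder job ID
BASE = "https://cloud.getdbt.com/api/v2"  # placeholder; regional hosts differ

headers = {"Authorization": f"Token {API_KEY}"}

# List the jobs in the account
jobs = requests.get(f"{BASE}/accounts/{ACCOUNT_ID}/jobs/", headers=headers)
jobs.raise_for_status()

# Trigger a run for one job
run = requests.post(
    f"{BASE}/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
    headers=headers,
    json={"cause": "Triggered from a Qlik Application Automation test"},
)
run.raise_for_status()
print(run.json()["data"]["id"])  # run ID, useful for polling the run status

In the connector itself, these calls are exposed as blocks, so no code is required; the sketch only shows what the API key authenticates.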
The available blocks are built around the Jobs and Runs objects. For ease of use, the connector also includes helper blocks for accounts and projects. For any gaps, raw API request blocks give end users more freedom where the standard blocks do not suffice.
Blocks for Jobs:
Blocks for Runs:
The following automation, attached to this article and shown as an image, runs a job in dbt Cloud and, if successful, reloads an app in Qlik Sense SaaS. It always sends out an email; this can of course be changed to a different channel. It would also be possible to extend the automation to multiple dbt jobs:
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
This article aims to answer the following questions:
Stitch is a cloud-based ETL platform, which means it is not real-time and may experience latency due to the nature of cloud infrastructure and its step-based processing model.
Stitch’s replication process consists of three independent steps:
Extraction → Preparation → Loading
Each step takes time to complete and is influenced by various factors.
For more information, see: Stitch’s Replication Process | stitchdata.com
The speed and efficiency of Stitch’s replication process can be affected by:
These factors can vary over time and across integrations, which is why replication durations are not always predictable.
The replication frequency determines how often Stitch initiates a new extraction job (when one isn’t already in progress). Stitch tracks your tables and updates them based on the replication method you’ve selected.
However, this frequency does not guarantee that data will be prepared and loaded within the same time window. For example, a 30-minute frequency does not mean the full replication cycle completes in 30 minutes.
Stitch extracts one table at a time per integration (sequentially). It must finish extracting one table before moving to the next.
Once data is extracted, Stitch begins the preparation phase, which involves transforming the extracted records into rectangular (tabular) staging files. This step is batch-based and starts as soon as data is returned from the source. The duration of this phase depends on the structure and volume of the data.
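As a rough illustration of what a rectangular structure means here (a toy sketch only, not Stitch's actual implementation), nested records are flattened so that every row exposes the same flat set of columns:

# Toy illustration only: flatten a nested record into one flat, rectangular row.
def flatten(record, parent_key="", sep="__"):
    row = {}
    for key, value in record.items():
        column = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten(value, column, sep))
        else:
            row[column] = value
    return row

nested = {"id": 1, "customer": {"name": "Acme", "address": {"city": "Lund"}}}
print(flatten(nested))
# {'id': 1, 'customer__name': 'Acme', 'customer__address__city': 'Lund'}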
Stitch can load up to 5 tables concurrently per destination. If 5 tables are already loading, others must wait until a slot becomes available. For example, with 10 integrations and 20 tables each, Stitch will load 5 tables at a time per destination.
Stitch’s loading systems check every 15–20 minutes for batches of records that are fully prepared and ready to be loaded into your destination.
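To picture the per-destination limit described above (a toy model under the assumption of a simple slot pool, not Stitch's actual scheduler), tables effectively queue for one of five loader slots:

# Toy model only: at most 5 tables load concurrently per destination;
# the remaining tables wait until a slot becomes available.
import threading
import time
from concurrent.futures import ThreadPoolExecutor

slots = threading.BoundedSemaphore(5)  # five loader slots per destination

def load_table(name):
    with slots:
        time.sleep(0.1)  # stand-in for the actual load work
        print(f"loaded {name}")

with ThreadPoolExecutor(max_workers=20) as pool:
    for i in range(20):
        pool.submit(load_table, f"table_{i}")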
What may appear as missing data is often just incomplete processing. Most data discrepancies resolve themselves once Stitch finishes processing.
The Qlik Cloud and Qlik Sense Enterprise on Windows Straight Table come with a menu option to Adjust Column Size.
Clicking this option does not have an immediate effect.
What does it do?
Adjust Column Size sets the column into a state that allows you to change its width using your arrow keys. Once in this state, you can use the left and right arrow keys to make the column larger or smaller.
The IBM DB2 iSeries connector in Qlik Cloud Data Integration requires setting up a Data Gateway - Data Movement (see Setting up Data Movement gateway) and installing the supported DB2 iSeries ODBC driver on the same server (see Preparing the installation | IBM DB2 for iSeries).
This article aims to guide you through the process.
Currently, Qlik Cloud Data Integration supports DB2i ODBC driver version 1.1.0.26, as can be confirmed by viewing the /opt/qlik/gateway/movement/drivers/manifests/db2iseries.yaml after the data gateway is installed.
# Change to the Data Movement gateway driver utility directory
cd /opt/qlik/gateway/movement/drivers/bin
# Create a directory for the DB2 iSeries driver package
sudo mkdir -p /opt/qlik/gateway/movement/drivers/db2iseries
# Download the supported IBM i Access ODBC driver (version 1.1.0.26)
sudo wget -O /opt/qlik/gateway/movement/drivers/db2iseries/ibm-iaccess-1.1.0.26-1.0.x86_64.rpm "https://public.dhe.ibm.com/software/ibmi/products/odbc/rpms/x86_64/ibm-iaccess-1.1.0.26-1.0.x86_64.rpm"
# Install the driver using the gateway's install utility
./install db2iseries
# Restart the Data Movement gateway service to pick up the new driver
sudo systemctl restart repagent
To ensure CDC works with this connector, set the internal property useStorageForStringSize to true. There is a known issue with BOOLEAN datatype and driver version 1.1.0.26, and this parameter will ensure smooth replication. Otherwise, you will see an error like:
[TASK_MANAGER ]I: Starting replication now (replicationtask.c:3500
[SOURCE_CAPTURE ]E: Error parsing [1020109] (db2i_endpoint_capture.c:679
[TASK_MANAGER ]I: Task error notification received from subtask 0, thread 0, status 1020109 (replicationtask.c:3641
[TASK_MANAGER ]W: Task 'TASK_gG1--Wsyl3drvCJf636TqQ' encountered a recoverable error (repository.c:6372)
[SORTER ]I: Final saved task state. Stream position QCDI_TEST:QSQJRN0001:6504, Source id 3, next Target id 1, confirmed Target id 0, last source timestamp 1759171643379185 (sorter.c:772)
[SOURCE_CAPTURE ]E: Error executing source loop [1020109] (streamcomponent.c:1946)
[TASK_MANAGER ]E: Stream component failed at subtask 0, component st_0_EP_SYcKapbEJQZiVETw_g5z4w [1020109] (subtask.c:1504)
[SOURCE_CAPTURE ]E: Stream component 'st_0_EP_SYcKapbEJQZiVETw_g5z4w' terminated [1020109] (subtask.c:1675)
To configure useStorageForStringSize:
The Qlik Talend tESBConsumer component fails to call a web service with the error:
Unable to create message factory for SOAP: Error while searching for service
This may occur after recently upgrading Qlik Talend and moving from JDK 11 to JDK 17. Post upgrade, the following error is encountered when calling a web service using tESBConsumer:
##Log##
tESBConsumer:Failed webservice call- Problem writing SAAJ model to stream
[WARN ] 13:14:53 org.apache.cxf.phase.PhaseInterceptorChain- Interceptor for {http://xmlns.oracle.com/Enterprise/Tools/services}PROCESSREQUEST#{http://xmlns.oracle.com/Enterprise... has thrown exception, unwinding now
org.apache.cxf.binding.soap.SoapFault: Problem writing SAAJ model to stream: Unable to create message factory for SOAP: Error while searching for service [jakarta.xml.soap.MessageFactory]
Caused by: jakarta.xml.soap.SOAPException: Unable to create message factory for SOAP: Error while searching for service [jakarta.xml.soap.MessageFactory]
at jakarta.xml.soap.MessageFactory.newInstance(MessageFactory.java:96) ~[jakarta.xml.soap-api-3.0.2.jar:3.0.2]
at org.apache.cxf.binding.soap.saaj.SAAJFactoryResolver.createMessageFactory(SAAJFactoryResolver.java:57) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
at org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor.getFactory(SAAJOutInterceptor.java:86) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
at org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor.handleMessage(SAAJOutInterceptor.java:122) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
... 16 more
To resolve this, the necessary SAAJ implementation and its dependencies must be explicitly provided as separate libraries in the application's classpath. This typically involves including the following:
The jakarta.xml.soap.SOAPException 'Unable to create message factory for SOAP' error specifically mentions an error while searching for the service [jakarta.xml.soap.MessageFactory]. This indicates a problem with the availability or configuration of the SAAJ (SOAP with Attachments API for Java) implementation.
This issue commonly arises in environments using Java 11 or later, as JAX-WS and its related technologies, such as SAAJ, were removed from the standard Java Development Kit (JDK) in these versions.
Long access times to Qlik Cloud, or slow reloads through the Direct Access Gateway, may be caused by high network latency between your location and the AWS datacenter hosting the affected tenant.
A possible browser test to check the network connection from a computer to AWS can be found at AWS Latency Test.
This test is independent of the Qlik Cloud platform and can therefore be used to determine whether the issue is purely network related.
This is a first-step tool, not a conclusive one. A latency of 300 ms or more might still not affect the user experience in Qlik Cloud, but the tool is a quick way to get a first assessment.
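If you prefer a scripted check from a machine without a browser, a rough HTTPS round-trip timing can give a comparable first impression. This is a generic sketch, not an official Qlik utility; replace the placeholder tenant URL with your own:

# Rough HTTPS round-trip timing against your tenant (placeholder URL).
# Note: each request includes the TLS handshake, so values are higher
# than raw ping latency; compare samples rather than absolute numbers.
import time
import urllib.request

TENANT_URL = "https://your-tenant.region.qlikcloud.com"  # placeholder

samples = []
for _ in range(5):
    start = time.perf_counter()
    urllib.request.urlopen(TENANT_URL, timeout=10).read(0)
    samples.append((time.perf_counter() - start) * 1000)

print(f"min {min(samples):.0f} ms, avg {sum(samples)/len(samples):.0f} ms")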
How to verify and measure transfer speed from Direct Access gateway to Qlik Cloud
In a setup with a local and remote Qlik Replicate server, the remote server's IP address has changed. Does this IP address change require any additional configuration steps?
To avoid any issues after an IP address change:
Is it possible to limit the available output formats for an OnDemand report, such as only allowing PDF rather than allowing multiple formats?
Qlik NPrinting On-Demand Reports cannot be limited to a certain format (such as PDF). The Qlik Sense On-Demand reporting object will continue to present all available export formats. This is by design.
In some cases, the otherwise correctly enabled Qlik NPrinting audit feature fails to produce output. For information on how to correctly enable auditing, see Audit trail.
On the Qlik NPrinting Server:
Always export and back up your client_audit certificate before proceeding.
This will result in the creation of a new certificate and enable the audit service to work normally once again.
The client_audit certificate that secures the Qlik NPrinting auditing service has become corrupted or contains an invalid signature.
The Qlik Analytics Migration Tool provides a structured and repeatable process for migrating analytics assets to Qlik Cloud. It enables teams to define migration plans, validate each step, and coordinate across roles to ensure secure and accurate execution.
Content
Qlik Cloud provides continuous innovation, centralized governance, and integrated AI that is not available in on-premises environments. Customers migrating to Qlik Cloud immediately benefit from:
Learn More, Ask Questions, Share Experiences
The Qlik Community is the central place to get answers, exchange ideas, and collaborate with peers using the Qlik Analytics Migration Tool. Whether you're preparing for a full environment shift or running phased pilots, we encourage you to share your approach and learn from others moving to Qlik Cloud in the Move to Cloud forum.
Over time, attrep_changes tables can contribute significantly to overall storage costs, because the image of a dropped table persists in Snowflake storage for Time Travel and Fail-safe purposes (see Working with Temporary and Transient Tables | docs.snowflake.com for details).
Create the attrep_changes table as a transient table, which has limited or no support for Time Travel and Fail-safe. This needs to be done in Snowflake.
Example:
-- Transient schema: no Fail-safe, and Time Travel disabled via zero retention
CREATE TRANSIENT SCHEMA IF NOT EXISTS TRANSIENT_SCHEMA
  DATA_RETENTION_TIME_IN_DAYS = 0
  MAX_DATA_EXTENSION_TIME_IN_DAYS = 0
  COMMENT = 'Transient Schema for Qlik Replicate Control Tables';
No out-of-the-box solution exists in Qlik Replicate for a configurable option to create the attrep_changes table as a transient one. Log an Idea with Qlik if this is a requirement.
If users are behind a proxy, the Qlik Web Connector may return the following errors:
Under these circumstances, the user has to configure the proxy from the deploy.config file, which is normally located in the root of your Qlik Web Connectors folder.
Disabling the proxy:
This is the best approach for Error 503 Service Unavailable errors.
Open the deploy.config file, search for Proxy in order to locate the settings and configure them as below.
<Proxy>
  <UseProxy>false</UseProxy>
  <ProxyAddress></ProxyAddress>
  <ProxyUsername></ProxyUsername>
  <ProxyDomain></ProxyDomain>
  <ProxyPassword></ProxyPassword>
</Proxy>
Configuring the proxy (setting UseProxy to true):
This is the best approach for 407 proxy authentication errors.
If you are behind a proxy, you just need to set UseProxy to true, and you might also need to enter the proxy address and credentials, as shown below.
<Proxy>
  <UseProxy>true</UseProxy>
  <ProxyAddress>proxy.sub-domain.mymaindomain.com:port</ProxyAddress>
  <ProxyUsername>username</ProxyUsername>
  <ProxyDomain></ProxyDomain>
  <ProxyPassword>password</ProxyPassword>
</Proxy>
The Data Load Editor in Qlik Sense Enterprise on Windows 2025 experiences noticeable performance issues.
The issue is caused by defect SUPPORT-6006. Qlik is actively working on a fix.
A fix is planned for the earliest possible patch release. Review the Release Notes for SUPPORT-6006.
A workaround is available. It is viable as long as the Qlik SAP Connector is not in use.
No service restart is required.
SUPPORT-6006
Qlik Sense and Vulnerability “CVE-2025-7783” in NPM Library form-data
In mid-July 2025, a vulnerability was disclosed in the NPM library form-data (GitHub Security Advisory). Qlik became aware of this issue through its standard Secure Development Lifecycle (SDL) processes.
Following an internal review, Qlik R&D and Security teams identified that potentially vulnerable versions of the form-data library were included in some installations of Qlik Sense Enterprise for Windows. However, due to the specific way Qlik utilizes this library, the conditions required for exploitation are not met.
Although the vulnerability was determined to be non-exploitable within Qlik Sense, customers who prefer to upgrade to a version that includes the patched form-data library can do so by installing one of the following releases:
Note: An earlier version of this information was mistakenly published indicating that this CVE was directly related to Qlik Sense for Windows.
This article provides an overview of how to send straight table data to Microsoft Teams as a table using Qlik Automate.
The template is available in the template picker. You can find it by navigating to Add new -> New automation -> Search templates, searching for 'Send straight table data to Microsoft Teams as a table' in the search bar, and clicking the Use template option.
You will find a version of this automation attached to this article: "Send-straight-table-data-to-Microsoft-Teams-as-a-table.json".
Content:
The following steps describe how to build the demo automation:
An example output of the table sent to the Teams channel:
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Installing, upgrading, and managing the Qlik Cloud Monitoring Apps has just gotten a whole lot easier! With two new Qlik Application Automation templates coupled with Qlik Data Alerts, you can now:
The above allows you to deploy the monitoring apps to your tenant with a hands-off approach. Dive into the individual components below.
Some monitoring apps are designed for specific Qlik Cloud subscription types. Refer to the compatibility matrix within the Qlik Cloud Monitoring Apps repository.
Content:
This automation template is a fully guided installer/updater for the Qlik Cloud Monitoring Applications, including but not limited to the App Analyzer, Entitlement Analyzer, Reload Analyzer, and Access Evaluator applications. Leverage this automation template to quickly and easily install and update these applications, or a subset of them, with all their dependencies. The applications themselves are community-supported; they are provided through Qlik's Open-Source Software GitHub and are therefore subject to Qlik's open-source guidelines and policies.
For more information, refer to the GitHub repository.
Note that if the monitoring applications have been installed manually (i.e., not through this automation), they will not be detected as existing, and the automation will install new copies side by side. Any subsequent executions of the automation will detect the newly installed monitoring applications and check their versions. This is because the applications are tagged with "QCMA - {appName}" and "QCMA - {version}" during installation through the automation; manually installed applications will not have these tags and therefore will not be detected.
This template is intended to be used alongside the Qlik Cloud Monitoring Apps for user-based subscriptions template. It provides the ability to keep the API key and associated data connection used for the Qlik Cloud Monitoring Apps up to date on a scheduled basis. Simply input the ID of the space where the monitoring_apps_REST data connection should reside, and the automation will recreate both the API key and the data connection regularly. Ensure that the automation is scheduled to run more frequently than the API key expires.
Enter the ID of the space where the monitoring_apps_REST data connection should reside.
Ensure that this automation is run off-hours from your scheduled monitoring application reloads so it does not disrupt the reload process.
Each Qlik Cloud Monitoring App has the following two variables:
With these variables, we can create a new Qlik Data Alert on a per-app basis. For each monitoring app that you want to be notified on if it falls out of date:
Here is an example of an alert received for the App Analyzer, showing that at this point in time, the latest version of the application is 5.1.3 and that the app is out of date:
Q: Can I re-run the installer to check if any of the monitoring applications are able to be upgraded to a later version?
A: Yes. Run the installer, select which applications should be checked and select the space that they reside in. If any of the selected applications are not installed or are upgradeable, a prompt will appear to continue to install/upgrade for the relevant applications.
Q: What if multiple people install monitoring applications in different spaces?
A: The template scopes the applications install process to a “target” space, i.e., a shared space (if not published) or a managed space. It will scope the API key name to `QCMA – {spaceId}` of that target space. This allows the template to install/update the monitoring applications across spaces and across users. If one user installs an application to “Space A” and then another user installs a different monitoring application to “Space A”, the template will see that a data connection and associated API key (in this case from another user) exists for that space already and it will install the application leveraging those pre-existing assets.
Q: What if a new monitoring application is released? Will the template provide the ability to install that application as well?
A: Yes. The template receives the list of applications dynamically from GitHub. If a new monitoring application is released, it will become available immediately through the template.
Q: I would like to be notified whenever a new version of a monitoring application is released. Can this template do that?
A: As per the article above, the automation templates are not responsible for notifying you when the applications are out of date. This is achieved using Qlik Data Alerts on a per-application basis, as described in Part 3.
Q: I have updated my application, but I noticed that it did not preserve the history. Why is that?
A: The history is preserved in the prior version's QVDs, so the data is never deleted and can still be loaded into the older version. Each upgrade generates a new set of QVDs, as the data models for the applications sometimes change due to bug fixes, updates, new features, etc. If you want to preserve the history when updating, the application can be upgraded with the "Publish side-by-side" method so that the older version of the application remains as an archival application. However, note that the Qlik Data Alert (from Part 3) will need to be recreated, and any community content created on the older application will not be transferred to the new application.