Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
Reactively fixes technical issues as well as answers narrowly defined specific questions. Handles administrative issues to keep the product up-to-date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us.
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical to daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
When copying data from MSSQL to Azure SQL tables, the copy fails with the error:
The metadata for source table 'tabel_name' is different than the corresponding MS-CDC Change Table. The table will be suspended.
Verify if the tables you are replicating are temporal or system tables. Temporal or system tables are not supported by Qlik Replicate. See Limitations and considerations for details.
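A quick way to check is to query the SQL Server catalog views on the source. The sketch below is illustrative only: it assumes a pyodbc connection with placeholder credentials, while the sys.tables query itself is standard SQL Server metadata (2016 and later).
# Illustrative sketch: list temporal (system-versioned) tables and their history tables
# on the source database. Connection details are placeholders; adjust to your environment.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server;DATABASE=my_source_db;UID=my_user;PWD=my_password"
)

# temporal_type <> 0 marks history tables (1) and system-versioned temporal tables (2)
query = """
SELECT s.name AS schema_name, t.name AS table_name, t.temporal_type_desc
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.temporal_type <> 0;
"""

for schema_name, table_name, temporal_type in conn.cursor().execute(query):
    print(f"{schema_name}.{table_name}: {temporal_type}")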
If you want to capture changes to these tables with MS-CDC and Qlik Replicate, then you have to unhide the system-generated columns:
ALTER TABLE <the table name> ALTER COLUMN [SysStartTime] DROP HIDDEN;
ALTER TABLE <the table name> ALTER COLUMN [SysEndTime] DROP HIDDEN;
Depending on how the table was created, the hidden column names may be different, such as ValidFrom, ValidTo.
If you don't want to make the above change, you can use the ODBC with CDC endpoint and capture both the base table and the history table using SysStartTime as the context column.
See Qlik Replicate: W: The metadata for source table 'dbo.table' is different than the corresponding MS-CDC Change Table for details.
The following error may be encountered in Qlik Replicate when reading from an Oracle Standby database node:
[SOURCE_CAPTURE ]E: Cannot create Oracle directory name 'ATTUREP_9C9D285sample_directory' with path '/RDSsamplefilepath/db/node_C/archive' [-1] (oradcdc_bfilectx.c:165)
Qlik Replicate accesses Oracle archive logs through Oracle directories from the file path assigned to the node, as retrieved from the v$Archived_log view. The mentioned error occurs when the Qlik Replicate task is unable to use the Oracle directory and file path set in the DB. In this instance, Qlik Replicate attempts to create its own custom directory.
If the user does not have Create Any Directory permissions, then this error occurs.
Read permissions on the file path of the Oracle directory are required; otherwise, the task will remain unable to access the archive logs, even when permissions to the Oracle directory are provided.
See Access privileges when using Replicate Log Reader to access the redo logs for details.
Example:
When working with the standby (secondary) node C, the Oracle user will not have default permissions to the Oracle Directory and File Path. Giving permissions to just the Oracle Directory is not enough for the task to access the File Path. Read permissions must be given to both ARCHIVELOG_DIR_C and abc_C/arch in this example:
Provide Read permissions to both the Oracle Directory and the file path in use.
The task was missing Read permissions on the file path of the Oracle Directory.
Qlik Cloud Analytics customers can now easily include highly formatted tabular or PixelPerfect reports in their automations with two new blocks:
Report developers can create highly formatted report templates to meet stakeholders' analytics presentation requirements. Using these new blocks, automation developers can then easily configure report production from a target Qlik Sense app, using selection states to produce a report output that can be used in the business process definition.
This article outlined the steps needed to prevent data loss and how to resume tasks in Qlik Replicate after moving an Oracle Database.
This step is crucial to prevent any changes from being missed.
Facebook Ads integration extraction fails with the following error:
SingerSyncError POST: 400 Message: (#3018) The start date of the time range cannot be beyond 37 months from the current date.
If you are suddenly seeing this error, it is likely due to resetting the integration or table while having an older Start Date.
To resolve, change the Start Date in your Facebook Ads integration settings to a value within the last 37 months. This aligns with Facebook's current policy and allows the integration to function properly.
If you have any questions about this limitation or to discuss potential options for accessing older data, please contact Facebook. They may have additional insights or alternative solutions for businesses needing to access older advertising data.
Going forward, it's important to be aware of this 37-month limitation when working with Facebook Ads data, especially when setting up or resetting integrations. Regular data backups or exports might be advisable to retain historical data beyond this window for long-term analysis and reporting needs.
Can we find out who changed the date?
Qlik Stitch does not track this type of user activity. You will need to check with other users in your organisation.
This is a Facebook Ads API limitation documented by Stitch.
To investigate a task failure, it is necessary to collect the Diagnostics Package from Qlik Cloud Data Integration.
Option Two: Monitor view within the task
Often, Support will request that specific logging components be increased to Verbose or Trace in order to troubleshoot effectively. To modify them, click Logging options in the right-hand corner of the logs view. The options presented in the UI do not use the same terminology as the logs themselves. For a better understanding, refer to this mapping:
UI | Logs
Source - full load | SOURCE_UNLOAD
Source - CDC | SOURCE_CAPTURE
Source - data | SOURCE_UNLOAD, SOURCE_CAPTURE, SOURCE_LOG_DUMP, DATA_RECORD
Target - full load | TARGET_LOAD
Target - CDC | TARGET_APPLY
Target - Upload | FILE_FACTORY
Extended CDC | SORTER, SORTER_STORAGE
Performance | PERFORMANCE
Metadata | SERVER, TABLES_MANAGER, METADATA_MANAGER, METADATA_CHANGES
Infrastructure | IO, INFRASTRUCTURE, STREAM, STREAM_COMPONENT, TASK_MANAGER
Transformation | TRANSFORMATION
Please note that if the View task logs option is not present in the dropdown menu, it indicates that the type of task you are working with does not have available task logs. In the current design, only Replication and Landing tasks have task logs.
This article gives an overview of the available blocks in the dbt Cloud connector in Qlik Application Automation.
The purpose of the dbt Cloud connector is to let you schedule or trigger your dbt jobs from an automation in Qlik Sense SaaS.
Authentication to dbt Cloud happens through an API key. The API key can be found in the user profile when logged into dbt Cloud under API Settings. Instructions on the dbt documentation are at: https://docs.getdbt.com/dbt-cloud/api-v2#section/Authentication
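As a quick way to confirm the API key works outside of an automation, the following sketch calls the dbt Cloud v2 API directly with Python; the key value is a placeholder and error handling is kept minimal.
# Illustrative sketch: verify a dbt Cloud API key by listing the accounts it can access.
import requests

DBT_CLOUD_API = "https://cloud.getdbt.com/api/v2"
API_KEY = "<your dbt Cloud API key>"  # placeholder

response = requests.get(
    f"{DBT_CLOUD_API}/accounts/",
    headers={"Authorization": f"Token {API_KEY}"},
)
response.raise_for_status()
for account in response.json()["data"]:
    print(account["id"], account["name"])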
The blocks we have available are built around the objects Jobs and Runs. Furthermore, for easy use of the connector, there are helper blocks for accounts and projects. For any gaps, raw API request blocks give end users more freedom where our blocks do not suffice.
Blocks for Jobs:
Blocks for Runs:
The following automation, which is added as an attachment and shown as an image, runs a job in dbt Cloud and, if successful, reloads an app in Qlik Sense SaaS. It always sends out an email, which can of course be changed to a different channel. It would also be possible to extend this to multiple dbt jobs.
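For reference, the sketch below shows roughly what the automation does when expressed as raw REST calls in Python. It is not the automation itself: all IDs, URLs, and keys are placeholders, and it assumes the dbt Cloud v2 job/run endpoints and the Qlik Cloud reloads API.
# Illustrative sketch only: trigger a dbt Cloud job, wait for it to finish,
# and reload a Qlik Sense SaaS app on success. All identifiers are placeholders.
import time
import requests

DBT_CLOUD_API = "https://cloud.getdbt.com/api/v2"
DBT_HEADERS = {"Authorization": "Token <dbt Cloud API key>"}
ACCOUNT_ID, JOB_ID = 12345, 67890

QLIK_TENANT = "https://your-tenant.us.qlikcloud.com"
QLIK_HEADERS = {"Authorization": "Bearer <Qlik Cloud API key>"}
APP_ID = "<Qlik Sense app id>"

# 1. Trigger the dbt Cloud job
run = requests.post(
    f"{DBT_CLOUD_API}/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
    headers=DBT_HEADERS,
    json={"cause": "Triggered from Qlik automation example"},
).json()["data"]

# 2. Poll the run until it completes
while True:
    run = requests.get(
        f"{DBT_CLOUD_API}/accounts/{ACCOUNT_ID}/runs/{run['id']}/",
        headers=DBT_HEADERS,
    ).json()["data"]
    if run["is_complete"]:
        break
    time.sleep(30)

# 3. On success, reload the Qlik Sense app
if run["is_success"]:
    requests.post(
        f"{QLIK_TENANT}/api/v1/reloads",
        headers=QLIK_HEADERS,
        json={"appId": APP_ID},
    ).raise_for_status()
The email notification would then be a separate step, just as in the attached automation.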
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
This article aims to answer the following questions:
Stitch is a cloud-based ETL platform, which means it is not real-time and may experience latency due to the nature of cloud infrastructure and its step-based processing model.
Stitch’s replication process consists of three independent steps:
Extraction → Preparation → Loading
Each step takes time to complete and is influenced by various factors.
For more information, see: Stitch’s Replication Process | stitchdata.com
The speed and efficiency of Stitch’s replication process can be affected by:
These factors can vary over time and across integrations, which is why replication durations are not always predictable.
The replication frequency determines how often Stitch initiates a new extraction job (when one isn’t already in progress). Stitch tracks your tables and updates them based on the replication method you’ve selected.
However, this frequency does not guarantee that data will be prepared and loaded within the same time window. For example, a 30-minute frequency does not mean the full replication cycle completes in 30 minutes.
Stitch extracts one table at a time per integration (sequentially). It must finish extracting one table before moving to the next.
Once data is extracted, Stitch begins the preparation phase, which involves transforming records into rectangular (table-like) staging files. This step is batch-based and starts as soon as data is returned from the source. The duration of this phase depends on the structure and volume of the data.
Stitch can load up to 5 tables concurrently per destination. If 5 tables are already loading, others must wait until a slot becomes available. For example, with 10 integrations and 20 tables each, Stitch will load 5 tables at a time per destination.
Stitch’s loading systems check every 15–20 minutes for batches of records that are fully prepared and ready to be loaded into your destination.
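Putting those steps together, here is a rough, purely illustrative calculation of why a 30-minute replication frequency does not guarantee data in the destination every 30 minutes (all durations are made up):
# Illustrative only: end-to-end time for one cycle with made-up durations.
extraction_minutes = 25    # one table at a time per integration
preparation_minutes = 10   # transforming records into staging files
loader_wait_minutes = 20   # worst case wait for the 15-20 minute loader check
loading_minutes = 15       # actual load, subject to the 5-tables-per-destination limit

total = extraction_minutes + preparation_minutes + loader_wait_minutes + loading_minutes
print(f"End-to-end: about {total} minutes, even with a 30-minute replication frequency")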
What may appear as missing data is often just incomplete processing. Most data discrepancies resolve themselves once Stitch finishes processing.
The Straight Table in Qlik Cloud and Qlik Sense Enterprise on Windows comes with a menu option to Adjust Column Size.
Clicking this option does not have an immediate effect.
What does it do?
Adjust Column Size sets the column into a state that allows you to change its width using your arrow keys. Once in this state, you can use the left and right arrow keys to make the column larger or smaller.
IBM DB2 iSeries connector in Qlik Cloud Data Integration requires setting up a Data Gateway - Data Movement (see Setting up Data Movement gateway) and installing the supported DB2 iSeries ODBC driver on the same server (see Preparing the installation | IBM DB2 for iSeries).
This article aims to guide you through the process.
Currently, Qlik Cloud Data Integration supports DB2i ODBC driver version 1.1.0.26, as can be confirmed by viewing the /opt/qlik/gateway/movement/drivers/manifests/db2iseries.yaml after the data gateway is installed.
cd /opt/qlik/gateway/movement/drivers/bin
sudo mkdir -p /opt/qlik/gateway/movement/drivers/db2iseries
sudo wget -O /opt/qlik/gateway/movement/drivers/db2iseries/ibm-iaccess-1.1.0.26-1.0.x86_64.rpm "https://public.dhe.ibm.com/software/ibmi/products/odbc/rpms/x86_64/ibm-iaccess-1.1.0.26-1.0.x86_64.rpm"
./install db2iseries
sudo systemctl restart repagent
To ensure CDC works with this connector, set the internal property useStorageForStringSize to true. There is a known issue with the BOOLEAN data type and driver version 1.1.0.26, and this parameter will ensure smooth replication. Otherwise, you will see an error like:
[TASK_MANAGER ]I: Starting replication now (replicationtask.c:3500
[SOURCE_CAPTURE ]E: Error parsing [1020109] (db2i_endpoint_capture.c:679
[TASK_MANAGER ]I: Task error notification received from subtask 0, thread 0, status 1020109 (replicationtask.c:3641
[TASK_MANAGER ]W: Task 'TASK_gG1--Wsyl3drvCJf636TqQ' encountered a recoverable error (repository.c:6372)
[SORTER ]I: Final saved task state. Stream position QCDI_TEST:QSQJRN0001:6504, Source id 3, next Target id 1, confirmed Target id 0, last source timestamp 1759171643379185 (sorter.c:772)
[SOURCE_CAPTURE ]E: Error executing source loop [1020109] (streamcomponent.c:1946)
[TASK_MANAGER ]E: Stream component failed at subtask 0, component st_0_EP_SYcKapbEJQZiVETw_g5z4w [1020109] (subtask.c:1504)
[SOURCE_CAPTURE ]E: Stream component 'st_0_EP_SYcKapbEJQZiVETw_g5z4w' terminated [1020109] (subtask.c:1675)
To configure useStorageForStringSize:
The Qlik Talend tESBConsumer component fails to call a web service with the error:
Unable to create message factory for SOAP: Error while searching for service
This may occur after recently upgrading Qlik Talend and moving from JDK 11 to JDK 17. Post upgrade, the following error is encountered when calling a web service using tESBConsumer:
##Log##
tESBConsumer:Failed webservice call- Problem writing SAAJ model to stream
[WARN ] 13:14:53 org.apache.cxf.phase.PhaseInterceptorChain- Interceptor for {http://xmlns.oracle.com/Enterprise/Tools/services}PROCESSREQUEST#{http://xmlns.oracle.com/Enterprise... has thrown exception, unwinding now
org.apache.cxf.binding.soap.SoapFault: Problem writing SAAJ model to stream: Unable to create message factory for SOAP: Error while searching for service [jakarta.xml.soap.MessageFactory]
Caused by: jakarta.xml.soap.SOAPException: Unable to create message factory for SOAP: Error while searching for service [jakarta.xml.soap.MessageFactory]
at jakarta.xml.soap.MessageFactory.newInstance(MessageFactory.java:96) ~[jakarta.xml.soap-api-3.0.2.jar:3.0.2]
at org.apache.cxf.binding.soap.saaj.SAAJFactoryResolver.createMessageFactory(SAAJFactoryResolver.java:57) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
at org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor.getFactory(SAAJOutInterceptor.java:86) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
at org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor.handleMessage(SAAJOutInterceptor.java:122) ~[cxf-rt-bindings-soap-4.1.0.jar:4.1.0]
... 16 more
To resolve this, the necessary SAAJ implementation and its dependencies must be explicitly provided as separate libraries in the application's classpath.
The jakarta.xml.soap.SOAPException 'Unable to create message factory for SOAP' error specifically mentions an error while searching for the service [jakarta.xml.soap.MessageFactory]. This indicates a problem with the availability or configuration of the SAAJ (SOAP with Attachments API for Java) implementation.
This issue commonly arises in environments using Java 11 or later, as JAX-WS and its related technologies, such as SAAJ, were removed from the standard Java Development Kit (JDK) in these versions.
Long access times to Qlik Cloud, or long reload times when using the Direct Access Gateway, may be caused by high network latency between your location and the AWS datacenter hosting the affected tenant.
A possible browser test to check the network connection from a computer to AWS can be found at AWS Latency Test.
This test does not involve Qlik Cloud's platform itself, so it can be used to determine whether an issue is purely network-related.
This is just a first-step tool, not a conclusive one. A latency of 300 ms or more might still not affect the user experience in Qlik Cloud, but the tool is worth running for a quick first assessment.
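If you prefer to measure from a specific machine (for example, the server running the Direct Access gateway), a minimal Python sketch such as the following can give a rough round-trip figure; the tenant URL is a placeholder, and the result includes TLS handshake overhead, so treat it as an upper bound rather than pure network latency.
# Rough first check: HTTPS round-trip time from this machine to the tenant.
import time
import requests

TENANT_URL = "https://your-tenant.us.qlikcloud.com"  # placeholder

samples = []
for _ in range(5):
    start = time.perf_counter()
    requests.get(TENANT_URL, timeout=10)
    samples.append((time.perf_counter() - start) * 1000)

print(f"min {min(samples):.0f} ms / avg {sum(samples)/len(samples):.0f} ms / max {max(samples):.0f} ms")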
How to verify and measure transfer speed from Direct Access gateway to Qlik Cloud
In a setup with a local and remote Qlik Replicate server, the remote server's IP address has changed. Does this IP address change require any additional configuration steps?
To avoid any issues after an IP address change:
Is it possible to limit the available output formats for an OnDemand report, such as only allowing PDF rather than allowing multiple formats?
Qlik NPrinting On-Demand Reports cannot be limited to a certain format (such as PDF). The Qlik Sense On-Demand reporting object will continue to present all available formats to export On-Demand reports to. This is by design.
In some cases, the otherwise correctly enabled Qlik NPrinting audit feature fails to produce output. For information on how to correctly enable auditing, see Audit trail.
On the Qlik NPrinting Server:
Always export and backup your client_audit certificate before proceeding.
This will result in the creation of a new certificate and enable the audit service to work normally once again.
The client_audit certificate that secures the Qlik NPrinting auditing service has become corrupted or contains an invalid signature.
The Qlik Analytics Migration Tool provides a structured and repeatable process for migrating analytics assets to Qlik Cloud. It enables teams to define migration plans, validate each step, and coordinate across roles to ensure secure and accurate execution.
Qlik Cloud provides continuous innovation, centralized governance, and integrated AI that is not available in on-premises environments. Customers migrating to Qlik Cloud immediately benefit from:
Learn More, Ask Questions, Share Experiences
The Qlik Community is the central place to get answers, exchange ideas, and collaborate with peers using the Qlik Analytics Migration Tool. Whether you're preparing for a full environment shift or running phased pilots, we encourage you to share your approach and learn from others moving to Qlik Cloud in the Move to Cloud forum.
Over time, attrep_changes tables will contribute significantly to the overall storage cost, as the image of a dropped table persists in Snowflake storage for the purpose of Time Travel and Fail-safe (see Working with Temporary and Transient Tables | docs.snowflake.com for details).
Create the attrep_changes table as a transient table, which has limited to no support for Time Travel and Fail-safe. This needs to be done in Snowflake.
Example:
CREATE TRANSIENT SCHEMA IF NOT EXISTS TRANSIENT_SCHEMA
DATA_RETENTION_TIME_IN_DAYS = 0
MAX_DATA_EXTENSION_TIME_IN_DAYS = 0
COMMENT = 'Transient Schema for Qlik Replicate Control Tables';
No out-of-the-box solution exists in Qlik Replicate for a configurable option to create the attrep_changes table as a transient one. Log an Idea with Qlik if this is a requirement.
If users are behind a proxy, the Qlik Web Connector may return the following errors:
Under these circumstances, the user has to configure the proxy from the deploy.config file, which is normally located in the root of your Qlik Web Connectors folder.
Disabling the proxy:
This is the best approach for Error 503 Service Unavailable errors.
Open the deploy.config file, search for Proxy in order to locate the settings and configure them as below.
<Proxy>
  <UseProxy>false</UseProxy>
  <ProxyAddress></ProxyAddress>
  <ProxyUsername></ProxyUsername>
  <ProxyDomain></ProxyDomain>
  <ProxyPassword></ProxyPassword>
</Proxy>
Configuring the proxy (setting UseProxy to true):
This is the best approach for 407 proxy authentication errors.
If you are behind a proxy, set UseProxy to true; you might also need to enter the proxy credentials, as below.
<Proxy>
  <UseProxy>true</UseProxy>
  <ProxyAddress>proxy.sub-domain.mymaindomain.com:port</ProxyAddress>
  <ProxyUsername>username</ProxyUsername>
  <ProxyDomain></ProxyDomain>
  <ProxyPassword>password</ProxyPassword>
</Proxy>
The Data Load Editor in Qlik Sense Enterprise on Windows 2025 experiences noticeable performance issues.
The issue is caused by defect SUPPORT-6006. Qlik is actively working on a fix.
A fix is planned for an upcoming patch. Review the Release Notes for SUPPORT-6006.
A workaround is available. It is viable as long as the Qlik SAP Connector is not in use.
No service restart is required.
SUPPORT-6006