Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to ensure the best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
We're happy to help! Here's a breakdown of resources for each type of need.
| Support | Professional Services (*) |
| --- | --- |
| Reactively fixes technical issues and answers narrowly defined, specific questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement. |
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about products and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical to daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
How to intercept and customize logging for API call details (like response time) within Talend Data Integration (DI) jobs, specifically for components like tRestRequest and tRestResponse.
The goal is to route these specific, detailed logs directly to the talend esb.log file or to an ELK stack.
Existing Functionality: The required interception logic (logging details before/after a request) is already handled by the Service Activity Monitoring (SAM) module inside the ESB Karaf container. Documentation on SAM: introduction-to-service-activity-monitoring | Qlik Talend Help
Recommendation: Because of the existing SAM functionality, there is no need to re-implement this low-level interception logic within the design of the DI job itself.
You can instead build your own automated log routing that ingests event data directly from the existing SAM event database and sends it to your ELK stack (a minimal sketch of this approach is shown below).
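As an illustration only, the sketch below shows how such routing could look, assuming a relational SAM event store and an Elasticsearch index. The table name, column names, database driver, and Elasticsearch URL are hypothetical placeholders, not the actual SAM schema or product settings; adapt them to your environment.

```python
# Illustrative sketch: poll a SAM event store and forward rows to Elasticsearch.
# Table/column names (sam_events, event_timestamp, ...) and the ELK URL are
# hypothetical placeholders; replace them with your actual SAM schema and endpoint.
import json
import sqlite3  # stand-in driver; use the driver matching your SAM database
import urllib.request

ELK_URL = "http://localhost:9200/sam-events/_doc"  # assumed Elasticsearch endpoint


def fetch_new_events(conn, last_id):
    # Hypothetical query against the SAM event store
    cur = conn.execute(
        "SELECT id, event_timestamp, flow_id, event_type, custom_info "
        "FROM sam_events WHERE id > ? ORDER BY id",
        (last_id,),
    )
    return cur.fetchall()


def ship_to_elk(row):
    # Index one event document into Elasticsearch
    doc = {
        "id": row[0],
        "timestamp": row[1],
        "flow_id": row[2],
        "event_type": row[3],
        "custom_info": row[4],
    }
    req = urllib.request.Request(
        ELK_URL,
        data=json.dumps(doc).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    conn = sqlite3.connect("sam_events.db")  # replace with your SAM DB connection
    for event in fetch_new_events(conn, last_id=0):
        ship_to_elk(event)
```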
For more detailed or advanced log customization around service request interception, you can submit a formal feature request via the official platform: Feature Request Link | Qlik Ideation
The Qlik Talend Professional Services team can provide custom solutions.
When using Google Cloud Storage as a target in Qlik Replicate, and the target File Format is set to Parquet, an error may occur if the incoming data contains invalid values.
This happens because the Parquet writer validates data during the CSV-to-Parquet conversion. A typical error looks like:
[TARGET_LOAD ]E: Failed to convert file from csv to parquet
Error:: failed to read csv temp file
Error:: std::exception [1024902] (file_utils.c:899)
There are two possible solutions:
In this case, the source is SAP Oracle, and a few rare rows contained invalid date values, for example 2023-11-31 (November has only 30 days, so the value cannot be converted).
By enabling the internal parameters keepCSVFiles and keepErrorFiles in the target endpoint (both set to TRUE), you can inspect the generated CSV files to identify which rows contain invalid data.
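If the CSV files are large, a small script can help locate the offending rows. The sketch below is only an illustration, assuming the kept CSV files live in a known folder and the date column sits at a known position; adjust the path, format, and column index to your task.

```python
# Illustrative sketch: scan the CSV files kept by keepCSVFiles/keepErrorFiles for
# date values that cannot be parsed (e.g. 2023-11-31).
# The folder path and the date column index are assumptions; adjust to your task.
import csv
import glob
from datetime import datetime

CSV_GLOB = "/path/to/kept/csv/files/*.csv"  # assumed location of the kept CSV files
DATE_COLUMN_INDEX = 3                       # assumed position of the date column

for path in glob.glob(CSV_GLOB):
    with open(path, newline="", encoding="utf-8") as f:
        for line_no, row in enumerate(csv.reader(f), start=1):
            if len(row) <= DATE_COLUMN_INDEX:
                continue  # skip short or malformed rows
            value = row[DATE_COLUMN_INDEX]
            try:
                datetime.strptime(value, "%Y-%m-%d")  # expected source date format
            except ValueError:
                print(f"{path}:{line_no}: invalid date value {value!r}")
```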
00417320
This article guides you through configuring the tRest component to connect to a RESTful service that requires an SSL client certificate issued by an NPE (Non-Person Entity).
tRest does not have its own GUI for certificate management; instead, it primarily routes HTTP calls to the underlying Java HttpClient or CXF client. Therefore, the certificate setup must be completed at the Java keystore level before the component can run.
Here's how to set it up:
1. Convert your certificate to a Java keystore
If you have your certificate in .pfx or .p12 format:
keytool -importkeystore \
-srckeystore mycert.p12 \
-srcstoretype PKCS12 \
-destkeystore mykeystore.jks \
-deststoretype JKS
You will be asked to enter a password; make sure to remember it as you will need it in Step 2.
2. Tell the Talend Job (Java) to use your certificate
In Talend Studio, go to Run → Advanced settings for your job.
In the JVM Setting, select the 'Use specific JVM arguments' option, and add:
-Djavax.net.ssl.keyStore="C:/path/to/mykeystore.jks"
-Djavax.net.ssl.keyStorePassword=yourpassword
-Djavax.net.ssl.trustStore="C:/path/to/mytruststore.jks"
-Djavax.net.ssl.trustStorePassword=trustpassword
The truststore contains the Certificate Authority (CA) that issued the server's certificate. If you don't have one, you can generate it with keytool -import from the server's public certificate.
3. Use tRest normally
Now, when tRest makes the HTTPS request, Java’s SSL layer will automatically present your client certificate and validate the server cert.
In Talend Administration Center environments with a high number of tasks (700–800) or frequent task cycles (e.g., executions every minute), the task execution history table in the Talend Administration Center (TAC) database may grow rapidly.
A large history table can negatively impact overall Talend Administration Center performance, leading administrators to manually truncate the table to maintain stability.
Talend Administration Center provides configuration parameters to automatically purge old execution history and maintain system performance.
TaskExecutionHistoryCleanUp
Talend Administration Center includes two configuration parameters in the Talend Administration Center database configuration table that control the cleanup process.
These parameters allow administrators to adjust:
Both parameters must be tuned to effectively control table size.
| Description | Interval (in seconds) between cleanup operations |
| Default value | 3600 (1 hour) |
| Behavior | Talend Administration Center performs a cleanup every 3600 seconds. Setting value to 0 disables automatic cleanup |
The default value is 3600 seconds, which corresponds to 1 hour. After the time defined in this parameter has elapsed, the system cleans up old task execution history and old misfired task execution records. In other words, with the default value the cleanup runs once per hour rather than immediately after each execution.
Lower the value to check more frequently for records that need to be deleted. Set it to 0 to disable the delete actions.
| Description | Maximum retention time (in seconds) before records are purged |
| Default value | 1,296,000 seconds (15 days: 15 × 24 × 60 × 60 = 1,296,000) |
| Behavior | During each cleanup cycle, Talend Administration Center deletes records older than the retention period. |
Both parameters are documented below:
improving-task-execution-history-performances | Qlik Talend Help
When attempting to execute a Talend Management Console (TMC) task using a Service Account via the Talend Management Console API, users may encounter an HTTP 403 Forbidden response—even if the Service Account is correctly configured.
When attempting to execute a task using the Processing API endpoint:
POST https://api.<region>.cloud.talend.com/processing/executions
the API returns HTTP 403 Forbidden.
This issue typically arises when the necessary permissions for task execution are not granted prior to generating the service account token, or when the service account lacks specific functional permissions pertaining to task execution.
The token generated via:
POST /security/oauth/token
is valid.
The Service Account permissions appear to include:
TMC_ENGINE_USE
TMC_ROLE_MANAGEMENT
TMC_SERVICE_ACCOUNT_MANAGEMENT
AUDIT_LOGS_VIEW
TMC_USER_MANAGEMENT
TMC_CLUSTER_MANAGEMENT
According to the documentation Using a service account to run tasks | Qlik Help Center, the Service Account must possess either TMC_ENGINE_USE or TMC_OPERATOR permissions; however, even with these permissions, the execution still fails.
Navigate to Talend Management Console→ Users & Security → Service Accounts, and ensure the Service Account has the permission: Tasks and Plans – Edit
After updating the permissions, regenerate the service account token.
This ensures that the token contains the updated permission set. Subsequently, rerunning the task via the API will work.
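As a hedged illustration, the snippet below calls the Processing API with the regenerated token. The region, token, and task (executable) ID are placeholders, and the exact payload field names should be verified against the official Talend Cloud API reference.

```python
# Illustrative sketch: trigger a task run through the Processing API using the
# regenerated service account token. Region, token, and task ID are placeholders;
# verify the request body against the official Talend Cloud API reference.
import json
import os
import urllib.request

REGION = "us"                        # replace with your region
TOKEN = os.environ["TMC_TOKEN"]      # token regenerated after the permission change
TASK_ID = "<task-executable-id>"     # hypothetical task identifier

req = urllib.request.Request(
    f"https://api.{REGION}.cloud.talend.com/processing/executions",
    data=json.dumps({"executable": TASK_ID}).encode("utf-8"),  # assumed payload shape
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())  # a 2xx response means the run was accepted
```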
Recent versions of Qlik connectors have an out-of-the-box value of 255 for their DefaultStringColumnLength setting.
This means that, by default, any string containing more than 255 characters is truncated when imported from the database.
To import longer strings, specify a higher value for DefaultStringColumnLength.
This can be done in the connection definition, under Advanced Properties. For example, adding the advanced property DefaultStringColumnLength with a value of 4096 allows strings of up to 4,096 characters to be imported.
The maximum value that can be set is 2,147,483,647.
How do I understand which file the data ID in the capacity consumption report refers to?
In the Consumption Report app, we can only view the Data File ID of a data set that generated Data for Analysis. The file name is not shown.
There are two possible ways to achieve this. One is to directly leverage the API, the other is to use qlik-cli.
https://TENANT.REGION.qlikcloud.com/api/v1/data-files/DATA-FILEID
https://TENANT.REGION.qlikcloud.com/api/v1/data-files/59c41e71-e6b1-4d9e-8334-da48fd2f91ba
For information on how to get started with Qlik-cli, see: Qlik-cli overview.
qlik data-file get DATA-FILEID
qlik data-file get 59c41e71-e6b1-4d9e-8334-da48fd2f91ba
Tip!
To extract all file IDs and related file names, type the following into the Qlik-CLI command prompt:
qlik data-file ls
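If you prefer the REST API, the hedged sketch below resolves a single Data File ID to its file name using the data-files endpoint shown above. The tenant URL, API key, and the name of the response field are assumptions; inspect the JSON returned by your tenant.

```python
# Illustrative sketch: look up the file name behind a Data File ID via the
# /api/v1/data-files endpoint referenced above. Tenant URL, API key, and the
# "name" response field are assumptions; check the payload returned by your tenant.
import json
import urllib.request

TENANT = "https://TENANT.REGION.qlikcloud.com"         # your tenant URL
API_KEY = "<your-api-key>"                             # hypothetical API key
DATA_FILE_ID = "59c41e71-e6b1-4d9e-8334-da48fd2f91ba"  # ID from the consumption report

req = urllib.request.Request(
    f"{TENANT}/api/v1/data-files/{DATA_FILE_ID}",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

with urllib.request.urlopen(req) as resp:
    data_file = json.loads(resp.read())

print(data_file.get("name"))  # assumed field holding the original file name
```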
This article explains how to extract changes from a Change Store and store them in a QVD by using a load script in Qlik Analytics.
The article also includes
This example will create an analytics app for Vendor Reviews. The idea is that you, as a company, are working with multiple vendors. Once a quarter, you want to review these vendors.
The example is simplified, but it can be extended with additional data for real-world examples or for other “review” use cases like employee reviews, budget reviews, and so on.
The app’s data model is a single table “Vendors” that contains a Vendor ID, Vendor Name, and City:
Vendors:
Load * inline [
"Vendor ID","Vendor Name","City"
1,Dunder Mifflin,Ghent
2,Nuka Cola,Leuven
3,Octan,Brussels
4,Kitchen Table International,Antwerp
];
The Write Table contains two data model fields: Vendor ID and Vendor Name. They are both configured as primary keys to demonstrate how this can work for composite keys.
The Write Table is then extended with three editable columns:
It is finally here: The first public iteration of the Log Analysis app. Built with love by Customer First and Support.
"With great power comes great responsibility."
Before you get started, a few notes from the author(s):
Chapters:
01:23 - Log Collector
02:28 - Qlik Sense Services
04:17 - How to load data into the app
05:42 - Troubleshooting poor response times
08:03 - Repository Service Log Level
08:35 - Transactions sheet
12:44 - Troubleshooting Engine crashes
14:00 - Engine Log Level
14:47 - QIX Performance sheets
17:50 - General Log Investigation
20:28 - Where to download the app
20:58 - Q&A: Can you see a log message timeline?
21:38 - Q&A: Is this app supported?
21:51 - Q&A: What apps are there for Cloud?
22:25 - Q&A: Are logs collected from all nodes?
22:45 - Q&A: Where is the latest version?
23:12 - Q&A: Are there NPrinting templates?
23:40 - Q&A: Where to download Qlik Sense Desktop?
24:20 - Q&A: Are logs from the Archived folder collected?
25:53 - Q&A: User app activity logging?
26:07 - Q&A: How to lower log file size?
26:42 - Q&A: How does the QRS communicate?
28:14 - Q&A: Can this identify a problem chart?
28:52 - Q&A: Will this app be in-product?
29:28 - Q&A: Do you have to use Desktop?
Qlik Sense Enterprise on Windows (all modern versions post-Nov 2019)
*It is best used in an isolated environment or via Qlik Sense Desktop. It can be very RAM and CPU intensive.
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Optimizing Performance for Qlik Sense Enterprise - Qlik Community - 1858594
How does Qlik Replicate convert DB2 commit timestamps to Kafka message payload, and why are we seeing a lag of several hours?
Qlik Talend Management Console logs do not show anything for the job, even though the job is finished.
Potential Checklist
active = false
The Remote Engine is responsible for sending the logs to Talend Management Console so that they appear in the Talend Management Console logs.
Problems with the Remote Engine may prevent it from handling the logs appropriately.
For example, there may be high memory usage on the Remote Engine or in the environment where the Remote Engine runs.
For more information about how to prevent sending logs to Talend Management Console, please refer to
Preventing the engines from sending logs to Talend Cloud | Qlik Talend Help
After upgrading from Java 8 to Java 17, you may encounter the following error message when attempting to open a Job in Talend Studio 8.0.1, even though the same Job opened normally before the upgrade.
JsonIoException setting field 'flags' on target: Property: null with value: {}
To resolve the issue, install Java 11 or upgrade Talend Studio to version 8.0.1-R2023-10 or later.
Java 17 is not a supported Java environment for Talend Studio versions earlier than 8.0.1-R2023-10. If your Talend Studio version is prior to 8.0.1-R2023-10, refer to Supported Java versions for launching Talend Studio | Qlik Help Documentation.
Qlik Geocoding operates using two QlikGeoAnalytics operations: AddressPointLookup and PointToAddressLookup.
Two frequently asked questions are:
The Qlik Geocoding add-on option requires an Internet connection. It is, by design, an online service. You will be using Qlik Cloud (https://ga.qlikcloud.com), rather than your local GeoAnalytics Enterprise Server.
See the online documentation for details: Configuring Qlik Geocoding.
To retrieve tasks with the status "Misfired" via the API, you can use the "/monitoring/observability/executions/search" endpoint described here:
#type_searchrequest | talend.qlik.dev
However, the EXECUTION_MISFIRED status is returned only if "exclude=TASK_EXECUTIONS_TRIGGERED_BY_PLAN" is set.
So, to return any plans or tasks that are misfired, send this filter in the request:
"filters": [ { "field": "status", "operator": "in", "value": [ "DEPLOY_FAILED", "EXECUTION_MISFIRED" ] } ]
Example
URL: https://api.<region>.cloud.talend.com/monitoring/observability/executions/search
{
"environmentId": "123456......",
"category": "ETL",
"filters": [
{
"field": "status",
"operator": "in",
"value": [
"DEPLOY_FAILED",
"EXECUTION_MISFIRED"
]
}
],
"limit": 50,
"offset": 0,
"exclude": "TASK_EXECUTIONS_TRIGGERED_BY_PLAN"
}
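For reference, a hedged sketch of sending this payload follows. The region and bearer token are placeholders; the body mirrors the example above.

```python
# Illustrative sketch: post the search payload above to the observability endpoint.
# Region and bearer token are placeholders; the payload mirrors the example above.
import json
import os
import urllib.request

REGION = "us"                               # replace with your region
TOKEN = os.environ["TALEND_TOKEN"]          # personal access or service account token

payload = {
    "environmentId": "123456......",        # placeholder environment ID from the example
    "category": "ETL",
    "filters": [
        {"field": "status", "operator": "in",
         "value": ["DEPLOY_FAILED", "EXECUTION_MISFIRED"]}
    ],
    "limit": 50,
    "offset": 0,
    "exclude": "TASK_EXECUTIONS_TRIGGERED_BY_PLAN",
}

req = urllib.request.Request(
    f"https://api.{REGION}.cloud.talend.com/monitoring/observability/executions/search",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {TOKEN}"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    print(json.dumps(json.loads(resp.read()), indent=2))
```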
Jira ID: SUPPORT-7251
This article outlines how to handle DDL changes on a SQL Server table as part of the publication.
The steps in this article assume you use the task's default settings: full load and apply changes are enabled, full load is set to drop and recreate target tables, and DDL Handling Policy is set to apply alter statements to the target.
To achieve something simple, such as increasing the length of a column (without changing the data type), run an ALTER TABLE command on the source while the task is running, and it will be pushed to the target.
For example: alter table dbo.address alter column city varchar(70)
To make more complicated changes to the table, such as:
Follow this procedure:
When connecting to Microsoft OneDrive using either Qlik Cloud Analytics or Qlik Sense Enterprise on Windows, shared files and folders are no longer visible.
While the endpoint may intermittently work as expected, it is in a degraded state until November 2026. See drive: sharedWithMe (deprecated) | learn.microsoft.com. In most cases, the API endpoint is no longer accessible due to the publicly documented degraded state.
Qlik is actively reviewing the situation internally (SUPPORT-7182).
However, given that the MS API endpoint has been deprecated by Microsoft, a Qlik workaround or solution is not certain or guaranteed.
Use a different type of shared storage, such as mapped network drives, Dropbox, or SharePoint, to name a few.
Microsoft deprecated the /me/drive/sharedWithMe API endpoint.
SUPPORT-7182
To check the location where your tasks run, open your task and refer to the right-hand side, where the artifact details can be found under Configuration.
In this instance, the Binary type is displayed as "Talend Runtime" because it is a REST type artifact, indicating that it will be deployed and executed on Talend Runtime.
If you are using a Remote Engine to execute your task, it will be displayed in the "Processor" section under Configuration. In this instance, it shows that Remote Engine version 2.13.13 will be used to run the task.