Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about Qlik products and solutions, covering scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us.
Log in to manage and track your active cases in the Case Portal. (click)
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible but there are some functional limitations that are not critical to daily operations.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Talend Administration Center (TAC) git and nexus passwords are not persisted. After every restart of the TAC services, the passwords have to be entered again.
A review of the Technical Log shows the following error:
The value can't be decrypted java.lang.IllegalArgumentException: Required master key master.key.<Date> not found
configuration.dbadmin.enable=true
The master keys may have become corrupted, possibly during migration to a patch. The master.key parameter is mandatory for encoding and decoding all sensitive information; if this parameter is missing, Talend Administration Center cannot work properly (the encryption algorithm changed between versions). So, first of all, rotate the master key.
For more information about how to clear the Talend Administration Center cache, see the following article:
How-to-clear-the-Talend-Administration-Center-TAC-cache
This article provides a guide on installing Qlik Replicate in a Docker image.
sudo rpm -qa | grep docker
sudo yum remove docker docker-client docker-client-latest docker-common docker-latest docker-latest-logrotate docker-logrotate docker-engine
sudo yum install -y yum-utils device-mapper-persistent-data lvm2
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
sudo yum install --allowerasing docker-ce docker-ce-cli containerd.io
sudo systemctl start docker
sudo systemctl enable docker
The /kit directory is a temporary folder used to store installation files. In the example below, the RPM file is areplicate-2024.5.0-357.x86_64.rpm. Once the installation is complete, the entire /kit folder can be safely deleted.
mkdir -p /kit/ar_docker
cd /kit
rpm2cpio areplicate-2024.5.0-357.x86_64.rpm | cpio -iv --make-directories --no-absolute-filenames -D ar_docker/ ./opt/attunity/replicate/addons/samples/docker/*
mv ./ar_docker/opt/attunity/replicate/addons/samples/docker/* ./ar_docker
rm -rf ./ar_docker/opt
cd /kit/ar_docker
cp ../areplicate-2024.5.0-357.x86_64.rpm .
./create-dockerfile.sh
Edit /kit/ar_docker/Dockerfile, changing:
RUN yum -y install /tmp/areplicate-*.rpm
to:
RUN systemd=no yum -y install /tmp/areplicate-*.rpm
NOTE!
The parameter systemd=no is used to resolve the following error, which you may hit during the docker build stage:
This rpm is not supported on your system (no systemctl), exiting. error: %prein(areplicate-2024.5.0-357.x86_64) scriptlet failed, exit status 43
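As a side note, the systemd=no prefix is ordinary shell syntax: it sets an environment variable for that single yum invocation only, and the RPM's pre-install scriptlet can read it to skip its systemctl checks. A minimal illustration of the mechanism (the echo is only a stand-in for the scriptlet):

```shell
# An env var assigned as a command prefix exists only for that one command;
# the RPM scriptlet reads it the same way during `yum install`.
systemd=no sh -c 'echo "scriptlet sees systemd=$systemd"'

# Afterwards the variable is not set in the calling shell:
echo "caller sees systemd=${systemd:-<unset>}"
```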
NOTE!
The password should not be empty and must be strong enough.
WARNING!
If you want to install ODBC drivers, please make sure the corresponding ODBC driver RPM files are ready in this folder. If you want to skip the ODBC driver installation for now, rename or delete the file "drivers" in this folder.
docker build --no-cache -t johnw/replicate:2024.5 .
where johnw/replicate:2024.5 is the image tag. Do not forget the trailing period "."
docker run -d --name ar --hostname cdc2 -e ReplicateRestPort=3552 -p 3552:3552 -v /dockermount/data/replicate/data:/replicate/data johnw/replicate:2024.5
Now Qlik Replicate is running in Docker and can be accessed from the Qlik Replicate Console GUI.
After renaming a group used in the Identity Provider (IdP) set up with Qlik Cloud Analytics, the old group name continues to be listed in Qlik Cloud.
Users can log in without issues using the updated information.
Use Qlik-CLI to remove the old group name.
Example command:
qlik group rm <groupId>
See Delete group by ID for details.
Qlik Cloud retrieves all its information on groups from the IdP (Identity Provider), but does not delete old groups. Old groups can be removed using the API.
Renaming a group will be perceived by Qlik Cloud as a new group having been created.
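If several stale groups linger after renames, the removal can be scripted. The sketch below assumes qlik-cli is configured for your tenant and that `qlik group ls` prints an ID/name listing; the IDs and the group name "OldTeamName" are made up, and a sample table stands in for the live command so the filtering step is visible end to end:

```shell
# Made-up sample of a group listing (columns: ID  NAME); in practice this
# would come from something like: groups=$(qlik group ls)
groups='64f1c2ab9e01  OldTeamName
64f1c2ab9e02  NewTeamName'

# Pick the ID of the stale (renamed-away) group by its old display name
stale_id=$(printf '%s\n' "$groups" | awk '$2 == "OldTeamName" {print $1}')
echo "$stale_id"

# Then remove it (commented out here since it needs a live tenant):
# qlik group rm "$stale_id"
```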
Can Qlik Cloud Analytics Community Sheets be unpublished using a Qlik Automate block or the Qlik REST API?
There is no unpublish sheet block in Qlik Automate, and the Qlik REST API does not offer a simple unpublish option either.
To note:
Starting from Qlik Replicate versions 2024.5 and 2024.11, Microsoft SQL Server 2012 and 2014 are no longer supported. Supported SQL Server versions include 2016, 2017, 2019, and 2022. For up-to-date information, see Support Source Endpoints for your respective version.
Attempting to connect to unsupported versions, both on-premise and cloud, can result in various errors.
Examples of reported Errors:
The system view sys.column_encryption_keys is only available starting from SQL Server 2016. Attempting to query this view on earlier versions results in errors.
Reference: sys.column_encryption_keys (Microsoft Docs)
Upgrade your SQL Server instances to a supported version (2016 or later) to ensure compatibility with Qlik Replicate 2024.5 and above.
00375940, 00376089
By default, Qlik Replicate translates an UPDATE operation on the source into an UPDATE on the target. However, in some scenarios, especially when a primary key column is updated, you may want to capture this change as a DELETE followed by an INSERT.
This behavior can be enabled in Qlik Replicate through a task setting called "DELETE and INSERT when updating a primary key column." For more details, refer to the Qlik Replicate User Guide: Miscellaneous tuning.
Consider the following Oracle source table example, where ID is a primary key and name is a non-primary key column:
This behavior is supported for the following types of targets:
00350953
When attempting to execute a Data Integration (DI) Job, the following error messages are encountered:
Execution failed :Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
The detailed log can be found in workspace\.metadata\.log:
!MESSAGE 2025-05-06 19:18:06,261 ERROR org.talend.commons.exception.CommonExceptionHandler - Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
!STACK 0
java.lang.NullPointerException: Cannot invoke "java.util.jar.Manifest.getMainAttributes()" because "manifest" is null
at org.talend.designer.runprocess.java.JavaProcessor.compareSapjco3Version(JavaProcessor.java:1803)
at org.talend.designer.runprocess.java.JavaProcessor.appendLibPath(JavaProcessor.java:1653)
at org.talend.designer.runprocess.java.JavaProcessor.getNeededModulesJarStr(JavaProcessor.java:1599)
at org.talend.designer.runprocess.java.JavaProcessor.getLibsClasspath(JavaProcessor.java:1343)
at org.talend.designer.runprocess.java.JavaProcessor.getCommandLine(JavaProcessor.java:1243)
at org.talend.designer.core.runprocess.Processor.getCommandLine(Processor.java:304)
at org.talend.designer.core.runprocess.Processor.getCommandLine(Processor.java:294)
The error signifies that the JAR file used in the Job lacks the necessary manifest meta-information; specifically, the MANIFEST.MF file is absent from the JAR.
Navigate to the <Studio_installation_Home>\configuration\.m2 folder, locate the JAR file (specifically 'sapjco3.jar' in this instance), and unzip it to confirm the presence of the MANIFEST.MF file. If the MANIFEST.MF file is absent, it indicates an issue with the JAR file. Delete the directory containing this JAR file and reinstall the correct JAR file within Talend Studio. For further details, refer to Installing external modules to Talend Studio.
Uploading a file to a dataset using the Qlik Cloud Analytics Catalog only lists the Personal Space. No other Spaces are available.
This persists even if the user has all the correct Space permissions or owns the missing Shared Space.
Example:
The feature allowing the Date format to be changed from the Data Manager is inaccessible from the Qlik Sense Enterprise on Windows November 2024 IR release to the SR 10 release.
Upgrade to the latest version of Qlik Sense Enterprise on Windows.
If an upgrade is not possible, change the Date Format from the Data Load Editor.
Example:
Date(Date#(YearMonth, 'YYYYMM'), 'YYYY/MM') AS YearMonthFormatChanged
Qlik Sense Enterprise on Windows November 2024 SR 11 and higher.
Product Defect ID: SUPPORT-1846
The following release notes cover the Qlik PostgreSQL installer (QPI) version 1.2.0 to 2.0.0.
Content
Improvement / Defect | Details
SHEND-2273 |
QCB-28706 | Upgraded PostgreSQL version to 14.17 to address the pg_dump vulnerability (CVE-2024-7348).
SUPPORT-335 | Upgraded PostgreSQL version to 14.17 to address the libcurl vulnerability (CVE-2024-7264).
QB-24990 | Fixed an issue with upgrades of PostgreSQL if Qlik Sense was installed in a custom directory, such as D:\Sense.
Improvement / Defect | Details |
SHEND-1359, QB-15164: Add support for encoding special characters for Postgres password in QPI | If the super user password is set to have certain special characters, QPI did not allow upgrading PostgreSQL using this password. The workaround was to set a different password, use QPI to upgrade the PostgreSQL database and then reset the password after the upgrade. This workaround is not required anymore with 1.4.0 QPI, as 1.4.0 supports encoded passwords. |
SHEND-1408: Qlik Sense services were not started again by QPI after the upgrade | QPI failed to restart Qlik services after upgrading the PostgreSQL database. This has been fixed now. |
SHEND-1511: Upgrade not working from 9.6 database | In QPI 1.3.0, upgrade from PostgreSQL 9.6 version to 14.8 was failing. This issue is fixed in QPI 1.4.0 version. |
QB-21082: Upgrade from May 23 Patch 3 to August 23 RC3 fails when QPI is used before attempting upgrade. QB-20581: May 2023 installer breaks QRS if QPI was used with a patch before. | Using QPI on a patched Qlik Sense version caused issues in the earlier version. This is now supported.
Contents:
Note: Many of the file level permissions would ordinarily be inherited from membership to the Local Administrators group. For information on non-Administrative accounts running Qlik Sense Services see Changing the user account type to run the Qlik Sense services on an existing site.
Record the Share Path. Navigate in the Qlik Management Console (QMC) to Service Cluster and record the Root Folder.
How to change the share path in Qlik Sense (Service Cluster)
A Talend ESB Route fails with the error below when executed in Talend Studio:
org.apache.camel.processor.errorhandler.DefaultErrorHandler- Failed delivery for (MessageId: xx on ExchangeId: xx).
Exhausted after delivery attempt: 1 caught: org.apache.camel.TypeConversionException: Error during type conversion from type: org.apache.camel.component.file.GenericFile to the required type: java.lang.String with value GenericFile[myfile] due to java.nio.charset.MalformedInputException: Input length = 1
The Route Design looks like:
cFile →cProcessor (with the code exchange.getIn().getBody(String.class) in it)
From the above error message, you can see that the "java.nio.charset.MalformedInputException: Input length = 1" error arises in Java when the program attempts to decode a character sequence using a character encoding that does not match the actual encoding of the input data. Specifically, "Input length = 1" indicates that a single byte or character is causing the decoding issue.
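The mismatch is easy to reproduce outside Camel. The byte 0xE9 is 'é' in ISO-8859-1 but is an incomplete sequence in UTF-8, so decoding it as UTF-8 fails; this is the same class of failure behind the exception above. A sketch using iconv (not the actual Camel code path):

```shell
# Decoding 0xE9 (octal \351) as ISO-8859-1 and re-encoding to UTF-8 works,
# yielding the two-byte UTF-8 sequence c3 a9:
printf '\351' | iconv -f ISO-8859-1 -t UTF-8 | od -An -tx1

# Decoding the same lone byte as if it were UTF-8 fails, analogous to
# "MalformedInputException: Input length = 1":
# printf '\351' | iconv -f UTF-8 -t ISO-8859-1   # iconv: illegal input sequence
```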
AWS System Manager (SM), an AWS service, can be used to view and control infrastructures on AWS. It offers automation documents to simplify common maintenance and deployment tasks of AWS resources.
AWS SM consists of a collection of capabilities related to automation, such as infrastructure maintenance and deployment tasks of AWS resources, as well as some related to Application Management and Configuration. Among them is a capability called Parameter Store.
AWS Systems Manager (SM) Parameter Store provides secure, hierarchical storage for configuration data management and secrets management.
It allows you to store data such as passwords, database strings, and license codes as parameter values.
Parameter Store offers the following benefits and features for Talend Jobs.
Secured, highly scalable, hosted service with no servers to manage, compared with setting up a dedicated database to store Job context variables.
Control access at granular levels: specify who can access a specific parameter or set of parameters (for example, DB connection) at the user or group level. Using IAM roles, you can restrict access to parameters, which can have nested paths that can be used to define ACL-like access constraints. This is important for the control access of Production environment parameters.
Audit access: track the last user who created or updated a specific parameter value.
Encryption of data at rest and in transit: parameter values can be stored as plaintext (unencrypted data) or ciphertext (encrypted data). For encrypted values, AWS Key Management Service (KMS) is used behind the scenes. Hence, Talend context variables of Password type can be stored and retrieved securely without implementing a dedicated encryption/decryption process.
Another benefit of the AWS SM Parameter Store is its usage cost.
AWS SM Parameter Store consists of standard and advanced parameters.
Standard parameters are available at no additional charge. The values are limited to 4 KB size, which should cover the majority of Talend Job use cases.
With advanced parameters (8 KB size), you are charged based on the number of advanced parameters stored each month and per API interaction.
Assume you have 5,000 parameters, of which 500 are advanced. Assume that you have enabled higher throughput limits and interact with each parameter 24 times per day, equating to 3,600,000 interactions per 30-day month. Because you have enabled higher throughput, your API interactions are charged for standard and advanced parameters. Your monthly bill is the sum of the cost of the advanced parameters and the API interactions, as follows:
Cost of 500 advanced parameters = 500 * $0.05 per advanced parameter = $25
Cost of 3.6M API interactions = 3.6M * $0.05 per 10,000 interactions = $18
Total monthly cost = $25 + $18 = $43
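The arithmetic above can be checked with plain shell integer math (rates taken from the worked example; verify current AWS pricing before budgeting):

```shell
advanced_params=500        # advanced parameters stored per month
api_calls=3600000          # API interactions per 30-day month

adv_cost=$(( advanced_params * 5 / 100 ))       # $0.05 per advanced parameter
api_cost=$(( api_calls / 10000 * 5 / 100 ))     # $0.05 per 10,000 interactions
total=$(( adv_cost + api_cost ))

echo "advanced: \$$adv_cost, api: \$$api_cost, total: \$$total"
```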
For more information on pricing, see the AWS Systems Manager pricing web site.
A Parameter Store parameter is any piece of configuration data, such as a password or connection string, that is saved in the Store. You can centrally and securely reference this data in a Talend Job.
The Parameter Store provides support for three types of parameters:
In Talend, context variables are stored as a list of key-value pairs independent of the physical storage (Job, file, or database). Managing numerous parameters as a flat list is time-consuming and prone to errors. It can also be difficult to identify the correct parameter for a Talend Project or Job. This means you might accidentally use the wrong parameter, or you might create multiple parameters that use the same configuration data.
Parameter Store allows you to use parameter hierarchies to help organize and manage parameters. A hierarchy is a parameter name that includes a path that you define by using forward slashes (/).
The following example uses three hierarchy levels in the name:
/Dev/PROJECT1/max_rows
Parameter Store can be accessed from the AWS Console, the AWS CLI, or the AWS SDKs, including Java. Talend Studio leverages the AWS Java SDK to connect to numerous Amazon services but, as yet, not to AWS Systems Manager.
This initial implementation solely uses the current capabilities of Studio, such as Routines and Joblets.
A future version will leverage the Talend Component Development Kit (CDK) to build a dedicated connector for AWS System Manager.
The connector was developed in Java using the AWS SDK and exported as an UberJar (a single JAR with all of its dependencies embedded in it).
The AWSSSMParameterStore-1.0.0.jar file (attached to this article) is imported into the Studio local Maven Repository and then used as a dependency in the AwsSSMParameterStore Talend routine.
The routine provides a set of high-level APIs/functions of the Parameter Store for Talend Jobs.
package routines;

import java.util.Map;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

import com.talend.ps.engineering.AWSSMParameterStore;

public class AwsSSMParameterStore {

    private static final Log LOG = LogFactory.getLog(AwsSSMParameterStore.class);

    private static AWSSMParameterStore paramsStore;

    /*
     * init
     *
     * Create an AWSSMParameterStore client based on the credentials parameters.
     * Follows the "Default Credential Provider Chain".
     * See https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html
     *
     * Parameters:
     *   accessKey : (Optional) AWS Access Key
     *   secretKey : (Optional) AWS Secret Key
     *   region    : (Optional) AWS Region
     *
     * Return:
     *   Boolean : False if invalid combination of parameters
     */
    public static boolean init(String accessKey, String secretKey, String region) { ... }

    /*
     * loadParameters
     *
     * Retrieve all the parameters recursively with the path as a prefix in their name.
     *
     * Parameters:
     *   path : Parameter path prefix for the parameters
     *
     * Return:
     *   Map of name, value pairs of parameters
     */
    public static Map<String, String> loadParameters(String path) { ... }

    /*
     * saveParameter
     *
     * Save a parameter name and value in the Parameter Store.
     *
     * Parameters:
     *   name    : Name of the parameter
     *   value   : Value of the parameter
     *   encrypt : Encrypt the value in the Parameter Store
     *
     * Return:
     *   Boolean : False if the save failed
     */
    public static boolean saveParameter(String name, Object value, boolean encrypt) { ... }
}
The init function creates the connector to AWS SSM using the AWS Default Credential Provider Chain.
The loadParameters function connects to the Parameter Store and retrieves a set/hierarchy of parameters prefixed with a specific path (see the naming convention for the parameters below).
The result is returned as a Map key-value pair.
Important: In the returned Map, the key represents only the last part of the parameter name path. If the parameter name is: /Dev/PROJECT1/max_rows, the returned Map key for this parameter is max_rows.
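The same trimming, expressed in shell for illustration (the routine does this in Java; the parameter name comes from the example above):

```shell
name="/Dev/PROJECT1/max_rows"
key="${name##*/}"     # strip everything up to and including the last '/'
echo "$key"           # -> max_rows
```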
The saveParameter function allows you to save a context parameter name and value (derived from a context variable) to the Parameter Store.
Two Joblets were developed to connect to the AWS Parameter Store through the routine. One is designed to initialize the context variables of a Job using the parameters from the AWS Parameter Store. The other serves as a utility for a Job to store its context variables in the Parameter Store.
Joblet: SaveContextVariableToAwsSSMParameterStore
The Joblet uses a tContextDump component to generate the context variables dataset with the standard key-value pair schema.
The tJavaFlex component is used to connect to the Parameter Store and save the context variables as parameters with a specific naming convention.
Parameter hierarchies naming convention for Talend context variables
In the context of context variables, the choice is to use an optional root prefix, /talend/, to avoid any potential collision with existing parameter names.
The prefix is appended with a string representing a runtime environment, for example, dev, qa, or prod. This mimics the concept of the context environment found in the Job Contexts:
The parameter name is then appended with the name of the Talend Project (which is extracted from the Job definition) and, finally, the name of the variable.
Parameter naming convention:
/talend/<environment name>/<talend project name>/<context variable name>
Example Job: job1 with a context variable ctx_var1 in a Talend Project PROJECT1.
The name of the parameter for the ctx_var1 variable in a development environment (identified by dev), is:
/talend/dev/PROJECT1/ctx_var1
For a production environment, prod, the name is:
/talend/prod/PROJECT1/ctx_var1
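A tiny helper makes the convention concrete (environment, project, and variable are the inputs; the names below come from the examples above):

```shell
# Build a Parameter Store name following /talend/<env>/<project>/<variable>
param_name() {
  printf '/talend/%s/%s/%s\n' "$1" "$2" "$3"
}

param_name dev  PROJECT1 ctx_var1   # -> /talend/dev/PROJECT1/ctx_var1
param_name prod PROJECT1 ctx_var1   # -> /talend/prod/PROJECT1/ctx_var1
```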
One option is to use the Job name as well in the hierarchy of the parameter name:
/talend/prod/PROJECT1/job1/ctx_var1
However, because Talend Metadata connections, Context Groups, and other items are shared across multiple Jobs, using the Job name would result in multiple references to the same context variable in the Parameter Store.
Moreover, if a value in the Context Group changes, it would need to be updated in all the parameters for that context variable, which defeats the purpose of the context group.
Joblet context variables
The Joblet uses a dedicated context group specific to the interaction with the Parameter Store.
AWS Access & Secret keys to connect to AWS. As mentioned earlier, the routine leverages the AWS Default Credential Provider Chain. If these variables are not initialized, the SDK looks for environment variables, the ~/.aws/credentials file (under the user directory on Windows), or EC2 roles to infer the right credentials.
AWS region of the AWS SM Parameter Store.
Parameter Store prefix and environment used in the parameter path as described above in the naming convention.
Joblet: LoadContextVariablesFromAwsSSMParmeterStore
The second Joblet is used to read parameters from The Parameter Store and update the Job context variables.
The Joblet uses a tJavaFlex component to connect to SSM Parameter Store, leveraging the AwsSSMParameterStore.loadParameters routine function described above. It retrieves all the parameters based on the prefix path (see the defined naming convention above).
The tContextLoad component uses the tJavaFlex output key-value pair dataset to overwrite the default values of the context variables.
Joblet context variables
The load Joblet uses the same context group as the save counterpart.
The sample Talend Job generates a simple people dataset using the tRowGenerator (first name, last name, and age), applies some transformations, and segregates the rows by age to create two distinct datasets: one for adults (age > 18) and one for teenagers.
The two datasets are then inserted into a MySQL database in their respective tables.
The Job contains a mix of context variables, some are coming from a group defined for the MySQL Metadata Connection and some are specific to the Job: max_rows, table_adults, and table_teenagers.
The first step is to create all the parameters in the Parameter Store for the Job context variables. This can be done using the AWS console or through the AWS CLI, but those methods can be time-consuming and error-prone.
Instead, use the dedicated SaveContextVariableToAwsSSMParameterStore Joblet.
Drag and drop the Joblet onto the Job canvas. There is no need to connect it to the rest of the Job components. It lists all the context variables, connects to the AWS SM Parameter Store, creates the associated parameters, and stops the Job.
When the Job is executed, the System Manager Parameter Store web console should list the newly created parameters.
On the AWS console, the first column is not resizable; to see the full name of a parameter, you'll need to hide some of the columns.
You can also click a specific parameter to see the details.
For context variables defined with a Password type, the associated parameter is created as SecureString, which allows the value to be encrypted at rest in the store.
Regarding security, IAM access control can be leveraged to restrict access to a specific operations team, or to restrict access to a specific set of parameters such as production parameters (/talend/prod/*); developers will have access solely to the dev environment parameters, for example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        // Allows to decrypt secret parameters
        "kms:Decrypt",
        "ssm:DescribeParameters"
      ],
      "Resource": "*"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "ssm:PutParameter",
        "ssm:LabelParameterVersion",
        "ssm:DeleteParameter",
        "ssm:GetParameterHistory",
        "ssm:GetParametersByPath",
        "ssm:GetParameters",
        "ssm:GetParameter",
        "ssm:DeleteParameters"
      ],
      // Grant access only to dev parameters
      "Resource": "arn:aws:ssm:AWS-Region:AWS-AccountId:parameter/talend/dev/*"
    }
  ]
}
In the context of a Talend Cloud Job/Task, the context variables don't need to be exported as connections or resources for Talend Cloud as they are initialized from the AWS Parameter Store.
You only need to create a connection for the AWS SM Parameter Store credentials and config parameters.
The context group for the AWS SM Parameter Store is externalized as a Talend Cloud Custom Connection because, as yet, Talend Cloud doesn't have a native connector for AWS Systems Manager.
In Studio, you create a new Talend Cloud task by publishing the Job artifact to the cloud.
You'll then add the custom connection for AWS SM.
The additional context variables are exposed as advanced parameters, including the database connection parameters that are initialized from the Parameter Store.
A successful task execution on a Cloud or Remote Engine means that the Job can connect to AWS SM, retrieve the parameters based on the naming convention set above, and initialize the corresponding context variables, allowing the Job to connect to the MySQL database and create the requested tables.
This article provides a comprehensive guide to efficiently install the PostgreSQL ODBC client on Linux for a PostgreSQL target endpoint.
If PostgreSQL serves as a Replicate source endpoint, see: How to Install PostgreSQL ODBC client on Linux for PostgreSQL Source Endpoint
rpm -ivh postgresql13-libs-13.2-1PGDG.rhel8.x86_64.rpm
rpm -ivh postgresql13-odbc-13.02.0000-1PGDG.rhel8.x86_64.rpm
rpm -ivh postgresql13-13.2-1PGDG.rhel8.x86_64.rpm
export LD_LIBRARY_PATH=/usr/pgsql-13/lib:$LD_LIBRARY_PATH
rpm -ivh unixODBC-2.3.7-1.el8.x86_64.rpm
[PostgreSQL]
Description = ODBC for PostgreSQL
Driver = /usr/lib/psqlodbcw.so
Setup = /usr/lib/libodbcpsqlS.so
Driver64 = /usr/pgsql-13/lib/psqlodbcw.so
Setup64 = /usr/lib64/libodbcpsqlS.so
FileUsage = 1
[pg15]
Driver = /usr/pgsql-13/lib/psqlodbcw.so
Database = targetdb
Servername = <targetDBHostName or IP Address>
Port = 5432
UserName = <PG User Name>
Password = <PG user's Password>
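Before testing connectivity (for example with unixODBC's isql: `isql -v pg15 <user> <password>`), it can help to sanity-check that the DSN section actually exists in odbc.ini. A self-contained sketch follows; it writes a sample file to a temp path purely for illustration — point the grep at your real /etc/odbc.ini instead:

```shell
tmp_ini=$(mktemp)
cat > "$tmp_ini" <<'EOF'
[pg15]
Driver = /usr/pgsql-13/lib/psqlodbcw.so
Database = targetdb
EOF

# Count [pg15] section headers; 1 means the DSN is defined exactly once
dsn_count=$(grep -c '^\[pg15\]' "$tmp_ini")
echo "pg15 sections found: $dsn_count"

rm -f "$tmp_ini"
```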
A task being run from the Qlik Sense Management Console fails with the error:
App already updating
The issue is typically only seen for specific apps and cannot be observed with new apps.
Restart all Qlik Sense services on the Central node. See Manual Start and Stop order of Qlik Sense services for details.
The task or app is queued within the manager scheduler but has not completed successfully.
A Managed Space member is not able to create new report tasks even with the correct space permissions set. The Create button is greyed out and cannot be clicked. The minimal managed space permissions are set:
Ensure you do not have any third-party extensions running in the browser, such as AdBlock or similar. Disable or remove them, then verify that the Create Report button is available again.
Adblock has been found to negatively interact with the Qlik Reporting create task button and other Qlik Reporting browser elements by aggressively blocking them.
Configuring a SharePoint connection fails when attempting to save the token. The error displayed is:
User: Error getting user info!
From the Qlik Web Connectors stand-alone site:
“Getting user info” is a request to https://graph.microsoft.com/v1.0/me. The endpoint was not reachable.
The QlikView Multi-box object does not show all expected results when opening from the QlikView AccessPoint.
Example:
Searching for *abc returns four results, while there are five matches in the data.
This may occur after an upgrade (such as upgrading QlikView from an earlier release to 12.90).
Review your browser's extensions. A known root cause for this issue is a custom browser extension that modifies a page's CSS (Cascading Style Sheet), causing the results to be hidden behind the search box.
To resolve this, disable or otherwise modify the custom extension.
SUPPORT-722
The following error (C) is shown after successfully creating a Jira connection string and selecting a Project/key (B) from Select data to load (A):
Failed on attempt 1 to GET. (The remote server returned an error; (404).)
The error occurs when connecting to JIRA Server, but not to JIRA Cloud.
Tick the Use legacy search API checkbox. This is switched off by default.
A Use legacy search API option is not present in Qlik Sense On-Premise. To resolve the issue, manually add useLegacySearchAPI='true' in the generated script. This is required when using both Issues and CustomFieldsForIssues tables.
Example:
[Issues]:
LOAD key as [Issues.key],
fields_summary as [Issues.fields_summary];
SELECT key,
fields_summary
FROM Issues
WITH PROPERTIES (
projectIdOrKey='CL',
createdAfter='',
createdBefore='',
updatedAfter='',
updatedBefore='',
customFieldIds='',
jqlQuery='',
maxResults='4',
useLegacySearchApi='true'
);
Connections to JIRA Server use the legacy API.
SUPPORT-3600
CVE-2025-29927 is a critical authorization bypass vulnerability in Next.js, a popular React framework. Specifically, it allows attackers to circumvent security checks within a Next.js application if those checks are performed using middleware.
Is Qlik Sense Enterprise on Windows affected by this Security Vulnerability CVE-2025-29927?
Qlik Sense Enterprise on Windows is not affected by this Security Vulnerability CVE-2025-29927.
Next.js is not used in any of the on-premise Qlik Sense core services, such as the QMC or Hub.