Featured Content
- How to contact Qlik Support
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
- Support and Professional Services: who to contact when
- Qlik Support: How to access the support you need
- 1. Qlik Community, Forums & Knowledge Base
- The Knowledge Base
- Blogs
- Our Support programs:
- The Qlik Forums
- Ideation
- How to create a Qlik ID
- 2. Chat
- 3. Qlik Support Case Portal
- Escalate a Support Case
- Phone Numbers
- Resources
Support and Professional Services: who to contact when
We're happy to help! Here's a breakdown of resources for each type of need.
Support
Reactively fixes technical issues as well as answers narrowly defined specific questions. Handles administrative issues to keep the product up-to-date and functioning.
- Error messages
- Task crashes
- Latency issues (due to errors or 1-1 mode)
- Performance degradation without config changes
- Specific questions
- Licensing requests
- Bug Report / Hotfixes
- Not functioning as designed or documented
- Software regression
Professional Services (*)
Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
- Deployment Implementation
- Setting up new endpoints
- Performance Tuning
- Architecture design or optimization
- Automation
- Customization
- Environment Migration
- Health Check
- New functionality walkthrough
- Realtime upgrade assistance
(*) reach out to your Account Manager or Customer Success Manager
Qlik Support: How to access the support you need
1. Qlik Community, Forums & Knowledge Base
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
The Knowledge Base
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
- Go to the Official Support Articles Knowledge base
- Type your question into our Search Engine
- Need more filters?
- Filter by Product
- Or switch tabs to browse content in the global community, on our Help Site, or even on our YouTube channel
Blogs
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about product and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Our Support programs:
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free webinar to facilitate knowledge sharing, held on a monthly basis.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
The Qlik Forums
- Quick, convenient, 24/7 availability
- Monitored by Qlik Experts
- New releases publicly announced within Qlik Community forums
- Local language groups available
Ideation
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
How to create a Qlik ID
Get the full value of the community.
Register a Qlik ID:
- Go to register.myqlik.qlik.com
If you already have an account, please see How To Reset The Password of a Qlik Account for help using your existing account.
- You must enter your company name exactly as it appears on your license or there will be significant delays in getting access.
- You will receive a system-generated email with an activation link for your new account. NOTE: this link will expire after 24 hours.
If you need additional details, see: Additional guidance on registering for a Qlik account
If you encounter problems with your Qlik ID, contact us through Live Chat!
2. Chat
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
- Answer common questions instantly through our chatbot
- Have a live agent troubleshoot in real time
- For items that require further investigation, we will create a case on your behalf through step-by-step intake questions.
3. Qlik Support Case Portal
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
Your advantages:
- Self-service access to all incidents so that you can track progress
- Option to upload documentation and troubleshooting files
- Option to include additional stakeholders and watchers to view active cases
- Follow-up conversations
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Problem Type
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
Priority
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical in the daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
Severity
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
Escalate a Support Case
If you require a support case escalation, you have two options:
- Request to escalate within the case, mentioning the business reasons.
To escalate a support incident successfully, mention your intention to escalate in the open support case. This will begin the escalation process.
- Contact your Regional Support Manager
If more attention is required, contact your regional support manager. You can find a full list of regional support managers in the How to escalate a support case article.
Phone Numbers
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
- Qlik Data Analytics: 1-877-754-5843
- Qlik Data Integration: 1-781-730-4060
- Talend AMER Region: 1-800-810-3065
- Talend UK Region: 44-800-098-8473
- Talend APAC Region: 65-800-492-2269
Resources
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Recent Documents
- Qlik Sense Enterprise on Windows: Task errors out in QMC with App already updating
A task being run from the Qlik Sense Management Console fails with the error:
App already updating
The issue is typically only seen for specific apps and cannot be observed with new apps.
Resolution
Restart all Qlik Sense services on the Central node. See Manual Start and Stop order of Qlik Sense services for details.
Cause
The task or app is queued within the manager schedule but has not completed successfully.
Environment
- Qlik Sense Enterprise on Windows
- Qlik Reporting Create Report task button is greyed out
A Managed Space member is not able to create new report tasks even with the correct space permissions set. The Create button is greyed out and cannot be clicked. The minimal managed space permissions are set:
- Can View
- Can Manage
- Can Operate
Resolution
Ensure you do not have any 3rd party extensions running in the browser, such as AdBlock or similar. Disable or remove them and verify the Create Report button is once again available.
Cause
Adblock has been found to negatively interact with the Qlik Reporting create task button and other Qlik Reporting browser elements by aggressively blocking them.
Environment
- Qlik Cloud Analytics
- Qlik Reporting
- Qlik Office 365 SharePoint Connector User: Error getting user info!
Configuring a SharePoint connection fails when attempting to save the token. The error displayed is:
User: Error getting user info!
Resolution
From the Qlik Web Connectors stand-alone site:
- Go to the Connector
- Switch to the About tab
- Select Permissions
This section will list the API endpoints that need to be reached. Verify they are reachable and configure firewall exceptions where necessary.
Note: For this connector, the endpoints are https port 443.
Cause
“Getting user info” is a request to https://graph.microsoft.com/v1.0/me. The endpoint was not reachable.
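If you want to confirm reachability of that endpoint from the server yourself, a quick probe such as the hedged sketch below can help. This is a hypothetical standalone check, not part of Qlik Web Connectors; any tool that can issue an HTTPS request, such as curl or a browser, works equally well.

// Hypothetical reachability probe for the Microsoft Graph endpoint used by the connector.
// An HTTP 401 (unauthorized) response still proves the endpoint is reachable over port 443;
// a timeout or UnknownHostException points to a blocked or unresolvable endpoint.
import java.net.HttpURLConnection;
import java.net.URL;

public class GraphReachabilityCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://graph.microsoft.com/v1.0/me");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(10_000);
        conn.setReadTimeout(10_000);
        conn.setRequestMethod("GET");
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}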
Environment
- Qlik Web Connectors
- QlikView Multi-Box not showing all results
The QlikView Multi-box object does not show all expected results when opening from the QlikView AccessPoint.
Example:
Searching for *abc returns four results, while there are five matches in the data.
This may occur after an upgrade (such as upgrading QlikView from an earlier release to 12.90).
Resolution
Review your browser's extensions. A known root cause for this issue is a custom browser extension that modifies a page's CSS (Cascading Style Sheet), which can result in the results being hidden by the search box.
To resolve this, disable or otherwise modify the custom extension.
Internal Investigation ID(s)
SUPPORT-722
Environment
- QlikView
- Qlik Sense and the JIRA Connector and JIRA Server: Error when displaying "Projects"
The following error (C) is shown after successfully creating a Jira Connection string and selecting a Project/key (B) from select data to load (A):
Failed on attempt 1 to GET. (The remote server returned an error; (404).)
The error occurs when connecting to JIRA server, but not to JIRA Cloud.
Resolution
Qlik Cloud Analytics
Tick the Use legacy search API checkbox. This is switched off by default.
Qlik Sense Enterprise on Windows
A Use legacy search API option is not present in Qlik Sense On-Premise. To resolve the issue, manually add useLegacySearchAPI='true' in the generated script. This is required when using both Issues and CustomFieldsForIssues tables.
Example:
[Issues]:
LOAD
    key as [Issues.key],
    fields_summary as [Issues.fields_summary];
SELECT
    key,
    fields_summary
FROM Issues
WITH PROPERTIES (
    projectIdOrKey='CL',
    createdAfter='',
    createdBefore='',
    updatedAfter='',
    updatedBefore='',
    customFieldIds='',
    jqlQuery='',
    maxResults='4',
    useLegacySearchApi='true'
);
Cause
Connections to JIRA Server use the legacy API.
Internal Investigation ID(s)
SUPPORT-3600
Environment
- Qlik Cloud Analytics
- Qlik Sense Enterprise on Windows
- Qlik Sense Service Account requirements and how to change the account
This article will outline how to successfully change the service account running Qlik Sense.
Contents:
- Account Requirements: What the account needs access to.
- Prep work
- Changing Qlik Sense dependencies
- Change the service account
- External Dependencies
- Video Demonstration
- Related Content
Account Requirements: What the account needs access to.
- Certificates
- Access to the certificate(s) for the site
- Files and file shares
- Access to the installation path for Qlik Sense
- Access to %ProgramData%
- Access to C:\Program Files\Qlik
- Access to the Service Cluster share
- Access to external systems as data sources, e.g.
- Databases
- UNC shares to QVDs, CSVs, etc
Note: Many of the file-level permissions would ordinarily be inherited from membership in the Local Administrators group. For information on non-administrative accounts running the Qlik Sense services, see Changing the user account type to run the Qlik Sense services on an existing site.
Prep work
Record the Share Path. Navigate in the Qlik Management Console (QMC) to Service Cluster and record the Root Folder.
Changing Qlik Sense dependencies
- Stop all Qlik Sense services
- Ensure permissions on the Program Files path (this should be provided by Local Administrator rights):
- Navigate to the installation path (default: C:\Program Files\Qlik)
- Select the Sense folder > Right Click > Properties > Security > Edit > Add
- Lookup the new service account
- Ensure that the account has Full control over this folder
- Ensure permissions on the %ProgramData% path (this should be provided by Local Administrator rights):
- Navigate to the installation path (default: C:\ProgramData\Qlik)
- Select the Sense folder > Right Click > Properties > Security > Edit > Add
- Lookup the new service account
- Ensure that the account has Full control over this folder
- Ensure access to the certificates used by Qlik Sense
- Start > MMC > File > Add/Remove Snap-In > Certificates > Computer Account > Finish
- Go into Certificates (Local Computer) > Personal > Certificates
- For the Qlik CA server certificate (under Certificates (Local Computer) > Personal > Certificates)
- Right Click on the Server Certificate > All Tasks > Manage Private Keys > Ensure that the new service account has control
- If using a third party certificate, do the same
- Start > MMC > File > Add/Remove Snap-In > Certificates > Computer Account > Finish
- Ensure access to the Service Cluster path used by Qlik Sense
- Start > Computer Management > Shared Folders > Shares > Select the Share path
- Right click on the Share Path > Properties > Share Permissions > Add the new service account to have full control
- Open Windows File Explorer and navigate to the folder (e.g. C:\Share) > Right click on the folder > Security > Edit > Add the new service account to have full control
- Ensure membership in the Local Groups that Qlik Sense requires:
- Start > Computer Management
- Navigate to Local Users and Groups > Local Groups
- Add the new service account as a member of:
- Administrators (if using this configuration option)
- Performance Monitor Users
- Qlik Sense Service Users
Change the service account
- Swap the account for all Qlik Services except the Qlik Sense Repository Database Service.
- Open the Windows Services Console
- Locate the services
- One by one, open the Properties of each service and change the account over using the Windows Services console
- Start all Qlik Sense Services
- Access the QMC to validate functionality, preferably as a previously configured RootAdmin
- Access the Data Connections section of the QMC
- Toggle the User ID field and change the data connections used by the License and Operations Monitor apps to use the new user ID and password:
- Add the RootAdmin role to the new service account*
- QMC > Users
- Filter on the new UserID > Edit
- Add RootAdmin role
*If this account does not yet exist in Qlik Sense, connect to the Hub or QMC with the new account first so that it becomes visible under QMC > Users.
- Execute the License Monitor reload task
- Inspect the configured User Directory Connectors and change the User ID and password combination if previously configured.
External Dependencies
- Go into the QMC > Data Connections section and inspect all Folder data connections to determine all network shares that the service account needs access to. Either change them yourself or alert the necessary teams to provide both Share and NT level access to these shares.
- Inspect all Data Connections and ensure that none use the old Service account and password. Follow up with necessary teams to provide access to data sources that used the old credentials.
Video Demonstration
Related Content
How to change the share path in Qlik Sense (Service Cluster)
- Qlik Sense Enterprise on Windows and Security Vulnerability CVE-2025-29927
CVE-2025-29927 is a critical authorization bypass vulnerability in Next.js, a popular React framework. Specifically, it allows attackers to circumvent security checks within a Next.js application if those checks are performed using middleware.
Is Qlik Sense Enterprise on Windows affected by this Security Vulnerability CVE-2025-29927?
Resolution
Qlik Sense Enterprise on Windows is not affected by this Security Vulnerability CVE-2025-29927.
Next.js is not used in any of the on-premise Qlik Sense core services, such as the QMC or Hub.
Environment
- Qlik Sense Enterprise on Windows
- Qlik Cloud Analytics: Outer set expression with empty selection
An outer set expression returns an empty selection.
Examples:
Table 1 has source data that contains a row for Beta. The inner set and outer set behave identically.
Table 2 has source data that does not include a row for Beta. The inner set and outer set behave differently.
Resolution
To make this work as expected, add “&” to the beginning of the set expression.
Example:
{&<group1={'Beta'}>} sum( {&<Company1={'A'}>} salary1)
Cause
This behavior is caused by how sets are combined when using multiple set expressions.
If the outer set expression produces an empty set, it is ignored when the inner set expression is evaluated.
The result is that only the inner set expression is used.
Internal Investigation ID(s)
SUPPORT-3523
Environment
- Qlik Cloud Analytics
- Qlik Talend Product: Error 400 - Invalid SNI error after installing the Talend Runtime 2025-02 patch
You may encounter a 400 - Invalid SNI error when calling the Talend Runtime API (Job as a service) after installing the 2025-02 patch or later. Before the R2025-02 patch of Talend Runtime Server, using the same certificate for the SSL connection with Talend Runtime Server worked without issue.
SNI validation is active as of the 2025-02 patch.
Resolution
There are three options to solve this issue:
- Obtain and install a proper certificate that references the correct host name and then access it with the hostname rather than by IP.
- Disable SNI host check.
- Tell Talend component to resolve IP as hostname
Disable SNI Host Check
This has the same security risk as Jetty before it was updated (low security).
In the <RuntimeInstallationFolder>/etc/org.ops4j.pax.web.cfg file, add
jetty.ssl.sniRequired=false
and
jetty.ssl.sniHostCheck=false
Alternatively, configure these Jetty parameters in the <RuntimeInstallationFolder>/etc/jetty.xml or jetty-ssl.xml file:
- Find the <New class="org.eclipse.jetty.server.SecureRequestCustomizer"> block in your jetty.xml or jetty-ssl.xml
- Edit the <Arg name="sniRequired" ...> and <Arg name="sniHostCheck" ...> lines so that the properties' defaults are set to false as shown below:
<New id="sslHttpConfig" class="org.eclipse.jetty.server.HttpConfiguration">
  <Arg><Ref refid="httpConfig"/></Arg>
  <Call name="addCustomizer">
    <Arg>
      <New class="org.eclipse.jetty.server.SecureRequestCustomizer">
        <Arg name="sniRequired" type="boolean">
          <Property name="jetty.ssl.sniRequired" default="false"/>
        </Arg>
        <Arg name="sniHostCheck" type="boolean">
          <Property name="jetty.ssl.sniHostCheck" default="false"/>
        </Arg>
        <Arg name="stsMaxAgeSeconds" type="int">
          <Property name="jetty.ssl.stsMaxAgeSeconds" default="-1"/>
        </Arg>
        <Arg name="stsIncludeSubdomains" type="boolean">
          <Property name="jetty.ssl.stsIncludeSubdomains" default="false"/>
        </Arg>
      </New>
    </Arg>
  </Call>
</New>
Resolve IP to Hostname
If the certificate includes the domain name, you should use that domain name instead of the IP with the Jetty security updates in Talend Runtime Server.
However, if your DNS server does not resolve the hostname, you must call the service by its IP address, so check this first to see whether the workaround is feasible in your situation.
In the examples the hostname is unresolvedhost.net and the IP is 10.20.30.40.
Try this API call at the command line:
curl -k -X GET --resolve unresolvedhost.net:9001:10.20.30.40 https://unresolvedhost.net:9001/services/
or
curl -k -X GET -H "Host: unresolvedhost.net" https://10.20.30.40:9001/services/
If this works, then in the Talend component that makes the API call, go to the "Advanced settings" or "Headers" table and add a row with Key: Host and Value: the hostname that matches your SSL certificate (e.g. unresolvedhost.net).
This will instruct Talend to send the correct Host header, which most HTTP clients (including Java's HttpClient) will also use as the SNI value during the TLS handshake.
Cause
The SNI enforcement is there for a security reason. With the 2025-02 patch, the Jetty components on Talend Runtime Server resolved a CVE security issue in which a client was allowed to connect to a server using a hostname that doesn't match the hostname in the server's TLS certificate.
Certificates require the URI not to be localhost or an IP address, and to have at least one dot, so a fully qualified domain name is best.
Related Content
Environment
- Talend Cloud and AWS System Manager Parameter Store for Talend context variables
AWS System Manager (SM), an AWS service, can be used to view and control infrastructures on AWS. It offers automation documents to simplify common maintenance and deployment tasks of AWS resources.
AWS SM consists of a collection of capabilities related to automation, such as infrastructure maintenance and deployment tasks of AWS resources as well as some related to Application Management and Configuration. Among them, is a capability called Parameter Store.
AWS System Manager Parameter Store
AWS Systems Manager (SM) Parameter Store provides secure, hierarchical storage for configuration data management and secrets management.
It allows you to store data such as passwords, database strings, and license codes as parameter values.
AWS SM Parameter Store benefits
Parameter Store offers the following benefits and features for Talend Jobs.
- Secured, highly scalable, hosted service with NO SERVERS to manage: compared to the setup of a dedicated database to store Job context variables.
- Control access at granular levels: specify who can access a specific parameter or set of parameters (for example, a DB connection) at the user or group level. Using IAM roles, you can restrict access to parameters, which can have nested paths that can be used to define ACL-like access constraints. This is important for controlling access to Production environment parameters.
- Audit access: track the last user who created or updated a specific parameter value.
- Encryption of data at rest and in transit: parameter values can be stored as plaintext (unencrypted data) or ciphertext (encrypted data). For encrypted values, KMS (AWS Key Management Service) is used behind the scenes. Hence, Talend context variables with a Password type can be stored and retrieved securely without the implementation of a dedicated encryption/decryption process.
Another benefit of the AWS SM Parameter Store is its usage cost.
AWS SM Parameter Store pricing
AWS SM Parameter Store consists of standard and advanced parameters.
Standard parameters are available at no additional charge. The values are limited to 4 KB size, which should cover the majority of Talend Job use cases.
With advanced parameters (8 KB size), you are charged based on the number of advanced parameters stored each month and per API interaction.
Pricing example
Assume you have 5,000 parameters, of which 500 are advanced. Assume that you have enabled higher throughput limits and interact with each parameter 24 times per day, equating to 3,600,000 interactions per 30-day month. Because you have enabled higher throughput, your API interactions are charged for standard and advanced parameters. Your monthly bill is the sum of the cost of the advanced parameters and the API interactions, as follows:
- Cost of 500 advanced parameters = 500 * $0.05 per advanced parameter = $25
- Cost of 3.6M API interactions = 3.6M * $0.05 per 10,000 interactions = $18
- Total monthly cost = $25 + $18 = $43
For more information on pricing, see the AWS Systems Manager pricing web site.
About parameters
A Parameter Store parameter is any piece of configuration data, such as a password or connection string, that is saved in the Store. You can centrally and securely reference this data in a Talend Job.
The Parameter Store provides support for three types of parameters:
- String
- String List
- Secure String
Organizing parameters into hierarchies
In Talend, context variables are stored as a list of key-value pairs independent of the physical storage (Job, file, or database). Managing numerous parameters as a flat list is time-consuming and prone to errors. It can also be difficult to identify the correct parameter for a Talend Project or Job. This means you might accidentally use the wrong parameter, or you might create multiple parameters that use the same configuration data.
Parameter Store allows you to use parameter hierarchies to help organize and manage parameters. A hierarchy is a parameter name that includes a path that you define by using forward slashes (/).
The following example uses three hierarchy levels in the name:
/Dev/PROJECT1/max_rows
AWS SM Parameter Store with Talend Job
Parameter Store can be accessed from the AWS Console, the AWS CLI, or the AWS SDKs, including Java. Talend Studio leverages the AWS Java SDK to connect to numerous Amazon services, but, as yet, not to AWS Systems Manager.
Implementation of AWS SM Parameter Store connector
This initial implementation solely uses the current capabilities of Studio, such as Routines and Joblets.
A future version will leverage the Talend Component Development Kit (CDK) to build a dedicated connector for AWS System Manager.
Routine
The connector was developed in Java using the AWS SDK and exported as an uber JAR (a single JAR with all of its dependencies embedded in it).
The AWSSSMParameterStore-1.0.0.jar file (attached to this article) is imported into the Studio local Maven Repository and then used as a dependency in the AwsSSMParameterStore Talend routine.
The routine provides a set of high-level APIs/functions of the Parameter Store for Talend Jobs.
package routines;

import java.util.Map;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

import com.talend.ps.engineering.AWSSMParameterStore;

public class AwsSSMParameterStore {

    private static final Log LOG = LogFactory.getLog(AwsSSMParameterStore.class);

    private static AWSSMParameterStore paramsStore;

    /*
     * init
     *
     * Create an AWSSMParameterStore client based on the credentials parameters.
     * Follows the "Default Credential Provider Chain".
     * See https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html
     *
     * Parameters:
     *   accessKey : (Optional) AWS Access Key
     *   secretKey : (Optional) AWS Secret Key
     *   region    : (Optional) AWS Region
     *
     * Return:
     *   Boolean : False if invalid combination of parameters
     */
    public static boolean init(String accessKey, String secretKey, String region) { ... }

    /*
     * loadParameters
     *
     * Retrieve all the parameters recursively with the path as a prefix in their name
     *
     * Parameters:
     *   path : Parameter path prefix for the parameters
     *
     * Return:
     *   Map of name, value pairs of parameters
     */
    public static Map<String, String> loadParameters(String path) { ... }

    /*
     * saveParameter
     *
     * Save a parameter name and value in the Parameter Store
     *
     * Parameters:
     *   name    : Name of the parameter
     *   value   : Value of the parameter
     *   encrypt : Encrypt the value in the Parameter Store
     *
     * Return:
     *   Boolean : False if the save failed
     */
    public static boolean saveParameter(String name, Object value, boolean encrypt) { ... }
}
The init function creates the connector to AWS SSM using the AWS Default Credential Provider Chain.
The loadParameters function connects to the Parameter Store and retrieves a set/hierarchy of parameters prefixed with a specific path (see the naming convention for the parameters below).
The result is returned as a Map of key-value pairs.
Important: In the returned Map, the key represents only the last part of the parameter name path. If the parameter name is: /Dev/PROJECT1/max_rows, the returned Map key for this parameter is max_rows.
The saveParameter function allows you to save a context parameter name and value (derived from a context variable) to the Parameter Store.
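As a rough illustration of how these routine functions might be called together (for example from a tJava component), see the sketch below. The region, paths, and values are illustrative assumptions; only init, saveParameter, and loadParameters come from the routine above, and the parameter paths follow the naming convention introduced in the next section.

// Hypothetical tJava snippet; only the routines.AwsSSMParameterStore calls are taken
// from the routine above, everything else (region, paths, values) is illustrative.
// Passing nulls for the keys relies on the AWS Default Credential Provider Chain.
if (!routines.AwsSSMParameterStore.init(null, null, "eu-west-1")) {
    throw new RuntimeException("Could not initialize the AWS SSM Parameter Store client");
}

// Store a plain value and an encrypted (SecureString) value.
routines.AwsSSMParameterStore.saveParameter("/talend/dev/PROJECT1/max_rows", "1000", false);
routines.AwsSSMParameterStore.saveParameter("/talend/dev/PROJECT1/db_password", "s3cr3t", true);

// Read everything back for this project/environment; the Map keys are the last
// path segment of each parameter name (for example "max_rows").
java.util.Map<String, String> params = routines.AwsSSMParameterStore.loadParameters("/talend/dev/PROJECT1");
System.out.println(params.get("max_rows"));   // prints 1000 if the calls above succeeded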
Joblets
Two Joblets were developed to connect to the AWS Parameter Store through the routine. One is designed to initialize the context variables of a Job using the parameters from the AWS Parameter Store. The other serves as a utility for a Job to store its context variables in the Parameter Store.
Joblet: SaveContextVariableToAwsSSMParameterStore
The Joblet uses a tContextDump component to generate the context variables dataset with the standard key-value pair schema.
The tJavaFlex component is used to connect to the Parameter Store and save the context variables as parameters with a specific naming convention.
Parameter hierarchies naming convention for Talend context variables
For context variables, the choice is to use an optional root prefix, /talend/, to avoid any potential collision with existing parameter names.
The prefix is appended with a string representing a runtime environment, for example, dev, qa, and prod. This mimics the concept of the context environment found in the Job Contexts.
The parameter name is then appended with the name of the Talend Project (which is extracted from the Job definition) and, finally, the name of the variable.
Parameter naming convention:
/talend/<environment name>/<talend project name>/<context variable name>
Example Job: job1 with a context variable ctx_var1 in a Talend Project PROJECT1.
The name of the parameter for the ctx_var1 variable in a development environment (identified by dev), is:
/talend/dev/PROJECT1/ctx_var1
For a production environment, prod, the name is:
/talend/prod/PROJECT1/ctx_var1
One option is to use the Job name as well in the hierarchy of the parameter name:
/talend/prod/PROJECT1/job1/ctx_var1
However, because Talend Metadata connections, Context Groups, and other items are shared across multiple Jobs, using the Job name would result in multiple references to the same context variable in the Parameter Store.
Moreover, if a value in the Context Group changes, the value needs to be updated in all the parameters for this context variable, which defeats the purpose of the context group.
Joblet context variables
The Joblet uses a dedicated context group specific to the interaction with the Parameter Store.
- AWS Access & Secret keys to connect to AWS. As mentioned earlier, the routine leverages the AWS Default Credential Provider Chain. If these variables are not initialized, the SDK looks for environment variables, the ~/.aws/credentials file (in the user directory on Windows), or EC2 roles to infer the right credentials.
- AWS region of the AWS SM Parameter Store.
- Parameter Store prefix and environment used in the parameter path, as described above in the naming convention.
Joblet: LoadContextVariablesFromAwsSSMParmeterStore
The second Joblet is used to read parameters from the Parameter Store and update the Job context variables.
The Joblet uses a tJavaFlex component to connect to SSM Parameter Store, leveraging the AwsSSMParameterStore.loadParameters routine function described above. It retrieves all the parameters based on the prefix path (see the defined naming convention above).
The tContextLoad component uses the tJavaFlex output key-value dataset to overwrite the default values of the context variables.
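For orientation, here is a hedged sketch of how the tJavaFlex sections of such a load Joblet might look. It assumes an output flow named row1 with the usual key/value String schema expected by tContextLoad, and Joblet context variables named ssm_access_key, ssm_secret_key, ssm_region, ssm_prefix, and ssm_environment; the actual Joblet attached to the article may be organized differently.

// --- Hypothetical tJavaFlex Start code: initialize the client, fetch the parameters
// once, and open a loop over them (names are illustrative assumptions).
routines.AwsSSMParameterStore.init(context.ssm_access_key, context.ssm_secret_key, context.ssm_region);
java.util.Map<String, String> params = routines.AwsSSMParameterStore.loadParameters(
        context.ssm_prefix + "/" + context.ssm_environment + "/PROJECT1");   // project name hard-coded for illustration
for (java.util.Map.Entry<String, String> entry : params.entrySet()) {

// --- Hypothetical tJavaFlex Main code: one output row per parameter, matching the
// key/value schema that tContextLoad expects.
    row1.key = entry.getKey();     // last segment of the parameter name, e.g. max_rows
    row1.value = entry.getValue();

// --- Hypothetical tJavaFlex End code: close the loop opened in the Start code.
}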
Joblet context variables
The load Joblet uses the same context group as the save counterpart.
Sample Talend Job
The sample Talend Job generates a simple people dataset using the tRowGenerator component (first name, last name, and age), applies some transformations, and segregates the rows by age to create two distinct datasets, one for Adults (age > 18) and one for Teenagers.
The two datasets are then inserted into a MySQL database in their respective tables.
The Job contains a mix of context variables, some are coming from a group defined for the MySQL Metadata Connection and some are specific to the Job: max_rows, table_adults, and table_teenagers.
Create Parameter Store entries for the context variables
The first step is to create all the parameters in the Parameter Store for the Job context variables. This can be done using the AWS console or through the AWS CLI, but those methods can be time-consuming and error-prone.
Instead, use the dedicated SaveContextVariableToAwsSSMParameterStore Joblet.
You need to drag-and-drop the Joblet into the Job canvas. There is no need to connect it to the rest of the Job components. It lists all the context variables, connects to AWS SM Parameter Store, creates the associated parameters, and stops the Job.
When the Job is executed, the System Manager Parameter Store web console should list the newly created parameters.
On the AWS console, the first column is not resizable; to see the full name of a parameter, you'll need to hide some of the columns.
You can also click a specific parameter to see the details.
For context variables defined with a Password type, the associated parameter is created as SecureString, which allows the value to be encrypted at rest in the store.
On the security side, IAM access control can be leveraged to restrict access to a specific operations team or to restrict access to a specific set of parameters, such as the production parameters under /talend/prod/*. Developers will then have access solely to the dev environment-related parameters, for example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        // Allows to decrypt secret parameters
        "kms:Decrypt",
        "ssm:DescribeParameters"
      ],
      "Resource": "*"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "ssm:PutParameter",
        "ssm:LabelParameterVersion",
        "ssm:DeleteParameter",
        "ssm:GetParameterHistory",
        "ssm:GetParametersByPath",
        "ssm:GetParameters",
        "ssm:GetParameter",
        "ssm:DeleteParameters"
      ],
      // Grant access only to dev parameters
      "Resource": "arn:aws:ssm:AWS-Region:AWS-AccountId:parameter/talend/dev/*"
    }
  ]
}
Talend Cloud Job
In the context of a Talend Cloud Job/Task, the context variables don't need to be exported as connections or resources for Talend Cloud as they are initialized from the AWS Parameter Store.
You only need to create a connection for the AWS SM Parameter Store credentials and config parameters.
Custom connection for AWS SM Parameter Store
The context group for the AWS SM Parameter Store is externalized as a Talend Cloud Custom Connection because, as yet, Talend Cloud doesn't have a native connector for AWS System Manager.
Talend Cloud Task
In Studio, you create a new Talend Cloud task by publishing the Job artifact to the cloud.
You'll then add the custom connection for AWS SM.
The additional context variables are exposed as advanced parameters, including the database connection parameters that are initialized from the Parameter Store.
A successful task execution on a cloud or Remote Engine means that the Job can connect to AWS SM, retrieve the parameters based on the naming convention set above, and initialize the corresponding context variables to allow the Job to connect to the MySQL database and create the requested tables.
- Upgrading and unbundling the Qlik Sense Repository Database using the Qlik PostgreSQL Installer (QPI)
In this article, we walk you through the requirements and process of how to upgrade and unbundle an existing Qlik Sense Repository Database (see supported scenarios) as well as how to install a brand new Repository based on PostgreSQL. We will use the Qlik PostgreSQL Installer (QPI).
For a manual method, see How to manually upgrade the bundled Qlik Sense PostgreSQL version to 12.5 version.
Using the Qlik Postgres Installer not only upgrades PostgreSQL; it also unbundles PostgreSQL from your Qlik Sense Enterprise on Windows install. This allows for direct control of your PostgreSQL instance and facilitates maintenance without a dependency on Qlik Sense. Further Database upgrades can then be performed independently and in accordance with your corporate security policy when needed, as long as you remain within the supported PostgreSQL versions. See How To Upgrade Standalone PostgreSQL.
Index
- Supported Scenarios
- Upgrades
- New installs
- Requirements
- Known limitations
- Installing a new Qlik Sense Repository Database using PostgreSQL
- Qlik PostgreSQL Installer - Download Link
- Upgrading an existing Qlik Sense Repository Database
- The Upgrade
- Next Steps and Compatibility with PostgreSQL installers
- How do I upgrade PostgreSQL from here on?
- Troubleshooting and FAQ
- Related Content
Video Walkthrough
Video chapters:
- 01:02 - Intro to PostgreSQL Repository
- 02:51 – Prerequisites
- 03:24 - What is the QPI tool?
- 05:09 - Using the QPI tool
- 09:27 - Removing the old Database Service
- 11:27 - Upgrading a stand-alone to the latest release
- 13:39 - How to roll-back to the previous version
- 14:46 - Troubleshooting upgrading a patched version
- 18:25 - Troubleshooting upgrade security error
- 21:15 - Additional config file settings
Supported Scenarios
Upgrades
The following versions have been tested and verified to work with QPI (1.4.0):
Qlik Sense February 2022 to Qlik Sense November 2023.
If you are on a Qlik Sense version prior to these, upgrade to at least February 2022 before you begin.
Qlik Sense November 2022 and later do not support PostgreSQL 9.6, and a warning will be displayed during the upgrade. From Qlik Sense August 2023, an upgrade with a 9.6 database is blocked.
New installs
The Qlik PostgreSQL Installer supports installing a new standalone PostgreSQL database with the configurations required for connecting to a Qlik Sense server. This allows setting up a new environment or migrating an existing database to a separate host.
Requirements
- Review the QPI Release Notes before you continue
- Using the Qlik PostgreSQL Installer on a patched Qlik Sense version can lead to unexpected results. If you have a patch installed, either:
- Uninstall all patches before using QPI (see Installing and Uninstalling Qlik Sense Patches) or
- Upgrade to an IR release of Qlik Sense which supports QPI
- The PostgreSQL Installer can only upgrade a bundled PostgreSQL database listening on the default port 4432.
- The user who runs the installer must be an administrator.
- The backup destination must have sufficient free disk space to dump the existing database
- The backup destination must not be a network path or virtual storage folder. It is recommended the backup is stored on the main drive.
- There will be downtime during this operation, please plan accordingly
- If upgrading to PostgreSQL 14 and later, the Windows OS must be at least Server 2016
Known limitations
- Cannot migrate a 14.8 embedded database to a standalone
- Using QPI to upgrade a standalone database or a database previously unbundled with QPI is not supported.
- The installer itself does not provide an automatic rollback feature.
Installing a new Qlik Sense Repository Database using PostgreSQL
- Run the Qlik PostgreSQL Installer as an administrator
- Click on Install
- Accept the Qlik Customer Agreement
- Set your Local database settings and click Next. You will use these details to connect other nodes to the same cluster.
- Set your Database superuser password and click Next
- Set the database installation folder, default: C:\Program Files\PostgreSQL\14
Do not use the standard Qlik Sense folders, such as C:\Program Files\Qlik\Sense\Repository\PostgreSQL\ and C:\Programdata\Qlik\Sense\Repository\PostgreSQL\.
- Set the database data folder, default: C:\Program Files\PostgreSQL\14\data
Do not use the standard Qlik Sense folders, such as C:\Program Files\Qlik\Sense\Repository\PostgreSQL\ and C:\Programdata\Qlik\Sense\Repository\PostgreSQL\.
- Review your settings and click Install, then click Finish
- Start installing Qlik Sense Enterprise Client Managed. Choose the Join Cluster option.
The Qlik PostgreSQL Installer has already seeded the databases for you and has created the users and permissions. No further configuration is needed.
- The tool will display information on the actions being performed. Once installation is finished, you can close the installer.
If you are migrating your existing databases to a new host, please remember to reconfigure your nodes to connect to the correct host. How to configure Qlik Sense to use a dedicated PostgreSQL database
Qlik PostgreSQL Installer - Download Link
Download the installer here.
Qlik PostgreSQL installer Release Notes
Upgrading an existing Qlik Sense Repository Database
The following versions have been tested and verified to work with QPI (1.4.0):
February 2022 to November 2023.
If you are on any version prior to these, upgrade to at least February 2022 before you begin.
Qlik Sense November 2022 and later do not support PostgreSQL 9.6, and a warning will be displayed during the upgrade. From Qlik Sense August 2023, an upgrade with a 9.6 database is blocked.
The Upgrade
- Stop all services on rim nodes
- On your Central Node, stop all services except the Qlik Sense Repository Database
- Run the Qlik PostgreSQL Installer. An existing Database will be detected.
- Highlight the database and click Upgrade
- Read and confirm the (a) Installer Instructions as well as the Qlik Customer Agreement, then click (b) Next.
- Provide your existing Database superuser password and click Next.
- Define your Database backup path and click Next.
- Define your Install Location (default is prefilled) and click Next.
- Define your database data path (default is prefilled) and click Next.
- Review all properties and click Upgrade.
The review screen lists the settings which will be migrated. No manual changes are required post-upgrade.
- The upgrade is completed. Click Close.
- Open the Windows Services Console and locate the Qlik Sense Enterprise on Windows services.
You will find that the Qlik Sense Repository Database service has been set to manual. Do not change the startup method.
You will also find a new postgresql-x64-14 service. Do not rename this service.
- Start all services except the Qlik Sense Repository Database service.
- Start all services on your rim nodes.
- Validate that all services and nodes are operating as expected. The original database folder remains at C:\ProgramData\Qlik\Sense\Repository\PostgreSQL\X.X_deprecated.
- Uninstall the old Qlik Sense Repository Database service.
This step is required. Failing to remove the old service will lead to upgrade or patching issues.
- Open a Windows File Explorer and browse to C:\ProgramData\Package Cache
- From there, search for the appropriate msi file.
If you were running 9.6 before the upgrade, search PostgreSQL.msi
If you were running 12.5 before the upgrade, search PostgreSQL125.msi
- The msi will be revealed.
- Right-click the msi file and select uninstall from the menu.
- Re-install the PostgreSQL binaries. This step is optional if Qlik Sense is immediately upgraded following the use of QPI. The Sense upgrade will install the correct binaries automatically.
Failing to reinstall the binaries will lead to errors when executing any number of service configuration scripts.
If you do not immediately upgrade:
- Open a Windows File Explorer and browse to C:\ProgramData\Package Cache
- From there, search for the .msi file appropriate for your currently installed Qlik Sense version
For Qlik Sense August 2023 and later: PostgreSQL14.msi
Qlik Sense February 2022 to May 2023: PostgreSQL125.msi
- Right-click the file
- Click Open file location
- Highlight the file path, right-click on the path, and click Copy
- Open a Windows Command prompt as administrator
- Navigate to the location of the folder you copied
Example command line:
cd C:\ProgramData\Package Cache\{GUID}
Where GUID is the value of the folder name.
- Run the following command depending on the version you have installed:
Qlik Sense August 2023 and later
msiexec.exe /qb /i "PostgreSQL14.msi" SKIPINSTALLDBSERVICE="1" INSTALLDIR="C:\Program Files\Qlik\Sense"
Qlik Sense February 2022 to May 2023
msiexec.exe /qb /i "PostgreSQL125.msi" SKIPINSTALLDBSERVICE="1" INSTALLDIR="C:\Program Files\Qlik\Sense"
This will re-install the binaries without installing a database. If you installed with a custom directory, adjust the INSTALLDIR parameter accordingly. For example, if you installed in D:\Qlik\Sense, the parameter would be INSTALLDIR="D:\Qlik\Sense".
- Finalize the process by updating the references to the PostgreSQL binaries paths in the SetupDatabase.ps1 and Configure-Service.ps1 files. For detailed steps, see Cannot change the qliksenserepository password for microservices of the service dispatcher: The system cannot find the file specified.
If the upgrade was unsuccessful and you are missing data in the Qlik Management Console or elsewhere, contact Qlik Support.
Next Steps and Compatibility with PostgreSQL installers
Now that your PostgreSQL instance is no longer connected to the Qlik Sense Enterprise on Windows services, all future updates of PostgreSQL are performed independently of Qlik Sense. This allows you to act in accordance with your corporate security policy when needed, as long as you remain within the supported PostgreSQL versions.
Your PostgreSQL database is fully compatible with the official PostgreSQL installers from https://www.enterprisedb.com/downloads/postgres-postgresql-downloads.
How do I upgrade PostgreSQL from here on?
See How To Upgrade Standalone PostgreSQL, which documents the upgrade procedure for either a minor version upgrade (example: 14.5 to 14.8) or a major version upgrade (example: 12 to 14). Further information on PostgreSQL upgrades or updates can be obtained from Postgre directly.
Troubleshooting and FAQ
- If the installation crashes, the server reboots unexpectedly during this process, or there is a power outage, the new database may not be in a serviceable state. Installation/upgrade logs are available in the location of your temporary files, for example:
C:\Users\Username\AppData\Local\Temp\2
A backup of the original database contents is available in your chosen location, or by default in:
C:\ProgramData\Qlik\Sense\Repository\PostgreSQL\backup\X.X
The original database data folder has been renamed to:
C:\ProgramData\Qlik\Sense\Repository\PostgreSQL\X.X_deprecated
- Upgrading Qlik Sense after upgrading PostgreSQL with the QPI tool fails with:
This version of Qlik Sense requires a 'SenseServices' database for multi cloud capabilities. Ensure that you have created a 'SenseService' database in your cluster before upgrading. For more information see Installing and configuring PostgreSQL.
See Qlik Sense Upgrade fails with: This version of Qlik Sense requires a _ database for _.
To resolve this, start the postgresql-x64-XX service.
The information in this article is provided as-is and is to be used at your own discretion. Depending on tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support. The video in this article was recorded with an earlier version of QPI; some screens might differ slightly.
Related Content
Qlik PostgreSQL installer version 1.3.0 Release Notes
Techspert Talks - Upgrading PostgreSQL Repository Troubleshooting
Backup and Restore Qlik Sense Enterprise documentation
Migrating Like a Boss
Optimizing Performance for Qlik Sense Enterprise
Qlik Sense Enterprise on Windows: How To Upgrade Standalone PostgreSQL
How-to reset forgotten PostgreSQL password in Qlik Sense
How to configure Qlik Sense to use a dedicated PostgreSQL database
Troubleshooting Qlik Sense Upgrades
- Where to find and how to download Qlik Sense app demos?
You can access Qlik Sense demo apps from https://demos.qlik.com/.
If you are looking for real-life business examples, our Qlik Gallery hosts a platform meant for sharing apps, outcomes, and ideas. It is not restricted to Qlik-provided demos and mostly consists of customer examples.
If you are looking to download demo apps:
- Go to https://demos.qlik.com/
- Select Downloadable from the available Tags
- Open the App
- A Download button will be available (if logged in)
- Qlik Talend Data Integration: java.lang.NoClassDefFoundError: Could not initialize class org.talend.components.common.oauth.Jwt
When attempting to deploy a Route in Talend Runtime, the following error is encountered:
java.lang.NoClassDefFoundError: Could not initialize class org.talend.components.common.oauth.Jwt
Caused by: java.lang.NoClassDefFoundError: org/codehaus/jackson/map/ObjectMapper
Cause
Missing bundles for the Jackson library.
Resolution
Launch the Karaf console (./trun) located in the <Runtime-Home>\bin directory, and then execute the following command:
list | grep Jackson
If the following bundles are not installed, please install them using the commands below:
bundle:install -s mvn:org.codehaus.jackson/jackson-mapper-asl/1.9.16-TALEND
bundle:install -s mvn:org.codehaus.jackson/jackson-core-asl/1.9.16-TALEND
Alternatively, you can install the feature that encompasses all necessary bundles by running:
feature:install camel-jackson-avro
Environment
- Talend Studio R2025-02
- Talend Runtime R2025-02
- Azul Java 17
- Using Qlik Application Automation to create and distribute Excel reports in Office 365
With Qlik Application Automation, you can get data out of Qlik Cloud and distribute it to different users in formatted Excel. The workflow can be automated by leveraging the connectors for Office 365, specifically Microsoft SharePoint and Microsoft Excel.
Here I share two example Qlik Application Automation workspaces that you can use and modify to suit your requirements.
Content:
Video:
Considerations
- This example is built on distributing a SharePoint link. It is also possible to use attachments with the Mail block (see Creating a Qlik Reporting Service report).
- Qlik Application Automation has a limit of 100,000 rows when getting data out of a Qlik Sense straight table object.
- The On-Demand example uses an extension in QSE SaaS to send data to the Automation. An update to the Qlik Sense Button object is expected soon, which will provide a native way to pass selections to an Automation.
Example 1: Scheduled Reports
- Download the 'Scheduled Report.json' file attached to this document.
- Create a new Automation in QSE SaaS, give it a name, and then upload the workspace you just downloaded by right clicking in the editor canvas, and selecting 'Upload workspace'.
- Select the 'Create Binary File (Personal One Drive)' block, select 'Connection' in the block configurator to the right, and then create your connection to Microsoft SharePoint.
- Select the 'Get Straight Table Data' block. Under 'Inputs' in the block configurator, look up the App Id, Sheet Id, and Object Id for the relevant QSE SaaS table you wish to output.
- Select the 'Create Excel Table With Headers' block, select 'Connection' in the block configurator, and then create your connection to Microsoft Excel.
- Select the 'Send Mail' block. Under 'Inputs' in the block configurator update the 'To' to reflect the addresses you wish to deliver to.
- With the 'Send Mail' block still selected, select 'Connection' in the block configurator and add your Sender details.
- To test, Save and then Run the Automation
- If you receive any warnings or errors, navigate to the relevant blocks and ensure your Connection is selected in the block configurator.
- Select the 'Start' block. Under 'Inputs' in the block configurator, change Run Mode to Scheduled and define your required schedule.
Example 2: On-Demand Reports
Note - These instructions assume you have already created connections as required in Example 1.
- Download the 'On-Demand Report v3.json' file attached to this document.
- Download and install the 'qlik-blends' extension. See:
https://github.com/rileymd88/qlik-blends/files/6378232/qlik-blends.zip
- Create a new Automation in QSE SaaS, give it a name, and then upload the workspace you just downloaded by right-clicking in the editor canvas, and selecting 'Upload workspace'.
- Ensure your Connections are selected in the block configurator for each of the following blocks, 'Create Binary File (Personal One Drive)', 'Create Excel Table With Headers', 'Add Rows To Excel Worksheet Table (Batch)', 'Create Sharing Link', and 'Send Mail'. Save the Automation.
- Select the 'Start' block and ensure Run Mode is set to Triggered. Make note of the URL and Execution Token shown in the POST example.
- Open your chosen QSE SaaS application, and Edit the Sheet where you wish to add a Button to trigger an On-Demand report.
- Under 'Custom Objects' look for 'qlik-blends' from the Extensions menu and drag this into your Sheet.
- Under the 'Blend' properties to the right, add in your POST webhook URL and Token as noted in Step 5.
- We will now add three measures to the 'qlik-blends' object. It is important you add them in the order described. Add the first measure, using the following function in the expression editor: GetCurrentSelections()
- Add the second measure, using the following function DocumentName()
- The final measure will be the Object ID of the table you wish to use. To find the Object ID, select 'Done Editing'. Then right click on the table, select 'Share', select 'Embed', then look for the Object ID under the preview. Copy this value, go back into Editing mode and paste this as your third measure value.
- With the 'qlik-blends' object selected, under Form select 'Add items'. For 'Item type' select Text. Under default value you can choose to add a default email address. For 'Label' and 'Reference' type 'Email'. It is critical that Reference is updated to 'Email'. Turn 'Required input' on.
- You can change the Appearance properties to suit your preferences, such as updating the Button label and message, enabling Dialog, and changing the Color under Theme.
- Back in the Automation, select the 'Start' block and set 'Run asynchronously' to yes to allow multiple requests to run at the same time (this also increases the maximum run time from 1 minute to 60 minutes).
- Once you are happy with the configuration, test the On-Demand report by entering an email address and clicking the button.
This On-Demand Report Automation can be used across multiple apps and tables. Simply copy the extension object between apps & sheets, and update the Object ID (Measure 3) for each instance.
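If you want to verify the triggered webhook outside of the extension, you can make a quick manual call using the values noted from the 'Start' block. Treat the line below as a hedged sketch rather than a confirmed call: copy the exact URL and token from your own Start block's POST example, and check the header name (shown here as X-Execution-Token) against that example. When the button is clicked, the qlik-blends extension itself sends the measure values and the Email form field as the request body.
curl -X POST "<URL from the Start block>" -H "X-Execution-Token: <token from the Start block>"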
Environment
- Qlik Application Automation
- Qlik Cloud
- Microsoft Office 365
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution described may not be provided by Qlik Support.
-
Qlik Talend Product Q and A: How To Optimize Campaign Performance in Talend Data...
Question I
Is there any limitation on Talend Data Stewardship storage space for campaigns/tasks creation?
For Talend Cloud Data Stewardship, the Qlik Talend Help Documentation (managing-tasks) includes this warning: "You can store up to 20 GB of tasks in Talend Cloud Data Stewardship per account".
The 20 GB task size is measured by MongoDB stats on the tds_tasks collection.
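As a rough way to check the current usage (a minimal sketch, assuming direct mongosh access to the MongoDB instance backing Talend Data Stewardship; the collection name is taken from the note above):
db.tds_tasks.stats()   // inspect the "size" and "storageSize" fields, reported in bytes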
For on-premises Talend Data Stewardship, there is no hard or soft limit on storage by default, as long as the Talend Data Stewardship server has sufficient storage. For performance reasons, make sure to regularly clean up old or unused campaigns and tasks to release storage.
Question II
Where can the tds.tasks.storage.limit.mode configuration be viewed, and how is the limit defined?
There are 3 ways to define the limit:
- For Cloud:
- Talend Data Stewardship leverages the platform tpsv-config service to define the maximum size. To do so, a property named tds_tasks_max_storage can be configured for a given tenant.
- If this property is not defined, a default value of 20GB is applied.
- Manual setting (on-prem/hybrid)
- Set the limit mode to manual setting:
tds.tasks.storage.limit.mode=manualSetting
- Define the limit via the property (the default value is 20GB):
tds.tasks.storage.limit.value=
- No limit
tds.tasks.storage.limit.mode=disabled
This is the default configuration for on-prem/hybrid; no extra setting is needed.
Question III
How to optimize campaign performance in Talend Data Stewardship?
To optimize your campaign performance in Talend Data Stewardship, you need to, at least, make sure the number of tasks in the campaign does not exceed 100,000.
The performance of a campaign can be impacted by various elements. When a campaign contains more than 100,000 tasks, a warning is displayed next to the campaign name in the Campaigns tab or in the Tasks tab. The Campaign Owner needs to, at least, reduce the number of tasks in the campaign below this threshold to remove the warning message. For more information, please refer to the Qlik Talend Help Documentation:
campaign-performance-limitations
In addition, here are some elements to take into consideration to improve campaign performance:
- The number of tasks
- The number of attributes in the data model: Talend recommends not exceeding 50 attributes
- The campaign type
- The number of task sources in a Grouping or Merging campaign
- The number and type of constraints
- The number of rules
- The number of campaigns depending on the same data model
Environment
- Talend Data Stewardship
-
Qlik Replicate: Mapping Oracle TIMESTAMP(6) WITH TIME ZONE to SQL Server Data Ty...
By default, the Oracle data type TIMESTAMP(6) WITH TIME ZONE is mapped to VARCHAR(38) in the SQL Server target when using Qlik Replicate. However, in some cases, you may prefer to preserve a more compatible datetime format on the SQL Server side. Below are two workarounds to achieve this:
Map to DATETIMEOFFSET(6) in SQL Server
You can map TIMESTAMP(6) WITH TIME ZONE to DATETIMEOFFSET(6) using the following transformation to trim the input:
substr($TZ, 1, 26)
This transformation trims the value to its first 26 characters, removing the time zone information and keeping six fractional-second digits.
For example, the source value "2025-04-18 14:43:06.000000000 +08:00" will become "2025-04-18 14:43:06.000000".
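To see why 26 is the right cut-off: the value consists of the date (10 characters), a space, the time (8 characters), the decimal point, and the fractional seconds, so characters 1 through 26 end exactly after the sixth fractional digit, matching the precision of DATETIMEOFFSET(6).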
Without applying this transformation, Qlik Replicate may raise an error:
Invalid character value specified for cast
Map to DATETIMEOFFSET(7) in SQL Server
To retain both the full precision and the time zone, map the Oracle data type to DATETIMEOFFSET(7) and use the following transformation:
substr($TZ, 1, 27) || substr($TZ, 30, 7)
This approach preserves both the 7-digit fractional seconds and the time zone.
For example, the Oracle source value "2025-04-18 14:43:06.000000000 +08:00" will be converted to "2025-04-18 14:43:06.0000000 +08:00" on the SQL Server side.
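Breaking the expression down against the same example: substr($TZ, 1, 27) returns characters 1 through 27 ("2025-04-18 14:43:06.0000000", i.e. seven fractional digits), substr($TZ, 30, 7) returns characters 30 through 36 (" +08:00"), and the two surplus fractional digits at positions 28 and 29 are dropped before the two pieces are joined with the || concatenation operator.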
Environment
- Qlik Replicate, all versions
- Oracle Server, all versions
- Microsoft SQL Server, all versions
-
Qlik Replicate: Microsoft Fabric warehouse Invalid object name
With more than one data warehouse on your Microsoft Fabric target endpoint, the task may fail to find the target table and will produce the following error:
[TARGET_APPLY ]T: RetCode: SQL_ERROR SqlState: 42S02 NativeError: 208 Message: [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Invalid object name 'Schema.Table'. Line: 1 Column: -1 [1022502] (ar_odbc_stmt.c:5090)
The task may default to a different data warehouse than the one specified in your MS Fabric endpoint settings, which prevents the task from finding the target tables.
Resolution
The Internal Parameter additionalConnectionProperties can be applied to the Microsoft Fabric endpoint to ensure the right data warehouse is used.
Set the value to: database=DataWarehouseName
Where DataWarehouseName is the name of the warehouse you are trying to use under the Database name field in your MS Fabric endpoint.
For more information about Internal Parameters and how to set them, see Qlik Replicate: How to set Internal Parameters and what are they for?
Cause
This is caused by defect SUPPORT-2305 and affects tasks that use a default data warehouse other than the one specified in your MS Fabric endpoint. The symptom can be identified when the following log line does not match the value entered in your MS Fabric Database name field:
[TARGET_APPLY ]T: ODBC database name: 'DifferentWarehouseName' (ar_odbc_conn.c:639)
Internal Investigation ID(s)
SUPPORT-2305
Environment
- Qlik Replicate
-
Qlik Replicate: Existing Tasks Fail to resume with 'Not Authorized to Use This F...
The following error is thrown when running a Qlik Replicate task without sufficient authorization on the required Function module:
[AT_GLOBAL ]E: java.lang.reflect.UndeclaredThrowableException com.sap.conn.jco.AbapException: (126) ERROR: ERROR Message 001 of class 00 type E, Par[1]: Not authorized to use this Function module java.lang.reflect.UndeclaredThrowableException at com.sun.proxy.$Proxy94.getTableList(Unknown Source)
[METADATA_MANAGE ]E: Failed to list datasets [1024719] (custom_endpoint_metadata.c:242)
[METADATA_MANAGE ]E: Failed to get the capture list from the endpoint [1024719] (metadatamanager.c:4527)
[TABLES_MANAGER ]E: Cannot get captured tables list [1024719] (tasktablesmanager.c:1267)
[TASK_MANAGER ]E: Build tables list failed [1024719] (replicationtask.c:2593)
[TASK_MANAGER ]E: Task 'TEST_2LIS_13_VAITM_DELTA' failed [1024719] (replicationtask.c:4020)
Resolution
Grant the necessary authorizations for /QTQVC/RFC to the communication user.
Cause
The Qlik Replicate user (specifically the communication user) lacks authorization to execute function modules under /QTQVC/RFC. These modules are essential for the replication process.
The missing role provides the necessary permissions to run these function modules, which are used by Qlik Replicate to fetch metadata and extract data through 2LIS_* extractors.
Environment
- Qlik Replicate
-
How to retrieve Qlik Replicate table-level DML statistics data
How can we get detailed table-level DML profiling data from Qlik Replicate?
Table-level DML profiling data can be retrieved by enabling the Store Changes option when creating a Qlik Replicate task. See Store Changes Settings for details.
Once set, DML data will be saved in the target DB's <target_table>__ct table. DML statistics data can then be profiled from this table using customized SQL queries.
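For example, a per-operation breakdown can be produced with a query along these lines (a minimal sketch rather than an official query: it assumes the default header__change_oper change-table column and a hypothetical target table named orders in schema dbo; adjust the names to your own task settings):
-- header__change_oper typically holds 'I' for inserts, 'U' for updates, and 'D' for deletes
SELECT header__change_oper, COUNT(*) AS dml_count
FROM dbo.orders__ct
GROUP BY header__change_oper;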
Environment
- Qlik Replicate
-
Where to find the Qlik Cloud status information and operational health data
Are you looking for status information on Qlik Cloud?
Qlik makes data on uptime and incidents publicly available on status.qlikcloud.com.
There, you are able to:
- See the current operational health, from fully operational, to degraded, to full outages
- Review a history of past incidents beyond what is currently happening (login required)
- Subscribe to updates (login required)
What do I do if I experience an issue that is not reported?
Please contact Qlik Support.
How do I log in to the Qlik Cloud status page?
- Navigate to status.qlikcloud.com
- Click Login
- From here, view historical information and subscribe to updates