You can open and build an app on Qlik Cloud using a direct link.
To obtain the link:
Qlik Cloud
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
Qlik offers a range of opportunities to assist you in troubleshooting, answering frequently asked questions, and contacting our experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
We're happy to help! Here's a breakdown of resources for each type of need.
| Support | Professional Services (*) |
| --- | --- |
| Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement. |
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Qlik Design Blog
The Design blog is all about Qlik products and solutions, covering scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
Q&A with Qlik
Live sessions with Qlik experts focused on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Qlik Fix
Qlik Fix is a series of short videos offering helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Contact Support on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
How to create a case using chat
Log in to manage and track your active cases in Manage Cases.
Your advantages:
If you require a support case escalation, you have two options:
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Qlik Cloud is a modern analytics and data platform built on the same software engine as QlikView and Qlik Sense Client-Managed, and it adds significant value by empowering everyone in an organization to make better decisions daily. Qlik Cloud gives you one common platform for all users: executives, decision-makers, and analysts.
Migrating to Qlik Cloud can help your organization:
This site provides you the tools to monitor, manage, and execute a migration from Client-Managed Qlik Sense to Qlik Cloud.
No two client-managed Qlik Sense Enterprise deployments are the same. And no two migrations will be the same. The processes, procedures, and instructions in this section shouldn’t be considered a cookbook. Rather, they’re meant to guide you.
The Qlik Cloud Migration Center provides a general approach to migration along with sequencing, strategy, and best practice recommendations. It also includes tools such as a Qlik Sense app, scripts, and worksheets to aid in planning elements of the migration.
If your organization has a complex deployment with custom tooling, or sophisticated or complicated data integration pipelines, consider contacting your Qlik Customer Support representative.
This site provides comparisons of QlikView and Qlik Cloud, as well as best practices on how to move content, including information about migration assessments and QlikView document conversions.
Qlik Enterprise Manager 2022.5.0.402
Microsoft SQL Server (MS-CDC) as a source endpoint
The Windows authentication and Database name fields are missing in Enterprise Manager when attempting to create the Microsoft SQL Server (MS-CDC) as a source endpoint. You will not be able to create the endpoint from this version of Enterprise Manager.
Create the Microsoft SQL Server (MS-CDC) endpoint from the Qlik Replicate console
To be fixed in Qlik Enterprise Manager version 2023.5.
RECOB-6602, RECOB-6154
If a Qlik Sense Desktop installation fails due to a software restriction policy, install it as a portable installation instead.
Remember to get authorization or approval from your organization's administration teams where necessary prior to installing software if the default installation failed due to policy restrictions.
A Replicate task enters a stopped state on its own once every few hours. When the task is resumed manually, the same behavior recurs within the next few hours. The log contains no information about why the task stopped.
The issue was caused by an application defect, which is fixed in version 2022.5.0.652 (SP3).
Component/Process: Kafka Target
Description: In rare scenarios, the task would end abnormally.
An environment may experience failing tasks after cloud disconnects or related communication issues. Reviewing the Qlik Replicate logs shows the error "Found more than one matching file for pattern", even though no files with those names are found in the directory.
Found more than one matching file for pattern ^CDC00000008.tmp(#[0-9]+)?$ [1022406] (filestream.c:640)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Failed to init Target File producer. [1022406] (file_imp.c:4794)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Could not create CSV writer on table id: 201. [1022406] (file_apply.c:218)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: create and insert file writer failed. [1022406] (file_apply.c:229)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Could not insert and create file writer for table ID: 201. [1022406] (file_apply.c:571)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Failed to create file writers for on directory: /attadhdata/data/attunity/AWS_PROD_OSO_PART1. [1022406] (file_apply.c:636)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Could not init data file writers. [1022406] (file_apply.c:688)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Error executing command [1022406] (streamcomponent.c:2024)
00027860: 2022-11-04T20:30:01 [TASK_MANAGER ]E: Stream component failed at subtask 0, component st_0_TGT_PROD_OSO_PART1 [1022406] (subtask.c:1396)
00027860: 2022-11-04T20:30:01 [TARGET_APPLY ]E: Stream component 'st_0_TGT_PROD_OSO_PART1' terminated [1022406] (subtask.c:1565)
00027857: 2022-11-04T20:30:01 [TASK_MANAGER ]W: Task 'PT_AWS_OSO_PART1' encountered a fatal error (repository.c:5794)
Start the task from timestamp to resolve the issue.
Patch to version 2021.5 to resolve the issue permanently.
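The pattern quoted in the error is an ordinary regular expression. As a minimal sketch (the directory listing below is illustrative, not taken from the affected system), a stale numbered temp file left behind after a disconnect matches the pattern alongside the base file, producing the "more than one matching file" condition:

```python
import re

# Pattern taken verbatim from the error message: the base temp file,
# optionally suffixed with '#<number>'.
pattern = re.compile(r"^CDC00000008.tmp(#[0-9]+)?$")

# Illustrative directory listing: a leftover '#1' copy matches
# in addition to the base file, so two files match one pattern.
listing = ["CDC00000008.tmp", "CDC00000008.tmp#1", "CDC00000009.tmp"]
matches = [name for name in listing if pattern.match(name)]

print(matches)  # ['CDC00000008.tmp', 'CDC00000008.tmp#1']
```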
Environment
Qlik Replicate version 2021.5
File target
A query timeout expired error is displayed.
Example query run that resulted in the error:
select bmf.physical_device_name,
       bs.position,
       [dbo].[attrep_fn_NumericLsnToHexa](bs.first_lsn),
       [dbo].[attrep_fn_NumericLsnToHexa](bs.last_lsn),
       bs.backup_set_id
from msdb.dbo.backupmediafamily bmf,
     msdb.dbo.backupset bs
where bmf.media_set_id = bs.media_set_id
  and bs.backup_set_id > 0
  and bs.database_name = db_name()
  and bs.type = 'L'
  and ( cast('00042ead:00024eb8:0001' collate SQL_Latin1_General_CP1_CI_AS as varchar(24)) >= cast([dbo].[attrep_fn_NumericLsnToHexa](bs.first_lsn) collate SQL_Latin1_General_CP1_CI_AS as varchar(24))
        and cast('00042ead:00024eb8:0001' collate SQL_Latin1_General_CP1_CI_AS as varchar(24)) < cast([dbo].[attrep_fn_NumericLsnToHexa](bs.last_lsn) collate SQL_Latin1_General_CP1_CI_AS as varchar(24)) )
  and bmf.device_type in (2, 102, 0)
This timeout can occur when at least one of the tables msdb.dbo.backupmediafamily and msdb.dbo.backupset has grown large, slowing the query that locates backup files to load. You can purge backup history beyond your retention period using sp_delete_backuphistory, with @oldest_date set to a value that satisfies your retention policy.
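As a sketch of that purge (the 90-day cutoff is an assumption; substitute a value that matches your own retention policy):

```sql
USE msdb;
GO
-- Example: remove backup history older than 90 days.
DECLARE @oldest_date datetime = DATEADD(DAY, -90, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest_date;
```

Running this regularly (for example from a SQL Server Agent job) keeps msdb.dbo.backupmediafamily and msdb.dbo.backupset small enough that the backup-locating query returns quickly.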
We have data that is not being replicated from DB2 LUW DPF (Database Partitioning Feature). Data within a given time frame is replicated for some update jobs but not others.
The root cause is a documented limitation on adding partitions. A viable workaround was found: stop all the tasks, add the partition, then start all the tasks from a timestamp just before they were stopped. A feature request will be submitted to better handle this use case in the software.
Limitations and considerations | Qlik Replicate Help
7449
Issue:
The May 2022 (2022.5.0.291) version needs to be updated because of Java SE vulnerabilities.

Plugin output:
Path : /opt/attunity/replicate/jvm/
Installed version : 11.0.14
Fixed version : Upgrade to version 11.0.16 or greater
CVE-2022-21426, CVE-2022-21434, CVE-2022-21443, CVE-2022-21449, CVE-2022-21476, CVE-2022-21496, CVE-2022-21540, CVE-2022-21541, CVE-2022-21549, CVE-2022-25647, CVE-2022-34169
Qlik Data Integration products use JVM version 11 for Enterprise Manager and Replicate, and JVM version 8 for Compose; this issue does not apply. You can upgrade Java SE to 11.0.17 for the Qlik products/versions you are using.
7345
Our Qlik Replicate instances are non-PCI compliant due to weak SSL ciphers on ports 443, 3389 and 3552. Ports 443 and 3552 are used by Qlik. Port 3389 is RDP.
Qlik cannot offer advice on how to configure Windows to disable ciphers that customer security guidelines forbid; your security staff should be able to apply their policy on any Windows system, whether or not it is managed by Active Directory (see, for example, https://learn.microsoft.com/en-us/windows-server/security/tls/tls-registry-settings). Lucky 13 (https://crashtest-security.com/prevent-ssl-lucky13/) and Sweet 32 (https://crashtest-security.com/prevent-ssl-sweet32/) were reported as non-compliant for port 3552, but they are neither protocol versions nor ciphers that Replicate uses on that port. They are old vulnerabilities (5+ years) that are either mitigated by the version of OpenSSL currently used in Replicate (with its cipher selection) or otherwise impractical or irrelevant to the way Replicate works. In short, there is no security issue here from a Replicate perspective.
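For the Schannel-based services flagged on ports 443 and 3389 (Replicate's OpenSSL listener on 3552 is not affected, as noted above), Sweet32 specifically targets 64-bit block ciphers such as 3DES, which can be disabled system-wide through the registry keys described in the Microsoft link. A sketch of such a change, to be applied only in line with your own security policy:

```
Windows Registry Editor Version 5.00

; Disable the 3DES cipher (mitigates Sweet32) for all Schannel-based
; services on this host, e.g. IIS on 443 and RDP on 3389.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\Triple DES 168]
"Enabled"=dword:00000000
```

A reboot is required for Schannel cipher changes to take effect.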
Task error: JAVA_EXCEPTION, message: 'io.swagger.client.ApiException: Operations per second is over the account limit.'
This is a Databricks issue with the storage account used by this Replicate user. Check with your Databricks admin to confirm the operations-per-second limits on the storage account for the Replicate user.
Issue was resolved after installing the Qlik UDTF
When using the Snowflake ODBC driver as a source, the source endpoint returns columns from multiple tables with the same name in different databases/schemas.
This video will demonstrate how to install and configure Qlik-CLI for SaaS editions of Qlik Sense.
get-command qlik
choco install qlik-cli --source=https://www.nuget.org/api/v2 --version 2.1.0
Where --version 2.1.0 should be replaced with the latest package version. Locate the latest version on Nuget.org, where you can also download the package directly.
# Create an empty PowerShell profile file if one does not exist yet
if ( -not (Test-Path $PROFILE) ) {
echo "" > $PROFILE
}
qlik completion ps > "./qlik_completion.ps1" # Create a file containing the powershell completion.
. ./qlik_completion.ps1 # Source the completion.
Advanced and additional instructions as seen in the video can be found at Qlik-CLI on Qlik.Dev. Begin with Get Started.
Changing QlikView to a new serial number license requires you to use Apply License in the Management Console, rather than Update License from Server.
QlikView using a LEF and Serial Number
To change the license, you must use Apply License.
It is recommended to back up all license-related files on the QlikView server before updating license details. This allows reverting to the previous license if the update causes any problems. QlikView license information is stored in two locations; files from both locations, as specified below, must be copied to a safe location prior to updating or applying a new license.
When clicking Update License from Server, QlikView updates the current license rather than replacing it.
How to License a QlikView Server or Update the License
Loading files larger than 50MB from either Dropbox or SharePoint with their respective connectors leads to:
Data file size exceeded file streaming buffer size. Adjust the StreamingBufferSizeMB setting.
Qlik Sense Enterprise on Windows
Qlik Cloud
The solution requires us to modify the Qlik Sense Engine settings.ini. See How to modify Qlik Sense Engine settings.ini for details on the procedure.
The value cannot be configured in Qlik Cloud. A workaround is to split the large data files, for example by month.
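As an illustration of the monthly-split workaround (the "Date" column and YYYY-MM-DD format are assumptions about your data), a large file's rows can be grouped into per-month buckets before being written out as smaller files:

```python
from collections import defaultdict

def split_rows_by_month(rows, date_field="Date"):
    """Group rows into per-month buckets keyed by 'YYYY-MM'.

    Assumes the date field is formatted as YYYY-MM-DD; in practice the
    rows would come from csv.DictReader over the oversized source file.
    """
    buckets = defaultdict(list)
    for row in rows:
        month_key = row[date_field][:7]  # take the 'YYYY-MM' prefix
        buckets[month_key].append(row)
    return buckets

# Illustrative rows; each bucket can then be written out as its own,
# smaller file, e.g. sales-2023-01.csv, sales-2023-02.csv.
rows = [
    {"Date": "2023-01-15", "Sales": "100"},
    {"Date": "2023-01-20", "Sales": "250"},
    {"Date": "2023-02-03", "Sales": "75"},
]
buckets = split_rows_by_month(rows)
print(sorted(buckets))  # ['2023-01', '2023-02']
```

Each resulting per-month file then stays under the 50 MB streaming buffer limit and can be loaded individually.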
QLIK-83207
CP-4120
QLIK-82097
In this article, we explain how to set up one automation as a source and keep multiple target automations in other tenants synced with the changes made to the original. We will use an intermediary tool, in our case GitHub, to store the changes made to the source automation, so this article doubles as a version-control guide for your future automations.
To get the initial automation data into a GitHub repository, construct a quick automation containing the "Get Automation" block and run it to receive the needed data in the block history. Copy that data and paste it into a file inside your working repository. For example purposes, our working file is called 'automation.json' and sits in the main directory of the demo repository.
Now that our initial set-up is complete, we will list the steps needed to keep our repository up to date with all the changes to our automation. This is how the workflow will look:
You can see that our initial start block is replaced with a Qlik Cloud Services webhook that listens to all automations being updated in our tenant. We have also created a variable to hold the GitHub branch name, which doubles as a version number for cases where it might be needed:
The formula takes the date at which the automation was modified, transforms it into a Unix timestamp, and attaches it to the branch name: version-{date: {$.AutomationUpdated.data.updatedAt}, 'U'}
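Outside of Application Automation, the equivalent transformation looks roughly like this (the sample updatedAt value is illustrative; 'U' in the formula denotes a Unix timestamp):

```python
from datetime import datetime

def branch_name(updated_at: str) -> str:
    """Build a version branch name from an ISO-8601 'updatedAt' value,
    mirroring the automation formula version-{date: ..., 'U'}."""
    # fromisoformat() in older Python versions does not accept 'Z',
    # so normalize it to an explicit UTC offset first.
    dt = datetime.fromisoformat(updated_at.replace("Z", "+00:00"))
    return f"version-{int(dt.timestamp())}"

print(branch_name("2023-03-01T12:00:00Z"))  # version-1677672000
```

Using the modification timestamp makes every branch name unique per change, which is why it works as a lightweight version number.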
We also need to attach a condition block to our flow to make sure only our source automation gets sent to our repository:
In this case, we are hardcoding the id of our automation in the condition input parameter. Next, on the 'yes' branch of the condition, comes the 'Get automation' block that will return the latest version of our source automation, followed by the flow that sends that data to a new branch in our Github repository. First, we create a new branch:
Secondly, we update the automation.json file we had in our main repository with the new data we received from the 'Get automation' block:
As you can see, the GitHub platform only accepts Base64-encoded file contents, so the JSON content received from 'Get automation' is transformed into that format. To create the commit and pull request, one more step remains: finding the branch's TREE SHA information:
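A minimal sketch of the encoding step as it would look outside the automation (the commit message, SHA, and branch values are placeholders): GitHub's contents API expects the `content` field of the update request to be Base64-encoded.

```python
import base64
import json

def github_file_payload(automation: dict, sha: str, branch: str) -> dict:
    """Build the request body for GitHub's 'update file contents' call.

    The API requires the new file content to be Base64-encoded and the
    blob SHA of the file being replaced.
    """
    raw = json.dumps(automation, indent=2).encode("utf-8")
    return {
        "message": "Sync automation definition",  # placeholder commit message
        "content": base64.b64encode(raw).decode("ascii"),
        "sha": sha,        # blob SHA of the existing automation.json
        "branch": branch,  # e.g. the version-<timestamp> branch created earlier
    }

payload = github_file_payload({"name": "demo"}, sha="abc123", branch="version-1")
print(payload["content"])
```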
With this information, we can go ahead and create the commit:
The final step in our flow is to create the pull request :
To do this, we now move to our target tenant and create a separate automation there. The workflow involved is a simple one:
Again, the start block is replaced by the GitHub webhook that listens to all new pull requests in our repository:
We also create an empty object variable that will be later used to save the incoming information from the repository. We need to create a condition block as well with the following two conditions:
This lets the automation flow continue only if the pull request has been closed and its contents have been merged into the main branch of our repository. It tells us that the pull request has been approved and we can sync all changes into our target automations. For that, we query the file contents of the repository and save that information in our object variable after we have transformed it from a BASE64 encoding format to a text-based format:
The formula used when assigning the value to the variable is {object: '{base64decode: {$.getFileContent}}'}, which removes the Base64 encoding and turns the string into a JSON object for easier handling in the next block. Now all that is left is to use the 'Update Automation' block to bring the new changes into our target automation:
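The same decode step outside the automation (the sample payload is illustrative): reverse the Base64 encoding and parse the result back into an object before mapping its fields.

```python
import base64
import json

def decode_automation(file_content_b64: str) -> dict:
    """Base64 -> JSON text -> dict, mirroring the automation formula
    {object: '{base64decode: {$.getFileContent}}'}."""
    text = base64.b64decode(file_content_b64).decode("utf-8")
    return json.loads(text)

# Illustrative file content as it would arrive from the repository.
encoded = base64.b64encode(b'{"name": "demo", "description": "synced"}').decode()
automation = decode_automation(encoded)
print(automation["name"])  # demo
```

Once decoded, individual keys of the object (such as name and description) can be passed to the update block's input parameters.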
As you can see, we hardcoded the GUID of our target automation, but the lookup functionality can also be used to select one of the automations in the tenant. Finally, we need to send the correct parameters from our object to the input parameters of the block. This can easily be done by selecting the Name, Description, Workspace, and Schedule parts of the object.
This should let you keep your automations synced over multiple tenants. Your target automation history should not be impacted by the syncs in any way.
Please be aware that if the target automation is open during the sync process, the changes will not be reflected in real time; reopen the automation to see them.
Attached to this article you can also find the JSON files that contain the automation flows presented.
Executing or modifying tasks (changing the owner, renaming an app) in the Qlik Sense Management Console and refreshing the page does not update the task status correctly. The issue affects the Content Admin and Deployment Admin roles.
The behavior began after an upgrade of Qlik Sense Enterprise on Windows.
Should the issue persist after applying the workaround/fix, contact Qlik Support.
This issue can be mitigated beginning with the August 2021 release.
Workaround for earlier versions:
Upgrade to the latest Service Release and disable the caching functionality:
To do so:
NOTE: Make sure to use lowercase when setting values to true or false, as the capabilities.json file is case sensitive.
QB-2096
QB-5168
QB-7655
This is part 3 of 4 in a series of articles about migrating Hive HDS projects from Compose for Data Lakes 6.6 (C4DL 6.6) to Qlik Compose 2021.8 (Gen2).
Different Migration Modules for customers
Module 3: Migrate Hive HDS projects from C4DL 6.6 to Gen2
There are two paths for Hive HDS project, you can choose one of the paths to finish migration.
Path 1: Clean up the storage database and the Replicate landing database (including the underlying files for attrep_cdc_partition). Reload the Replicate task and start the Compose tasks as if it were a new project. This was covered in the first demo.
Path 2: Follow the migration path; all the required steps are explained in this document.
Here are the migration steps (Path 2) if you do not want to reload the Replicate task. You will completely migrate your project definition and data to Gen2.
ComposeCli.exe adjust_cfdl_project --project Suresh_Hive_HDS --infile "C:\Program Files\Qlik\C4DL66\Suresh_Hive_HDS_deployment_<datetime>.zip"
- C:\Program Files\Qlik\Compose\data\projects\Suresh_Hive_HDS\deployment_package\Suresh_Hive_HDS__<datetime>__QlikComposeDLMigration.zip
ComposeCli.exe create_cfdl_data_migration_script --project Suresh_Hive_HDS --infile "C:\Program Files\Qlik\C4DL66\Suresh_Hive_HDS_deployment_<datetime>.zip"
NOTE:
After migration you can check\run
also set
You can watch demo video How to Migrate Hive HDS projects.
When previewing an NPrinting report containing a straight table or pivot table, the following error is generated:
Qlik Replicate goes to retrieve the archived REDO log with sequence 2056016, thread 1, but fails with a "REDO log with a sequence not found" error. The archived redo log has the primary Oracle DB as DEST_ID 1 and the standby as DEST_ID 32, pointing to the correct location.
Cannot use DEST_ID greater than 31 for Primary or Standby Oracle Redo Log locations
Qlik Replicate only supports DEST_ID values 0 through 31.
https://docs.oracle.com/en/database/oracle/oracle-database/19/refrn/V-ARCHIVE_DEST.html
JIRA RECOB-6332 case 00058856