Search our knowledge base, curated by global Support, for answers ranging from account questions to troubleshooting error messages.
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
---|---
Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) reach out to your Account Manager or Customer Success Manager
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog covers Qlik products and solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
Log in to Manage Cases to manage and track your active cases.
Please note: the easiest way to create a new case is via our chat (see above). The chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible but has functional limitations that are not critical to daily operations.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that does not qualify as Severity 1 or Severity 2. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Scenario:
Possible error messages:
Environment:
Steps for resolution:
1. Rebuild the performance counters (lodctr /R)
Tasks do not reload after upgrade to Qlik Sense June 2018
2. Disable the loopback check
Open the Registry Editor and navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0, then create the following DWORD value:
DisableLoopbackCheck
Value: 1
3. Reboot the server
A security issue in Qlik Sense Enterprise for Windows has been identified, and patches have been made available. If successfully exploited, this vulnerability could lead to a compromise of the server running the Qlik Sense software, including remote code execution (RCE).
This issue was responsibly disclosed to Qlik and no reports of it being maliciously exploited have been received.
All versions of Qlik Sense Enterprise for Windows prior to and including these releases are impacted:
Using the CVSS V3.1 scoring system (https://nvd.nist.gov/vuln-metrics/cvss), Qlik rates this severity as high.
CVE-2024-xxxx (QB-26216) Privilege escalation for authenticated/anonymous user
Severity: CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:H/I:H/A:H 8.8 (High)
Due to improper input validation, a remote attacker with existing privileges can elevate them to the internal system role, which in turn allows them to execute commands on the server.
Customers should upgrade Qlik Sense Enterprise for Windows to a version containing fixes for these issues. Fixes are available for the following versions:
All Qlik software can be downloaded from our official Qlik Download page (customer login required).
This issue was identified and responsibly reported to Qlik by Daniel Zajork.
The max concurrent reloads can be configured in the Qlik Sense Management Console.
<ServerName>_System_Scheduler.txt
Domain\qvservice Engine connection released. 5 of 4 used
Domain\qvservice Engine connection 6 of 4 established
Domain\qvservice Request for engine-connection dequeued. Total in queue: 25
Use the "Max concurrent reloads" setting to limit the maximum number of tasks that can run at the same time on the current node. By default, it is set to 4, meaning only four tasks can run simultaneously on the node.
When the 5th task comes in:
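As a toy sketch of this queueing behavior (illustrative Python only, not Qlik's actual scheduler code), tasks beyond the configured limit wait in a queue until a running slot frees up:

```python
from collections import deque

def dispatch(tasks, max_concurrent=4):
    """Simulate a node with "Max concurrent reloads" = 4:
    the first four tasks run, later arrivals are queued."""
    running, queued = [], deque()
    for task in tasks:
        if len(running) < max_concurrent:
            running.append(task)
        else:
            queued.append(task)
    return running, list(queued)

running, queued = dispatch([f"reload-{i}" for i in range(1, 7)])
print(running)  # the first four tasks run immediately
print(queued)   # the 5th and 6th wait in the queue
```

This mirrors the scheduler log lines above, where a fifth connection request is dequeued only after one of the four engine connections is released.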
On a multi-node deployment, tasks will be balanced from the manager node to any node(s) designated as workers.
It is highly advised to check whether the central node is configured as Manager or as Manager and Worker. When set to Manager, it sends all reload jobs to the reload/scheduler nodes, as it should. However, if the central node is set to Manager and Worker, the central node will also perform reloads. This is not recommended.
The workflow looks as follows:
The improvement to track the Max concurrent reloads can, if desired, be disabled. This reverts Sense to an older load balancing method that relies only on CPU usage.
To disable the setting:
In our example, we allow one concurrent reload, but we assume that two reloads are executed at the same time.
Reload fails in the QMC even though the script part is successful in Qlik Sense Enterprise on Windows November 2023 and above.
When using NetApp-based storage, you might see an error when publishing and replacing or reloading a published app.
In the QMC, you will see that the script load itself finished successfully, but the task failed afterward.
ERROR QlikServer1 System.Engine.Engine 228 43384f67-ce24-47b1-8d12-810fca589657
Domain\serviceuser QF: CopyRename exception:
Rename from \\fileserver\share\Apps\e8d5b2d8-cf7d-4406-903e-a249528b160c.new
to \\fileserver\share\Apps\ae763791-8131-4118-b8df-35650f29e6f6
failed: RenameFile failed in CopyRename
ExtendedException: Type '9010' thrown in file
'C:\Jws\engine-common-ws\src\ServerPlugin\Plugins\PluginApiSupport\PluginHelpers.cpp'
in function 'ServerPlugin::PluginHelpers::ConvertAndThrow'
on line '149'. Message: 'Unknown error' and additional debug info:
'Could not replace collection
\\fileserver\share\Apps\8fa5536b-f45f-4262-842a-884936cf119c] with
[\\fileserver\share\Apps\Transactions\Qlikserver1\829A26D1-49D2-413B-AFB1-739261AA1A5E],
(genericException)'
<<< {"jsonrpc":"2.0","id":1578431,"error":{"code":9010,"parameter":
"Object move failed.","message":"Unknown error"}}
ERROR Qlikserver1 06c3ab76-226a-4e25-990f-6655a965c8f3
20240218T040613.891-0500 12.1581.19.0
Command=Doc::DoSave;Result=9010;ResultText=Error: Unknown error
0 0 298317 INTERNAL sa_scheduler b3712cae-ff20-4443-b15b-c3e4d33ec7b4
9c1f1450-3341-4deb-bc9b-92bf9b6861cf Taskname Engine Not available
Doc::DoSave Doc::DoSave 9010 Object move failed.
06c3ab76-226a-4e25-990f-6655a965c8f3
Potential workarounds
The most plausible cause currently is that the specific engine version has issues releasing File Lock operations. We are actively investigating the root cause, but there is no fix available yet.
An update will be provided as soon as there is more information to share.
QB-25096
QB-26125
When installing Qlik Replicate v2023.5 on RHEL/CentOS v8.x, you may encounter a warning message from rpm as follows:
warning: areplicate-2023.5.0-213.x86_64.rpm: Header V4 RSA/SHA256 Signature, key ID 05d7eace: NOKEY
This warning message indicates that rpm is unable to verify the package signature due to the absence of the public key.
Please perform the following commands before trying to install the RPM package:
$ rpm -q gpg-pubkey --qf '%{version}-%{release} %{summary}\n' | sed '/qlik.com/!d;s/ .*$//' | xargs -n 1 -I {} sudo rpm -e gpg-pubkey-{}
$ curl https://qlikcloud.com/.well-known/qlik-codesign-public-keys.asc > qlik-codesign-public-keys.asc
$ sudo rpm --import qlik-codesign-public-keys.asc
$ rpm --checksig <qlik-rpm-package>
Where the commands do the following: remove any previously imported Qlik public keys from the RPM database, download the current Qlik code-signing public keys, import them into RPM, and verify the package signature.
Here is an example of an RPM package that passed the authenticity check:
$ rpm --checksig areplicate-2023.5.0-152.x86_64.rpm
areplicate-2023.5.0-152.x86_64.rpm: digests SIGNATURES OK
An example of failed check is:
$ rpm --checksig areplicate-2023.5.0-152.x86_64.rpm
areplicate-2023.5.0-152.x86_64.rpm: digests SIGNATURES NOT OK
Then proceed with the installation or upgrade as per the instructions in the user guide.
Qlik Replicate v2023.5, v2023.11 or after on Linux 8.x (64-bit)
An NPrinting report hangs on preview or in a publish task, preventing the report from generating as expected. This problem is frequently seen with 'Carded' Qlik Sense themes.
It is also seen when the extended sheet feature is used.
"error=task SENSE_JS_PAINT_COVER_POINTS_TIMEOUT"
A fix for 'carded' custom themes is in progress; however, it is still under investigation.
Information provided on this defect is given as-is at the time of documenting. For up-to-date information, please review the most recent Release Notes, or contact Support referencing defect QB-24997.
Possible workarounds for this are:
NOTE: Extended or custom sheet sizes are not fully supported. See the Qlik Help page below for details:
Qlik Sense custom and extended sheets
A Job fails to run on a remote JobServer with the following error message:
Unrecognized option: --add-opens=java.base/java.lang=ALL-UNNAMED
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
The issue occurs when Talend Remote Engine or Talend JobServer uses Java 8 while Talend Studio has JDK 17 enabled. The --add-opens option is only supported in Java 11 and above; Java 8 does not support it.
If the 'Enable Java 17 compatibility' option is enabled, the Jobs built by Talend Studio will have the --add-opens parameters in the job.sh or job.bat script files, such as:
JAVA_OPTS=-Dfile.encoding=UTF-8 -Dsun.stdout.encoding=UTF-8 -Dsun.stderr.encoding=UTF-8 -Dconsole.encoding=UTF-8 --add-opens=java.base/java.lang=ALL-UNNAMED
The Jobs built this way cannot be executed with Java 8.
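As a plain illustration of why these Jobs fail on Java 8 (a hypothetical sketch, not a Talend tool; rebuilding the Job regenerates the scripts, so stripping flags by hand is only a stopgap), the incompatibility comes down to --add-opens tokens in JAVA_OPTS that an older JVM does not recognize:

```python
def strip_add_opens(java_opts: str, java_major: int) -> str:
    """Remove --add-opens flags (not recognized by Java 8)
    when the target JVM is older than Java 11."""
    if java_major >= 11:
        return java_opts  # modern JVMs accept the flag as-is
    kept = [opt for opt in java_opts.split()
            if not opt.startswith("--add-opens")]
    return " ".join(kept)

opts = ("-Dfile.encoding=UTF-8 -Dsun.stdout.encoding=UTF-8 "
        "--add-opens=java.base/java.lang=ALL-UNNAMED")
print(strip_add_opens(opts, 8))   # flag removed for Java 8
print(strip_add_opens(opts, 17))  # flag kept for JDK 17
```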
Option 1:
Option 2:
From R2024-05, Java 17 will become the only supported version to start most Talend modules. If you want to keep the "Enable Java 17 compatibility" option enabled, make sure that the Talend Remote Engine or Talend JobServer also uses JDK 17 to execute Jobs.
Setting-compiler-compliance-level
A Qlik app may show an Incomplete Visualization or Invalid Visualization error. This article covers the most common root causes.
Incomplete Visualizations:
Invalid Visualizations:
The information in this article is provided as-is and will be used at your discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
This article describes how to resolve the OAuth Error Status 400 that prevents successful configuration and download of the MS Office add-in manifest required for Tabular Reporting.
{"errors":[{"title":"Invalid redirect_uri","detail":"redirect_uri is not registered","code":"OAUTH-1","status":"400"}],"traceId":"0000000000000000xxxxxxxxxxxxxxxx"}
An OAuth client configuration is required to install the Qlik add-in for Microsoft Excel. The add-in is used by report developers to prepare report templates which control output of tabular reports from the Qlik Sense app.
Review or re-do the steps documented in Creating an OAuth client for the Qlik add-in for Microsoft Excel.
Verify you have included the user_default scope.
Example of a working OAUTH configuration:
The OAuth configuration was not set up correctly, for example missing the user_default scope or the redirect link.
Preparing and obtaining the add-in manifest
Qlik Tabular Reporting General Troubleshooting and Best Practices
Tabular Reporting is available in Qlik Cloud Enterprise and Premium editions. See the Product Description for Qlik Cloud® Subscriptions for details. *Not included in Qlik Cloud Standard and Government editions.
When replicating a CDS View, updates are processed as inserts rather than updates. In SAP, CDS views handle deltas with UPSERT logic, so SAP captures both INSERT and UPDATE as a single operation, which is treated as an INSERT.
If you have a SAP login you can look up SAP Note 3300238 for more information as shown below:
SAP Note 3300238 - ABAP CDS CDC: ODQ_CHANGEMODE not showing proper status for creation
Component: BW-WHM-DBA-ODA (SAP Business Warehouse > Data Warehouse Management > Data Basis > Operational Data Provider for ABAP CDS, HANA & BW), Version: 4, Released On: 19.01.2024
This is working as expected. It is the designed behavior of the CDC logic. For both insert and update, ODQ_CHANGEMODE = U and ODQ_ENTITYCNTR = 1.
The CDC delta logic is designed as UPSERT logic. This means a DB INSERT (create) and a DB UPDATE both get ODQ_CHANGEMODE = U and ODQ_ENTITYCNTR = 1. It is not possible in the CDC delta to distinguish between Create and Update.
Qlik Replicate
SAP S/4HANA
SAP BW/4HANA
The problem occurs when running a reload from Data Gateway to a Cloudera DB (Cloudera Hive).
The reload completes correctly, but the fields are imported with aliases. This means the reload imports the fields as 'TableName.FieldName' instead of just 'FieldName'.
For example, with this script:
LOAD meter_id,
Field1,
Field2,
Field3;
[meter_attribute]:
SELECT "meter_id",
"Field1",
"Field2",
"Field3"
FROM internaldb."meter_attribute";
Fields are imported as: meter_attribute.Field1, meter_attribute.Field2, meter_attribute.Field3.
The problem can be fixed by adding the row EnableUniqueColumnName=0 to the connection string.
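For illustration only (the supported fix is the connection-string setting above), the aliasing amounts to prefixing each field with its table name; stripping such a prefix in a post-processing step would look like this hypothetical sketch:

```python
def strip_table_prefix(field: str, table: str) -> str:
    """Turn 'meter_attribute.Field1' back into 'Field1';
    leave fields without the prefix untouched."""
    prefix = table + "."
    return field[len(prefix):] if field.startswith(prefix) else field

fields = ["meter_attribute.Field1", "meter_attribute.Field2", "meter_id"]
print([strip_table_prefix(f, "meter_attribute") for f in fields])
# ['Field1', 'Field2', 'meter_id']
```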
{
  "SyntaxId": "ClouderaHive",
  "DisplayName": "Cloudera Hive syntax",
  "DelimiterStart": "`",
  "DelimiterEnd": "`",
  "DataPreviewSelectTemplate": "SELECT ${COLUMN_LIST} FROM ${TABLE_NAME} ${FILTER} LIMIT ${LIMIT_VALUE}",
  "DatabaseTerm": "catalog",
  "OwnerTerm": "schema"
}
QB-26342
This article explains how the Amazon SNS connector in Qlik Application Automations can be used to set up webhooks that trigger when an object creation event occurs in Amazon S3. This connector only provides webhooks.
Content:
Search for the "Amazon SNS" connector in Qlik Application Automations. When you click connect, you will be prompted for the following input parameters:
You must obtain the AWS Access Key from IAM in your AWS console: go to the IAM section and choose Users in the left side panel.
Here you can either choose an existing user or create a new one by clicking the "Add Users" button in the top right.
When you create a new user, you must provide a user name and click Next. You do not need to give this user access to the AWS console. In the next step, you will grant permissions to this IAM user.
The following policy needs to be created and attached to the IAM user; replace account-id with your account ID:
Other permissions that are suggested to add are:
Furthermore, the IAM user must be made an owner of the S3 bucket when creating a notification configuration.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:PutBucketNotification",
"iam:PassRole",
"sns:Publish",
"sns:CreateTopic",
"sns:Subscribe"
],
"Resource": [
"arn:aws:s3:::*",
"arn:aws:iam::account-id:role/*",
"arn:aws:sns:*:account-id:*"
]
},
{
"Sid": "VisualEditor1",
"Effect": "Allow",
"Action": "sns:Unsubscribe",
"Resource": "*"
}
]
}
You will have to create an access key for the IAM user.
This can be done in the (a) Users menu and in the (b) Security credentials tab. Click (c) Create access key.
Choose Third-party service, confirm that you understand the recommendation shown, and click Next:
You will now have your access key and secret key and can finish creating the datasource in Qlik Application Automation:
You can use this in an automation, but only as a webhook. When you create a new automation, you will be presented with a blank canvas. Select the Start block and change the run mode to webhook.
Choose an event type next. These are currently limited to S3 object creation events. Lookup capabilities are available for other parameters, such as bucket and topic selection:
After saving the automation, you can test the webhook by uploading objects to your S3 bucket and confirming in the automation's run history that the automation is triggered.
You can now trigger tasks after an object is uploaded to S3. Common tasks include reloading a Qlik Sense app or triggering a data pipeline in any of our other connectors:
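When the webhook fires, the SNS notification's Message field carries the S3 event as a JSON string. A hedged sketch of pulling the bucket and object key out of such a payload (the sample below is a trimmed illustration of the standard S3 event shape, not a full notification):

```python
import json

def extract_s3_objects(sns_message: str):
    """Return (bucket, key) pairs from the S3 event JSON carried
    in an SNS notification's Message field."""
    event = json.loads(sns_message)
    return [(r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
            for r in event.get("Records", [])]

sample = json.dumps({"Records": [{
    "eventName": "ObjectCreated:Put",
    "s3": {"bucket": {"name": "my-bucket"},
           "object": {"key": "uploads/data.csv"}}}]})
print(extract_s3_objects(sample))  # [('my-bucket', 'uploads/data.csv')]
```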
If a tenant previously had an incomplete SMTP configuration with their Qlik Cloud, an error message will now be shown to the Tenant Administrator:
To resolve this error a Tenant Admin can enter valid SMTP credentials.
At the moment, it is not possible to delete or clear the previous credential entry. An option to clear the credentials is being prepared to support a return to the default (non-configured/empty) state and is expected to be available in the coming weeks.
We will update this article when the ability to clear becomes available.
With the release of the SMTP service connectivity for Microsoft O365 from the Management Console, more stringent error checking was added to the basic authentication configuration.
If interested, Admins can still successfully connect to Microsoft O365 SMTP while this error is showing. More details on the new OAuth2 authentication can be found here: Qlik Cloud: Introducing OAuth2 authentication for ... - Qlik Community - 2444243
Many GeoAnalytics operations can be executed with the input data table split into chunks. This article explains which operations support splitting and how to modify the code that the connector produces. Operations that aggregate data generally cannot be split. When loaded tables are used for input, the inline tables created in loops offer a quick way to split; it is of course also possible to write custom code to do the splitting instead.
Making calls with large input tables often causes timeouts on the server side; splitting is a way around that.
Splittable ops | Non-Splittable ops | Special ops, splittable |
---|---|---|
|
|
|
Binning and SpatialIndex differ from other operations: they do not place any call to the server if the input data are internal geometries, i.e. lat/long points. These operations also produce the same type of results, so the resulting tables can be concatenated.
The code as the connector produces it:
/* Generated by Idevio GeoAnalytics for operation Within ---------------------- */
Let [EnclosedInlineTable] = 'POSTCODE' & Chr(9) & 'Postal.Latitude' & Chr(9) & 'Postal.Longitude';
Let numRows = NoOfRows('PostalData');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'POSTCODE', 'Postal.Latitude', 'Postal.Longitude'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'PostalData'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[EnclosedInlineTable] = [EnclosedInlineTable] & chunkText;
Next
chunkText=''
Let [EnclosingInlineTable] = 'ClubCode' & Chr(9) & 'Car5mins_TravelArea';
Let numRows = NoOfRows('TravelAreas5');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'ClubCode', 'Car5mins_TravelArea'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'TravelAreas5'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[EnclosingInlineTable] = [EnclosingInlineTable] & chunkText;
Next
chunkText=''
[WithinAssociations]:
SQL SELECT [POSTCODE], [ClubCode] FROM Within(enclosed='Enclosed', enclosing='Enclosing')
DATASOURCE Enclosed INLINE tableName='PostalData', tableFields='POSTCODE,Postal.Latitude,Postal.Longitude', geometryType='POINTLATLON', loadDistinct='NO', suffix='', crs='Auto' {$(EnclosedInlineTable)}
DATASOURCE Enclosing INLINE tableName='TravelAreas5', tableFields='ClubCode,Car5mins_TravelArea', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(EnclosingInlineTable)}
SELECT [POSTCODE], [Enclosed_Geometry] FROM Enclosed
SELECT [ClubCode], [Car5mins_TravelArea] FROM Enclosing;
[EnclosedInlineTable] = '';
[EnclosingInlineTable] = '';
/* End Idevio GeoAnalytics operation Within ----------------------------------- */
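The nested Replace() chain in the generated script escapes characters that would break the inline-table syntax. Expressed in Python for illustration, the mapping is:

```python
# Same character mapping as the generated script's Replace() chain:
# ' " [ / * ; are rewritten as \uXXXX escape sequences.
ESCAPES = {"'": r"\u0027", '"': r"\u0022", "[": r"\u005b",
           "/": r"\u002f", "*": r"\u002a", ";": r"\u003b"}

def escape_cell(value: str) -> str:
    for ch, repl in ESCAPES.items():
        value = value.replace(ch, repl)
    return value

print(escape_cell("O'Hara/West"))  # O\u0027Hara\u002fWest
```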
The header and the call are moved inside the loop; chunkSize decides how big each split is.
Note that the Enclosed inline table is now built after the Enclosing one, so that the call happens inside each iteration.
/* Generated by Idevio GeoAnalytics for operation Within ---------------------- */
Let [EnclosingInlineTable] = 'ClubCode' & Chr(9) & 'Car5mins_TravelArea';
Let numRows = NoOfRows('TravelAreas5');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'ClubCode', 'Car5mins_TravelArea'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'TravelAreas5'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[EnclosingInlineTable] = [EnclosingInlineTable] & chunkText;
Next
chunkText=''
Let numRows = NoOfRows('PostalData');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let [EnclosedInlineTable] = 'POSTCODE' & Chr(9) & 'Postal.Latitude' & Chr(9) & 'Postal.Longitude';
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'POSTCODE', 'Postal.Latitude', 'Postal.Longitude'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'PostalData'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[EnclosedInlineTable] = [EnclosedInlineTable] & chunkText;
[WithinAssociations]:
SQL SELECT [POSTCODE], [ClubCode] FROM Within(enclosed='Enclosed', enclosing='Enclosing')
DATASOURCE Enclosed INLINE tableName='PostalData', tableFields='POSTCODE,Postal.Latitude,Postal.Longitude', geometryType='POINTLATLON', loadDistinct='NO', suffix='', crs='Auto' {$(EnclosedInlineTable)}
DATASOURCE Enclosing INLINE tableName='TravelAreas5', tableFields='ClubCode,Car5mins_TravelArea', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(EnclosingInlineTable)}
SELECT [POSTCODE], [Enclosed_Geometry] FROM Enclosed
SELECT [ClubCode], [Car5mins_TravelArea] FROM Enclosing;
[EnclosedInlineTable] = '';
[EnclosingInlineTable] = '';
Next
chunkText=''
/* End Idevio GeoAnalytics operation Within ----------------------------------- */
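The row chunking performed by the loops above can be sketched in Python (using a chunkSize of 1000, as in the generated script; illustrative only):

```python
def chunk_ranges(num_rows: int, chunk_size: int = 1000):
    """Partition rows 0..num_rows-1 into consecutive chunks of at most
    chunk_size rows, mirroring the script's outer For n = 0 to chunks loop
    (whose final iteration may contribute no rows)."""
    ranges = []
    n = 0
    while n * chunk_size < num_rows:
        start = n * chunk_size
        ranges.append((start, min(start + chunk_size, num_rows)))
        n += 1
    return ranges

print(chunk_ranges(2500))  # [(0, 1000), (1000, 2000), (2000, 2500)]
```

Each (start, end) pair corresponds to one iteration of the outer loop, i.e. one inline table sent in one server call.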
The code as the connector produces it:
/* Generated by GeoAnalytics for operation AddressPointLookup ---------------------- */
Let [DatasetInlineTable] = 'id' & Chr(9) & 'STREET_NAME' & Chr(9) & 'STREET_NUMBER';
Let numRows = NoOfRows('data');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'id', 'STREET_NAME', 'STREET_NUMBER'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'data'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[DatasetInlineTable] = [DatasetInlineTable] & chunkText;
Next
chunkText=''
[AddressPointLookupResult]:
SQL SELECT [id], [Dataset_Address], [Dataset_Geometry], [CountryIso2], [Dataset_Adm1Code], [Dataset_City], [Dataset_PostalCode], [Dataset_Street], [Dataset_HouseNumber], [Dataset_Match]
FROM AddressPointLookup(searchTextField='', country='"Canada"', stateField='', cityField='"Toronto"', postalCodeField='', streetField='STREET_NAME', houseNumberField='STREET_NUMBER', matchThreshold='0.5', service='default', dataset='Dataset')
DATASOURCE Dataset INLINE tableName='data', tableFields='id,STREET_NAME,STREET_NUMBER', geometryType='NONE', loadDistinct='NO', suffix='', crs='Auto' {$(DatasetInlineTable)}
;
[DatasetInlineTable] = '';
/* End GeoAnalytics operation AddressPointLookup ----------------------------------- */
The header and the call are moved inside the loop; chunkSize decides how big each split is.
/* Generated by GeoAnalytics for operation AddressPointLookup ---------------------- */
Let numRows = NoOfRows('data');
Let chunkSize = 1000;
Let chunks = numRows/chunkSize;
For n = 0 to chunks
Let [DatasetInlineTable] = 'id' & Chr(9) & 'STREET_NAME' & Chr(9) & 'STREET_NUMBER';
Let chunkText = '';
Let chunk = n*chunkSize;
For i = 0 To chunkSize-1
Let row = '';
Let rowNr = chunk+i;
Exit for when rowNr >= numRows;
For Each f In 'id', 'STREET_NAME', 'STREET_NUMBER'
row = row & Chr(9) & Replace(Replace(Replace(Replace(Replace(Replace(Peek('$(f)', $(rowNr), 'data'), Chr(39), '\u0027'), Chr(34), '\u0022'), Chr(91), '\u005b'), Chr(47), '\u002f'), Chr(42), '\u002a'), Chr(59), '\u003b');
Next
chunkText = chunkText & Chr(10) & Mid('$(row)', 2);
Next
[DatasetInlineTable] = [DatasetInlineTable] & chunkText;
[AddressPointLookupResult]:
SQL SELECT [id], [Dataset_Address], [Dataset_Geometry], [CountryIso2], [Dataset_Adm1Code], [Dataset_City], [Dataset_PostalCode], [Dataset_Street], [Dataset_HouseNumber], [Dataset_Match]
FROM AddressPointLookup(searchTextField='', country='"Canada"', stateField='', cityField='"Toronto"', postalCodeField='', streetField='STREET_NAME', houseNumberField='STREET_NUMBER', matchThreshold='0.5', service='default', dataset='Dataset')
DATASOURCE Dataset INLINE tableName='data', tableFields='id,STREET_NAME,STREET_NUMBER', geometryType='NONE', loadDistinct='NO', suffix='', crs='Auto' {$(DatasetInlineTable)}
;
[DatasetInlineTable] = '';
Next
chunkText=''
/* End GeoAnalytics operation AddressPointLookup ----------------------------------- */
In the add-in window, click the 'Home' tab in the toolbar
Next to Source app, click ...
In the Set the new Qlik Sense app ID field, enter the new app ID
(to find the new app ID, open the app in the 'alternate' or 'new' tenant hub and copy the app ID from the address bar)
To change the tenant, enable the Change the Qlik Cloud tenant switch (disabled by default) and insert the new tenant address.
The information in this article is provided as-is and will be used at your discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
The Qlik Sense Hub connection times out sooner than configured in the Virtual Proxy.
When working with an app in the hub, the connection to the app times out faster than configured in virtual proxy (for example, after 5 minutes), and the error message below is displayed:
Connection lost. Make sure that Qlik Sense is running properly. If your session has timed out due to inactivity, refresh to continue working.
This issue occurs when accessing Qlik Sense through a reverse proxy or load balancer, but does not occur when accessing Qlik Sense on the same server (https://localhost/hub).
The user is also active on the sheet, i.e. making selections and otherwise interacting with sheet objects.
Here is an example log line that may be seen in the AuditActivity_Proxy logs:
76 12.20.4.0 20200115T101930.708-0500 QLIKSERVER 8cd6aa7c-fab5-4e82-a2cd-0681943e7d58 Command=Close connection;Result=0;ResultText=Success d62cfb0a-3355-44a5-9e34-5eaada84be67 1ba723cc-e754-4e24-a42b-0732b24ad924 0 DOMAINNAME user.name 5c5e98cf-3f1f-45ed-96f0-8e76d7c44473 Not available Proxy AppAccess /app/5c5e98cf-3f1f-45ed-96f0-8e76d7c44473?reloaduri=https%3a%2f%2domain.com%2fsense%2fapp%2f5c5e98cf-3f1f-45ed-96f0-8e76d7c44473%2foverview Close connection 0 Backend web socket connection Closed for session 'd62cfb0a-3355-44a5-9e34-5eaada84be67'.
Qlik Sense should not terminate a session earlier than configured in the Virtual Proxy (default is 30 minutes), and when it does the error message would be different:
Your session has timed out. Log back in to Qlik Sense to continue.
In this case, if the connection is being lost and it is occurring sooner than configured in the virtual proxy then this indicates that either:
Qlik Sense requires WebSockets. To verify WebSocket traffic, see Qlik Sense Websocket Connectivity Tester.
Check whether this issue occurs when using Internet Explorer. By default, Internet Explorer adds additional traffic to the WebSocket, which can keep the connection alive in these situations. If you do not receive early timeouts when using IE, try Enabling TCP Keep Alive Functionality In Qlik Sense.
Increasing the HTTP keep-alive timeout can help in environments exhibiting the behavior mentioned above. See the Keep-alive timeout setting under the proxy configuration: Editing Proxies - Advanced > Keep-alive timeout (seconds).
If the above does not work, then you will need to make changes to your network infrastructure. Sniffing the network traffic to confirm things further can be performed. As sniffer, Wireshark can be installed on both Qlik Sense Proxy server and Hub client PC to run captures on both ends simultaneously. This may assist confirming the following:
Qlik Support may assist on a best-effort basis with the above. Please check out the video below for additional guidance. If further assistance is required to correct the environmental behavior, the task unfortunately falls outside the scope of Qlik Support, as it is often specific to the network involved. If further assistance is needed, please reach out to your Account Owner to arrange Consulting Services (or see How to Contact the Consulting Team?).
Network appliances (e.g. Load balancers, reverse proxies, VPN solutions, firewalls, web accelerators, etc) may not see the websocket connection as an "alive" TCP connection, depending on the vendor, and therefore terminate the connection on behalf of the end-user.
The "Backend web socket connection Closed for session" string seen in the logs is registered when a WebSocket TCP-layer connection is closed by a TCP packet with the FIN flag set. This happens when the browser window or tab is closed, or when a network device in the environment drops the TCP connection in a controlled fashion (sends a TCP FIN packet to the Sense server). The FIN can also be sent by the client's computer if any settings prevent inactive TCP connections from staying open, as required by the WebSocket protocol.
Windows Firewall blocking port 4747: disable the firewall for testing, or create rules allowing the port both inbound and outbound.
When working with Data Gateway - Direct Access, ODBC drivers can exhibit unexpected behaviors that can lead to crashes.
Note: For more information about how to enable dump file creation and Isolation capability for Data Gateway - Direct Access, please review the links included in the "Related Content" at the end of this article.
One known cause of crashes is a malformed SQL query statement.
After configuring dump file generation and Process Isolation, dump files will be created in the folder configured during setup.
Once a crash occurs, the dump file name will include the process ID associated with the reload:
dotnet.exe.XXXXX.dmp
where XXXXX is the process ID.
Using that process ID and the date and time when the file was created, search the ODBC connector log for the matching ProcessId to find the reload ID associated with the dump file.
Once the reload ID has been identified, retrieve the reload script associated with it from the QMC to verify the query that was triggered.
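The correlation steps above can be sketched in a few lines. This is a hypothetical helper, not a Qlik tool: the dump file naming follows the pattern described above, but the connector log line layout shown here is an assumption for illustration only.

```python
import re

# Pull the process ID out of a dump file name such as "dotnet.exe.12345.dmp".
def pid_from_dump(filename: str):
    match = re.match(r"dotnet\.exe\.(\d+)\.dmp$", filename)
    return match.group(1) if match else None

pid = pid_from_dump("dotnet.exe.12345.dmp")

# Illustrative connector log lines - the real format may differ by version.
sample_log = [
    "2024-01-15 10:19:30 ProcessId=12345 ReloadId=9a1b2c3d ...",
    "2024-01-15 10:20:01 ProcessId=67890 ReloadId=4e5f6a7b ...",
]
for line in sample_log:
    if pid and f"ProcessId={pid}" in line:
        print(line)  # the entry holding the reload ID to look up in the QMC
```

In practice you would also restrict the search to log lines near the dump file's creation timestamp, as described above.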
If the query also fails when executed from a third-party tool (outside of Qlik products), the problem is a malformed SELECT statement.
If the issue is not related to a malformed SELECT query, collect the dump file and contact Qlik Support for further analysis.
Malformed SQL query statement.
Mitigating connector crashes during reload
How to collect dump files for the Qlik Data Gateway - Direct Access ODBC connector crashes.
Qlik Sense has three settings that may influence the perceived connection and session timeout period: "Session Inactivity Timeout", "Keep-Alive Timeout", and "TCP Websocket keep-alive".
Note: Adjusting the below settings can help when working with slow internet connectivity or wanting to extend the session inactivity. However, session disconnect issues can be caused by other network connectivity issues and by system resource shortage as well and may require additional troubleshooting. See Hub access times out with: Error Connection lost. Make sure that Qlik Sense is running properly
This is the maximum timeout for a single HTTP request. The default value is 10 seconds. During the defined keep alive timeout value, the connection between end user and Qlik Sense will remain open.
It serves as protection against denial-of-service attacks: if an ongoing request exceeds this period, the Qlik Sense proxy closes the connection.
Increase this value if your users work over slow connections and experience closed connections for which no other workaround has been found. Make sure to take the mentioned DoS consideration above into account.
This is the browser authentication session timeout (30 minutes by default, set under Virtual Proxy in the QMC). It sets a cookie named X-Qlik-Session on the client machine. This cookie can be traced in Fiddler or in the browser developer tools under the headers tab.
If the session cookie header value is not passed, or is destroyed or modified in flight between the end-user client and the Qlik Sense server, the user session is terminated and the user is logged out.
By default, it will be destroyed after 30 minutes of inactivity or when the browser is closed.
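When tracing the cookie in developer tools or Fiddler, the raw Set-Cookie header can also be inspected programmatically with the standard library. The header below is a hypothetical example (the session ID is a placeholder, and the exact attributes may differ in your deployment):

```python
from http.cookies import SimpleCookie

# Hypothetical Set-Cookie header as it might appear in Fiddler or the browser
# developer tools; the session ID value is a placeholder.
header = "X-Qlik-Session=00000000-0000-0000-0000-000000000000; Path=/; HttpOnly; Secure"

cookie = SimpleCookie()
cookie.load(header)

# Confirm the session cookie is present and inspect its attributes.
morsel = cookie.get("X-Qlik-Session")
if morsel is not None:
    print(morsel.value)      # the session identifier
    print(morsel["path"])    # "/"
```

If the cookie's value changes between requests within the timeout window, something in the path is interfering with the session, which matches the logout behavior described above.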
This is another setting that may help keep the connection open in certain environments. See Enabling TCP Keep Alive Functionality In Qlik Sense. Note that customers who do not experience issues with WebSockets being terminated by the network due to inactivity SHOULD NOT switch this feature ON, since it may cause Qlik Sense to send unnecessary traffic on the network towards the client.
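For background, the mechanism behind this setting is ordinary OS-level TCP keep-alive: the stack sends periodic probes on an idle connection so that middleboxes continue to see it as alive. The sketch below shows the generic socket options involved; it is not Qlik's implementation, and the `TCP_KEEP*` constants are the Linux names (Windows exposes the same idea differently), hence the guard:

```python
import socket

# Enable OS-level TCP keep-alive on a socket: the stack will send periodic
# probes on an otherwise idle connection.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Tune probe timing where the platform exposes these constants (e.g. Linux).
if all(hasattr(socket, n) for n in ("TCP_KEEPIDLE", "TCP_KEEPINTVL", "TCP_KEEPCNT")):
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60)   # idle seconds before first probe
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 10)  # seconds between probes
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5)     # failed probes before drop

print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE) != 0)  # True
sock.close()
```

This also illustrates the trade-off mentioned above: each probe is extra traffic, which is why the feature should stay off unless inactive WebSockets are actually being dropped.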
When opening an application through the hub to visualize charts, the visualization may fail with the error "Out of calculation memory".
The issue is often related to an app that is too large or in need of a re-design or general optimization. A poorly designed application can lead to memory leaks.
Review the application and optimize as needed to prevent performance problems for the engine service.
Verify that the servers hosting the Qlik Sense engine service have sufficient resources (CPU / Memory) to proceed with the calculation. It is highly recommended to have at least 16 cores and 64 GB of memory.
Otherwise, you can adjust the engine limits described below.
The "Hypercube memory limit" setting limits how much memory a hypercube evaluation can allocate during a request. If multiple hypercubes are calculated during the request, the limit is applied to each hypercube calculation separately.
If it is set to 0, the engine applies a global heuristic which essentially ensures that no more than one "big" calculation runs in parallel.
Setting this value to -1 disables the limit and allows the engine to keep trying to load the application. Even with the limit disabled, memory use will not exceed the limits set under Max memory usage (%) or Memory usage mode.
Qlik Sense: "Calculation timed out" while loading a chart into an application