Search our knowledge base, curated by global Support, for answers ranging from account questions to troubleshooting error messages.
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
Reactively fixes technical issues and answers narrowly defined questions. Handles administrative issues to keep the product up to date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) Reach out to your Account Manager or Customer Success Manager.
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about products and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us.
Log in to manage and track your active cases in Manage Cases.
Please note: the easiest way to create a new case is via our chat (see above). The chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are functional limitations that are not critical to daily operations.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
From R2024-05, Java 17 will become the only supported version to start most Talend modules, enforcing the improved security of Java 17 and eliminating concerns about Java's end-of-support for older versions. In 2025, Java 17 will become the only supported version for all operations in Talend modules.
Starting from v2.13, Talend Remote Engine requires Java 17 to run. If some of your artifacts, such as Big Data Jobs, require other Java versions, see Specifying a Java version to run Jobs or Microservices.
Content
Qlik Talend Module | Patch Level and Version
Studio | Supported from R2023-10 onwards
Remote Engine | 2.13 or later
Runtime | 8.0.1-R2023-10 or later
For Windows users, please follow the JDK installation guide (docs.oracle.com).
For Linux users, please follow the JDK installation guide (docs.oracle.com).
For macOS users, please follow the JDK installation guide (docs.oracle.com).
When working with software that supports multiple versions of Java, it is important to be able to specify the exact Java version you want to use. This ensures compatibility and consistent behavior across your applications. Here is how you can specify a Java version for the following products (on build servers, shared application servers, and similar):
For Studio users who are using multiple JDKs, please follow the appropriate instructions listed above, then apply the following additional steps:
-vm
<JDK17 HOME>\bin\server\jvm.dll
For Remote Engine (RE) users who are using multiple JDKs, please follow the appropriate instructions listed above, then apply the following additional steps.
For Runtime users who are using multiple JDKs, please follow the appropriate instructions listed above, then apply the following additional steps.
If Runtime is not running as a service:
With the Enable Java 17 compatibility option activated, any Job built by Talend Studio cannot be executed with Java 8. For this reason, verify the Java environment on your Job execution servers before activating the option.
To use Talend Administration Center with Java 17, you need to open the <tac_installation_folder>/apache-tomcat/bin/setenv.sh file and add the following commands:
# export modules
export JAVA_OPTS="$JAVA_OPTS --add-opens=java.base/sun.security.x509=ALL-UNNAMED --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
Windows users should instead edit <tac_installation_folder>\apache-tomcat\bin\setenv.bat.
For Java 17 users on Windows, the Talend CI/CD process requires the following Maven options:
set "MAVEN_OPTS=%MAVEN_OPTS% --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/sun.security.x509=ALL-UNNAMED --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
For Java 17 users on Linux or macOS, the Talend CI/CD process requires the following Maven options:
export MAVEN_OPTS="$MAVEN_OPTS \
  --add-opens=java.base/java.net=ALL-UNNAMED \
  --add-opens=java.base/sun.security.x509=ALL-UNNAMED \
  --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
<name>TALEND_CI_RUN_CONFIG</name>
<description>Define the Maven parameters to be used by the product execution, such as:
- Studio location
- debug flags
These parameters will be put into Maven 'mavenOpts'.
If Jenkins is using Java 17, add:
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/sun.security.x509=ALL-UNNAMED
--add-opens=java.base/sun.security.pkcs=ALL-UNNAMED
</description>
Overview
Enable your Remote Engine to run Jobs or Microservices using a specific Java version.
By default, a Remote Engine uses the Java version of its environment to execute Jobs or Microservices. From Remote Engine v2.13 onwards, Java 17 is mandatory for engine startup. However, when it comes to running Jobs or Microservices, you can specify a different Java version. This feature allows you to use a newer engine version to run artifacts designed with older Java versions, without the need to rebuild them, such as Big Data Jobs, which rely on Java 8 only.
When developing new Jobs or Microservices that do not exclusively rely on Java 8 (that is, they are not Big Data Jobs), consider building them with the add-opens option to ensure compatibility with Java 17. This option opens the necessary packages for Java 17 compatibility, making your Jobs or Microservices directly runnable on the newer Remote Engine version, without having to go through the procedure explained in this section for defining a specific Java version. For further information about this add-opens option and its limitations, see Setting up Java in Talend Studio.
Procedure
c:\\Program\ Files\\Java\\jdk11.0.18_10\\bin\\java.exe
org.talend.remote.jobserver.commons.config.JobServerConfiguration.JOB_LAUNCHER_PATH=c:\\jdks\\jdk11.0.18_10\\bin\\java.exe
ms.custom.jre.path=C\:/Java/jdk/bin

Make this modification before deploying your Microservices to ensure that these changes are correctly taken into account.
Reload fails in the QMC even though the script part is successful, in Qlik Sense Enterprise on Windows November 2023 and above.
When you are using NetApp-based storage, you might see an error when trying to publish and replace, or when reloading, a published app.
In the QMC you will see that the script load itself finished successfully, but the task failed after that.
ERROR QlikServer1 System.Engine.Engine 228 43384f67-ce24-47b1-8d12-810fca589657
Domain\serviceuser QF: CopyRename exception:
Rename from \\fileserver\share\Apps\e8d5b2d8-cf7d-4406-903e-a249528b160c.new
to \\fileserver\share\Apps\ae763791-8131-4118-b8df-35650f29e6f6
failed: RenameFile failed in CopyRename
ExtendedException: Type '9010' thrown in file
'C:\Jws\engine-common-ws\src\ServerPlugin\Plugins\PluginApiSupport\PluginHelpers.cpp'
in function 'ServerPlugin::PluginHelpers::ConvertAndThrow'
on line '149'. Message: 'Unknown error' and additional debug info:
'Could not replace collection
\\fileserver\share\Apps\8fa5536b-f45f-4262-842a-884936cf119c] with
[\\fileserver\share\Apps\Transactions\Qlikserver1\829A26D1-49D2-413B-AFB1-739261AA1A5E],
(genericException)'
<<< {"jsonrpc":"2.0","id":1578431,"error":{"code":9010,"parameter":
"Object move failed.","message":"Unknown error"}}
ERROR Qlikserver1 06c3ab76-226a-4e25-990f-6655a965c8f3
20240218T040613.891-0500 12.1581.19.0
Command=Doc::DoSave;Result=9010;ResultText=Error: Unknown error
0 0 298317 INTERNAL sa_scheduler b3712cae-ff20-4443-b15b-c3e4d33ec7b4
9c1f1450-3341-4deb-bc9b-92bf9b6861cf Taskname Engine Not available
Doc::DoSave Doc::DoSave 9010 Object move failed.
06c3ab76-226a-4e25-990f-6655a965c8f3
Qlik Sense Client Managed version:
Potential workarounds
The most plausible cause currently is that the specific engine version has issues releasing File Lock operations. We are actively investigating the root cause, but there is no fix available yet.
QB-25096
QB-26125
The information in this article is provided as is. Adjustments to error messages cannot be supported by Qlik Support and can lead to difficulties identifying underlying root causes of technical issues experienced later. All changes will be reverted after an upgrade.
The method documented in this article is intended for later versions of Qlik Sense Enterprise on Windows.
The message texts are defined in JavaScript files located by default in path C:\Program Files\Qlik\Sense\Client\translate\. There are subfolders for each supported language, e.g. en-US for English.
To modify a message text, edit the relevant file (e.g. hub.js) in the folder corresponding to the language you want to change the text for.
For example, to change the default message text for access denied messages, you need to find the following line:
"ProxyError.OnLicenseAccessDenied": "You cannot access Qlik Sense because you have no access pass.",
To change the text of the message, modify the second quoted part, after the colon (see example below):
"ProxyError.OnLicenseAccessDenied": "This is a modified message text for access denied events",
To modify the error message:
Users will not be able to see the new message until their browser cache has been cleared.
After upgrading or installing Qlik Sense Enterprise on Windows May 2022, users may not see all of their streams.
The streams appear only after an interaction or a click anywhere on the Hub.
This is caused by defect QB-10693, resolved in May 2022 Patch 4.
Qlik Sense Enterprise on Windows May 2022
To work around the issue without a patch:
A fix is available in May 2022, patch 4.
Streams will show up (after a few seconds) without the need to click or interact with the hub.
Important NOTE about feature HUB_HIDE_EMPTY_STREAMS:
When you activate HUB_HIDE_EMPTY_STREAMS, you will have an expected delay before all streams appear.
To reduce this delay, from Patch 4 you can add HUB_OPTIMIZED_SEARCH (it needs to be added manually as a new flag). As of now, the HUB_OPTIMIZED_SEARCH flag will be available in the upcoming August 2022 release and is not planned for any patches (yet).
If this delay (a few seconds) is not acceptable, you will need to disable the HUB_HIDE_EMPTY_STREAMS capability.
This defect was introduced by a new capability service.
QB-10693
This article provides step-by-step instructions for implementing Azure AD as an identity provider for Qlik Cloud. We cover configuring an App registration in Azure AD and configuring group support using MS Graph permissions.
It guides the reader through adding the necessary application configuration in Azure AD and Qlik Sense Enterprise SaaS identity provider configuration so that Qlik Sense Enterprise SaaS users may log into a tenant using their Azure AD credentials.
Content:
Throughout this tutorial, some words will be used interchangeably.
The tenant hostname required in this context is the original hostname provided to the Qlik Enterprise SaaS tenant.
Copy the value of the client secret and paste it somewhere safe. After saving the configuration, the value will become hidden and unavailable.
In the OpenID permissions section, check email, openid, and profile. In the Users section, check user.read.
Failing to grant consent to GroupMember.Read.All may result in errors authenticating to Qlik using Azure AD. Make sure to complete this step before moving on.
In this example, I had to change the email claim to upn to obtain the user's email address from Azure AD. Your results may vary.
While not hard, configuring Azure AD to work with Qlik Sense Enterprise SaaS is not trivial. Most of the legwork to make this authentication scheme work is on the Azure side. However, it's important to note that without making some small tweaks to the IdP configuration in Qlik Sense you may receive a failure or two during the validation process.
For many of you, adding Azure AD means you potentially have a bunch of clean up you need to do to remove legacy groups. Unfortunately, there is no way to do this in the UI but there is an API endpoint for deleting groups. See Deleting guid group values from Qlik Sense Enterprise SaaS for a guide on how to delete groups from a Qlik Sense Enterprise SaaS tenant.
Qlik Cloud: Configure Azure Active Directory as an IdP
[TASK_MANAGER ]E: Unable to resolve the conflict between the execute task command('CDC only') and the task settings('full load only') [1021616] (replicationtask.c:1680)
This error is an example of corrupted metadata for the task. The task settings saved in the SQLite files no longer match and will need a correction.
There are two methods to refresh the metadata of the task when the task is set to CDC only. The first option should be used unless corruption prevents the task from starting properly.
Method One
Use the Advanced Run Options to start the task from a timestamp.
Method Two
Use the Advanced Run Options to reload the metadata of the tables completely. Note that this option will reload all of your tables.
[SOURCE_CAPTURE ]W: The metadata for source table 'dbo.table' is different than the corresponding MS-CDC Change Table. The table will be suspended. (sqlserver_mscdc.c:805)
This warning can occur when hidden columns exist on the SQL Server database, because Qlik Replicate is not able to identify them. This causes a mismatch between Qlik Replicate's and SQL Server's versions of the metadata.
Hidden columns need to be revealed for Qlik Replicate to include them in the metadata. Columns that are hidden will look like the following in your DDL.
[$ValidFromDTM] [datetime2](7) GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
[$ValidToDTM] [datetime2](7) GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
Example SQL queries to unhide the hidden columns (substitute the column names from your own DDL, e.g. [$ValidFromDTM] and [$ValidToDTM] above):
ALTER TABLE <the table name> ALTER COLUMN [SysStartTime] DROP HIDDEN;
ALTER TABLE <the table name> ALTER COLUMN [SysEndTime] DROP HIDDEN;
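If many tables carry hidden period columns, the ALTER statements can be generated rather than typed by hand. This is a small illustrative helper (the table and column names below are placeholders, not tied to any specific schema):

```python
def unhide_statements(table: str, columns: list[str]) -> list[str]:
    """Build the ALTER TABLE statements that reveal hidden columns so
    Qlik Replicate can include them in the table metadata."""
    return [
        f"ALTER TABLE {table} ALTER COLUMN [{col}] DROP HIDDEN;"
        for col in columns
    ]

for stmt in unhide_statements("dbo.table", ["$ValidFromDTM", "$ValidToDTM"]):
    print(stmt)
```

Run the emitted statements against the source database, then reload the affected tables so Replicate refreshes its metadata.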
Qlik Replicate is not able to identify the hidden columns that exist on the SQL Server database. This will cause a mismatch between Replicate and SQL Server's version of the metadata.
QB-25470
When adding a P&L Pivot table to a sheet, the table is blank and there are no dimensions or measures in the object container. When the object is selected, there is no Data menu to select in the Object Properties menu.
The Qlik Sense installation may be corrupted. Repairing Qlik Sense resolves the issue.
After upgrading Qlik Sense Enterprise on Windows, the Qlik Sense Management Console (QMC) may fail to load. The following error is displayed:
InitializationError: Initialization of the Qlik Management Console failed for the following reason: Missing parameter value(s)
The Qlik Sense version upgrade or patch update did not complete as expected.
Installing and uninstalling Qlik Sense Patches
This article shows you how to locate your Qlik Cloud license number (Subscription ID) as well as your Tenant Hostname, your Tenant ID, and your Recovery Address.
Only the Tenant Admin can see the Qlik Cloud Subscription ID and Tenant Hostname / ID.
If you are looking to change your Tenant Alias or Display name, see Assigning a hostname Alias to a Qlik Cloud Tenant and changing the Display Name.
Elasticsearch logs are generated in the Logserver/elasticsearch-1.5.2/log directory. If the log files are not moved or deleted, disk space can fill up. Can Elasticsearch log files be purged automatically?
Update the Elasticsearch log configuration, and use the MaxBackupIndex option to determine how many backup files are kept before the oldest is erased.
The Elasticsearch logs configuration can be found in Logserver/elasticsearch-1.5.2/config/logging.yml.
The default configuration is:
"
file:
type: dailyRollingFile
file: ${path.logs}/${cluster.name}.log
datePattern: "'.'yyyy-MM-dd"
layout:
type: pattern
conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"
"
With this configuration, a new file is created every day to save the previous day's log. As a result, the number of log files keeps increasing, which can eventually fill the disk.
To restrict the amount of backup files, set MaxBackupIndex:
Stop these Talend services:
Talend Log Server search engine
Talend Log Server analytics and visualization platform
"
file:
type: rollingFile
file: ${path.logs}/${cluster.name}.log
maxFileSize : 200KB
maxBackupIndex: 4
layout:
type: pattern
conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"
"
Qlik Talend Product: Reducing the logging threshold for Elasticsearch in Talend Log Server
To manage and browse executions of your Tasks, Plans and Promotions, please use the correct objects in the processing entity.
As per our API documentation (https://api.talend.com/apis/processing/2021-03/), in the PlanExecutable section, the step identifier (stepId) should be equivalent to the stepExecutionId.
For example:
{
"executable": "b91cf8b2-5dd1-4b18-915b-4c447cee5267",
"executionPlanId": "0798b8d1-0e12-472f-be02-a0f04e792daa",
"rerunOnlyFailedTasks": true,
"stepId": "09043c9f-02d0-41f6-b3cb-0ea53ffde377"
}
If you put the design-time stepId in this field, the API call to execute the plan is likely to fail with the exception:
"Validation do not allow processing entity"
The solution is to get all steps for a plan execution (ordered by designed execution, steps only, without error handlers) and obtain the stepExecutionId by issuing this API call:
{{apiUrl}}/processing/executions/plans/:planExecutionId/steps
The executionId in the successful StepExecution response array is the value to select and put in the stepId field of the POST /processing/executions/plans API call.
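The lookup-then-execute flow can be sketched in a few lines. Note that the field names in the sample steps payload below are assumptions inferred from this article, not verified API output; check them against your tenant's actual response:

```python
def pick_step_execution_id(step_executions: list, index: int = 0) -> str:
    """Return the executionId of one step from the StepExecution array
    returned by GET /processing/executions/plans/{planExecutionId}/steps."""
    return step_executions[index]["executionId"]

# Hypothetical response payload (field names are assumptions):
steps = [
    {"executionId": "09043c9f-02d0-41f6-b3cb-0ea53ffde377", "name": "first step"},
    {"executionId": "1b2c3d4e-0000-0000-0000-000000000000", "name": "second step"},
]

body = {
    "executable": "b91cf8b2-5dd1-4b18-915b-4c447cee5267",
    "executionPlanId": "0798b8d1-0e12-472f-be02-a0f04e792daa",
    "rerunOnlyFailedTasks": True,
    # stepId must carry the step *execution* ID, not the design-time step ID
    "stepId": pick_step_execution_id(steps),
}
print(body["stepId"])
# -> 09043c9f-02d0-41f6-b3cb-0ea53ffde377
```

The resulting body is what you would POST to /processing/executions/plans with your usual HTTP client and authentication.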
various Talend Products
Tabular Reporting events in the management console not showing for all the users in the tabular reporting recipient list
When section access is used in a Qlik app, ensure that all required recipients/users are added to the section access load script.
For example, users in the recipient import file should match the users entered in the Section Access load script of the app.
This generally permits users to view management console details such as 'Events', assuming those users also have the necessary 'view' permissions in the tenant in which the app exists.
If some of the recipients in the tabular reporting recipient list do not have access to the Space/App, they will not be considered in the task execution because they fail governance.
I.e., recipients/users that are not added to the load script will not have access to the app or the associated management console events.
This is expected behavior.
In my recent Getting SaaSsy with DataRobot post, I documented how to use the DataRobot Analytics Connector from within your Qlik Sense applications. I know this will sound crazy, but what if you want to make predictions on data you aren't loading into your application? Maybe you are collecting input parameters in your application from end users to play what-if games. Maybe you will record the predictions, but you also want to take immediate action based on their values (i.e. prescriptive analytics). Well, those things sound like perfect use cases for Qlik Application Automation.
It gets better my friends. Whether you are using a dedicated on-premise DataRobot server, a dedicated tenant or you are on the leading edge path with DataRobot's shiny new AI Cloud Manager using Paxata, Qlik Application Automation has you covered, and so do I.
In this post, I will help you identify the right DataRobot Connector Block to use for your path, help you understand how to execute predictions, and help you understand what to do with the output from the predictions.
You have already chosen your DataRobot path, now it's just a matter of choosing the correct block from the DataRobot Connector. You probably would have guessed from the elaborate way I described the DataRobot choices which Qlik Application Automation block goes to which environment. But to be sure ... If you have a dedicated DataRobot server, or you have a dedicated tenant, you should use the List Predictions from Dedicated Prediction API block. If you are using the DataRobot AI Manager environment with Paxata, you should use the List Predictions block.
Oh no! What's that you are saying? You weren't told which path your organization chose, you were just given credentials and you just log in. Don't sweat it. I can help you with that. Just go to your Deployments within DataRobot, choose the Deployment you are going to execute predictions against, and choose Predictions, Prediction API and Real-time. DataRobot will provide all of the clues we need to choose the right block.
If the API URL contains app2.datarobot.com like in the first image below, you are working with their AI Cloud and will need to use the List Predictions block. However, if you see a dedicated path in your API URL such as Qlik.orm.datarobot.com (second image), you will need to use the List Predictions from Dedicated Prediction API block.
There are some other clues above as well. Notice that the rest of the URL path in the first image contains api/v2/deployments, while the second image contains predApi/v1.0/deployments. It's basically DataRobot telling you which of their APIs you need to utilize.
So how will that help you know for sure? One thing that many people seldom look at with Qlik Application Automation blocks is the Description. Simply drag either or both of the blocks onto your canvas and scroll all the way down in the right panel. If you look at the List Predictions from Dedicated Prediction API description, you will see the following, and notice it clearly indicates predApi/v1.0/deployments.
However, if you press the Show API endpoint link for the List Predictions block it will look like this. Both are dead giveaways as to the block/path you should choose.
Regardless of which block you are using, you will first need to create a Connection for Qlik Application Automation to your DataRobot environment. If you have a Dedicated server your Connection details will look like this. Notice that you will need to copy the api_key value right from the DataRobot Deployment Details:
Your DataRobot AI Cloud connection will look similar and again you would need to copy your api_key from the deployment details. My DataRobot AI Cloud is just a "trial", hence I chose that region, while my dedicated tenant (above) is the US. The biggest difference is in the domain. Dedicated connections will be app.datarobot.com and AI Cloud connections will be app2.datarobot.com:
Once you test/save the connections we are ready to start making predictions. The List Predictions block is the easiest to set up so we will start with that one. Simply click the drop-down in the Deployment Id field, press "Do Lookup" and then choose the specific deployment model you are going to be making predictions against.
Then you simply provide the Input data you want to pass to the deployment to have predictions made for. More about the Prediction Data later, but for now notice that I've simply hard-coded a JSON string with field/value pairs:
The List Predictions from Dedicated Prediction API block requires us to complete a few more things. The first is the Dedicated Prediction Url. Good thing I had you bring up your deployment details, because we will just copy it. Notice you do not need the HTTPS:// or the /predApi.. text, just the actual URL information.
Next, we will again simply click the drop-down for the Deployment ID, click Do Lookup, and then choose our desired deployment.
Next, we copy the Data Robot Key from our deployment details and then we can insert our JSON block. Again, more later about that so don't panic in thinking I'm suggesting that you hand code the values you want to predict. It's just to make this section easier to navigate. 😁
Qlik Application Automation provides 4 additional parameters that are part of the DataRobot API specification. You can define the Passthrough Columns, Passthrough Columns Set, turn on Prediction Warnings and set the Decimals Number format.
You can refer directly to the DataRobot API documentation for all of the details you wish. For instance, notice that I have the Prediction Warning Enabled set to "true." Getting warnings sounded like a good idea. But alas, I ended up with an error.
Well, it turns out that in order to utilize the Prediction Warning Enabled there is work that must be done on our Deployment within DataRobot.
I guess I could have saved myself the trouble had I read the documentation. Oh well, I simply changed my default back to false so that the prediction can run.
Above, I simply demonstrated the JSON format you need for your Prediction Data with hard-coded values. I've used my DataRobots and have predicted with a 99.99999999% confidence level that your goal in reading this isn't to hardcode 50+ input values each time you want a prediction. Instead, that data will come from somewhere else. Which is perfectly ok. Maybe you will be pulling the values from some other system as part of a workflow: when event A triggers this Qlik Application Automation, you will go do B and C, and then assign output from those things to variables that you will use as the Prediction Data. That's a great plan ... simply choose your variables, and use them where you need them in your Prediction Data. Notice I have already assigned the MasterPatientID variable and am in the process of choosing the race variable below.
I'm so sorry. You don't like to use variables, and you weren't doing A, B, and C; you were actually firing a SQL query live based on input to your workflow, and you wanted to use the data from the SQL query. That is brilliant. Pulling the live data when whatever event you have chosen triggers the Automation. You should write some posts. No problem, Qlik Application Automation will absolutely allow you to do that.
Or perhaps you are using a writeback solution, like Inphinity Forms, within a Qlik Sense application to capture input parameters and you wish to use those values. Do that.
Or perhaps you are ... You get the point. The Prediction Data simply needs to be a JSON block containing the field/value pairs. How you construct it, or read it from an S3 bucket, or pull it out of thin air doesn't matter. Which is the beauty of working with DataRobot within Qlik Application Automation.
Woohoo, you now have a block that will execute a deployment in your DataRobot environment, regardless of which kind, and we are now ready for those wonderful predictions. Perhaps the first thing you noticed about the blocks List Predictions and List Predictions from Dedicated Prediction API was that they start with List as opposed to Get. That is of course because you may be passing a single row of data as Prediction Data, or you could be passing many items in the JSON block. So these blocks are handled as lists, even if it is just a list of 1 prediction.
The DataRobot Connector for loading data into our applications simply returns the Prediction value, which is 0 in this case (the patient is not predicted to be readmitted). However, notice below that within Qlik Application Automation either prediction block will return the Prediction as well, but it will also return a list of the Prediction values and the scores for each possible value. In my case, the 1 (likely to be readmitted) was scored at 0.0428973004, whereas the 0, the Prediction, was scored at 0.9571026996.
Who cares?
Well, maybe you do. As I started this post I mentioned that perhaps we want to take action(s) based on the predictions which might be why you are making the prediction in your Qlik Application Automation workflow instead of just making the prediction in a Qlik Sense Application. If we are writing a flow that is "prescriptive" we might want to check the values. Ooooh .49999999999 vs .500000000001. Maybe that will be Action A, just email someone. While .000000001 vs .999999999 tells us that it's safe to go ahead and take the really expensive Action Z. So we might want to set up a Conditional expression.
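That conditional check can be sketched in a few lines. The threshold and the action names below are purely illustrative, not part of any Qlik or DataRobot API:

```python
def choose_action(scores: dict, expensive_threshold: float = 0.9) -> tuple:
    """Pick a prescriptive action from DataRobot prediction scores
    (label -> score). Confident predictions trigger the expensive
    action; near-coin-flip ones only trigger a notification."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= expensive_threshold:
        return label, "take expensive Action Z"
    return label, "just email someone (Action A)"

print(choose_action({"1": 0.0428973004, "0": 0.9571026996}))
# -> ('0', 'take expensive Action Z')
```

In an Automation you would express the same logic with a Condition block comparing the returned score against your chosen threshold.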
Regardless of what we do with the values, Qlik Application Automation allows you to simply choose the values right from the block, just like it allowed you to choose Variables or data from another source.
If you don't already know me I will bring you up to speed quickly. I have very defined boundaries and am really particular about how things are worded. For example, take the phrase Data Science. Well, Science is explainable. Therefore, if something isn't explainable it isn't science. And if your predictions aren't explainable, then that isn't Data Science, it's just Data. One of the key reasons you are likely using DataRobot is the fact that it can so wonderfully return explanations for its predictions.
The Prediction of 0 above is nice. But knowing what factors led to the prediction may be just as valuable when helping us choose our prescriptive actions. Well, my friends, Qlik Application Automation has you covered for that scenario as well. In fact, you can see from the following image the block is literally raising its hand and begging you to choose it. List Prediction Explanation from Dedicated Prediction API will give you not only the Prediction, and the Prediction values but it will also return the explanations to you.
Wait, something must be wrong. I see a qualitativeStrength of +++, but the second is --. What do those mean? Oh yeah, now I remember: Qlik Application Automation is just calling the provided DataRobot APIs, so I might as well check the documentation from DataRobot to get a full and complete understanding of the input parameters I can choose for the block and of the output values. Sure enough, it's covered.
https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/dep-predex.html
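To make the explanation output concrete, here is a small sketch that turns one prediction row into readable lines. The response shape (a predictionExplanations list whose entries include feature, featureValue, and qualitativeStrength) is my reading of the DataRobot documentation linked above; verify it against the docs for your DataRobot version, and note the sample row below is hypothetical.

```python
def summarize_explanations(row: dict) -> list:
    """Turn one prediction row into human-readable explanation lines.

    Assumes the Dedicated Prediction API response shape from the DataRobot
    docs: each row carries a 'predictionExplanations' list whose entries
    include 'feature', 'featureValue' and 'qualitativeStrength' (e.g. '+++'
    pushes toward the label, '--' pushes away from it).
    """
    lines = []
    for exp in row.get("predictionExplanations", []):
        lines.append(f"{exp['qualitativeStrength']:>4}  {exp['feature']} = {exp['featureValue']}")
    return lines

# Hypothetical row, shaped like a response for the readmission model
row = {
    "prediction": 0,
    "predictionExplanations": [
        {"feature": "number_inpatient", "featureValue": 0, "qualitativeStrength": "+++"},
        {"feature": "num_medications", "featureValue": 18, "qualitativeStrength": "--"},
    ],
}
for line in summarize_explanations(row):
    print(line)
```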
I see you out there on the leading edge doing Time Series Predictions in DataRobot. Not an issue, Qlik Application Automation has you covered with a block as well. Simply choose the List Time Series Predictions from Dedicated Prediction API and you will be good to go.
The initial inputs needed are already covered above. However, there are a few additional parameters you will need to input as well.
Of course, DataRobot has you covered with complete documentation at https://docs.datarobot.com/en/docs/api/reference/predapi/pred-ref/time-pred.html
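As a rough illustration of what the block is calling under the hood, the sketch below builds the Dedicated Prediction API URL for a deployment, with the forecastPoint query parameter used by time series scoring. The path and parameter name are taken from my reading of the DataRobot documentation linked above; treat both as assumptions and confirm them against the docs for your environment.

```python
from urllib.parse import urlencode

def time_series_prediction_url(base_url, deployment_id, forecast_point=None):
    """Build the Dedicated Prediction API URL for scoring a deployment.

    The '/predApi/v1.0/deployments/<id>/predictions' path and the
    'forecastPoint' query parameter follow the DataRobot docs; verify
    them for your DataRobot version before relying on this.
    """
    url = f"{base_url}/predApi/v1.0/deployments/{deployment_id}/predictions"
    if forecast_point:
        url += "?" + urlencode({"forecastPoint": forecast_point})
    return url

# Hypothetical prediction server and deployment ID
print(time_series_prediction_url("https://example.orm.datarobot.com",
                                 "abc123", "2024-01-15T00:00:00Z"))
```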
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
The authentication token associated with auto Provisioning is about to expire
Or
The authentication token associated with auto Provisioning has expired
The token that was created for SCIM Auto-Provisioning with Azure must be deleted. First, list the external client API keys with the following curl command:
curl "https://<tenanthostname>/api/v1/api-keys?subType=externalClient" \
  -H "Authorization: Bearer <dev-api-key>"
Make sure to replace <tenanthostname> with your actual tenant hostname and <dev-api-key> with your generated developer API key. Execute the command in Postman or a similar tool, and make sure to include the API key in the header for authorization.
Once you have obtained the key ID from the output, copy it for later use in the deletion process.
To delete the API key associated with SCIM provisioning, execute the following curl command:
curl "https://<tenanthostname>/api/v1/api-keys/<keyID>" \
  -X DELETE \
  -H "Authorization: Bearer <dev-api-key>"
Replace <tenanthostname> with your actual tenant hostname, <keyID> with the key ID you obtained in step 2, and <dev-api-key> with your developer API key. Execute this command in Postman or a similar tool, and ensure that the API key is included in the header for authorization. By following these steps, you can successfully delete the token created for SCIM Auto-provisioning with Azure.
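If you prefer to script the lookup step, the helper below picks the relevant key IDs out of the listing response. It assumes the /api/v1/api-keys call wraps results in a "data" list whose items carry "id" and "subType" fields; check the actual payload from your tenant before relying on this shape, and note the sample response is hypothetical.

```python
def find_scim_key_ids(api_keys_response: dict) -> list:
    """Return the IDs of externalClient API keys from the listing call.

    Assumes the response wraps results in a 'data' list whose items
    carry 'id' and 'subType' fields (an assumption; verify against the
    real payload from your tenant).
    """
    return [item["id"] for item in api_keys_response.get("data", [])
            if item.get("subType") == "externalClient"]

# Hypothetical listing response
listing = {"data": [
    {"id": "key-1", "subType": "externalClient"},
    {"id": "key-2", "subType": "default"},
]}
print(find_scim_key_ids(listing))  # -> ['key-1']
```

Each returned ID can then be passed to the DELETE call shown above.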
The token was not deleted when the SCIM Auto-Provisioning was disabled or completely removed.
Using the GUEST user type when creating an Azure storage account in the Azure portal leads to an error after successful authentication from Qlik.
The remote server returned an error: (400) Bad Request.
Use a MEMBER account. Authentication with a GUEST user type is not supported for connecting from Qlik Cloud to Azure.
Properties of a Microsoft Entra B2B collaboration user
Permission differences between member users and guest users
Configuring Azure Blob Storage in Qlik Cloud
Qlik Sense Repository Service API (QRS API) contains all data and configuration information for a Qlik Sense site. The data is normally added and updated using the Qlik Management Console (QMC) or a Qlik Sense client, but it is also possible to communicate directly with the QRS using its API. This enables the automation of a range of tasks, for example:
Using Xrfkey header
A common vulnerability in web clients is cross-site request forgery (XSRF), which lets an attacker impersonate a user when accessing a system. The QRS uses the Xrfkey to prevent this; if the Xrfkey is not set in the URL, the server sends back the message: XSRF prevention check failed. Possible XSRF discovered.
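The rule is that the same 16-character key must appear both as the xrfkey query parameter and in the X-Qlik-xrfkey header, and the two must match. The helper below is an illustrative sketch of pairing them up; the function name is mine, not part of any Qlik tooling.

```python
import secrets
import string

def make_xrf_pair(base_url):
    """Generate a random 16-character Xrfkey and return a (url, headers)
    pair where the same key appears in both the query string and the
    X-Qlik-xrfkey header, as the QRS requires them to match."""
    alphabet = string.ascii_letters + string.digits
    key = "".join(secrets.choice(alphabet) for _ in range(16))
    sep = "&" if "?" in base_url else "?"
    url = f"{base_url}{sep}xrfkey={key}"
    headers = {"X-Qlik-xrfkey": key}
    return url, headers

url, headers = make_xrf_pair("https://qlikserver1.domain.local/qrs/about")
# The key in the URL and the key in the header are identical
assert url.endswith(headers["X-Qlik-xrfkey"])
```

The PowerShell examples below do the same thing with a hard-coded key ("12345678qwertyui"), which is fine for testing but a random key per request is the safer habit.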
Environments:
Note: This example relates to token-based licenses. If this needs to be configured with Professional/Analyzer license types, you might need to use the following API calls:
Furthermore, if you combine this with Qlik-CLI and need to monitor or, more specifically, remove users, the following Community link might be useful: Deallocation of Qlik Sense License
This procedure has been tested in a range of Qlik Sense Enterprise on Windows versions.
$hdrs = @{}
$hdrs.Add("X-Qlik-xrfkey","12345678qwertyui")
$url = "https://qlikserver1.domain.local/qrs/about?xrfkey=12345678qwertyui"
Invoke-RestMethod -Uri $url -Method Get -Headers $hdrs -UseDefaultCredentials
$hdrs = @{}
$hdrs.Add("X-Qlik-xrfkey","12345678qwertyui")
$hdrs.Add("X-Qlik-User","UserDirectory=DOMAIN;UserId=Administrator")
$cert = Get-ChildItem -Path "Cert:\CurrentUser\My" | Where {$_.Subject -like '*QlikClient*'}
$url = "https://qlikserver1.domain.local:4242/qrs/about?xrfkey=12345678qwertyui"
Invoke-RestMethod -Uri $url -Method Get -Headers $hdrs -Certificate $cert
Execute the command.
A possible response for the two scripts above may look like this (note that the JSON string is automatically converted to a PSCustomObject by PowerShell):
buildVersion      : 23.11.2.0
buildDate         : 9/20/2013 10:09:00 AM
databaseProvider  : Devart.Data.PostgreSql
nodeType          : 1
sharedPersistence : True
requiresBootstrap : False
singleNodeOnly    : False
schemaPath        : About
If there are certificates from several Qlik Sense servers, they cannot be fetched by subject: multiple certificates will share the subject QlikClient, and the script will fail because it returns an array of certificates instead of a single certificate. In that case, fetch the certificate by thumbprint. This requires more PowerShell knowledge, but an example can be found here: How to find certificates by thumbprint or name with PowerShell