At a high level, the architecture of the integration looks like this:
Qlik Sense Advanced Analytics Integration is essentially an extension to Qlik Sense's expression syntax, and as such it can be used in both chart expressions and load script expressions.
With this capability, we can add syntax to a chart expression telling Qlik Sense that the expression should not be evaluated on the Qlik Sense server; instead, all the information and data needed to calculate it is sent via the server-side extension to the backend R system for calculation.
After the advanced analytic calculations are completed, the data is sent back to the Qlik Sense Server and to the client for visualization.
This video shows an example of how Qlik Sense connects to an R server for extending expression capabilities in Qlik Sense Apps while offloading the calculations to the R server engine.
In order to display a simple "Hello World" in Qlik Sense using an R script, we will do the following:
1. Have R and RStudio installed on your system. (The RGui included with R for Windows can also be used.) R can be downloaded at https://cloud.r-project.org/
2. We need a package in R to extend R functionality to applications via TCP/IP. The package name is "Rserve".
Install the package using the following command in the RStudio GUI:
install.packages('Rserve')
3. Now we need to load that library and start Rserve. To do so, execute the following commands:
library(Rserve)
Rserve()
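As a minimal sketch combining the two steps above (assuming the default Rserve configuration), the R session can be prepared like this:

# Install the package if it is not already available, then start the Rserve daemon
if (!requireNamespace("Rserve", quietly = TRUE)) {
  install.packages("Rserve")
}
library(Rserve)
Rserve()   # starts Rserve; by default it listens on TCP port 6311

Keep the R session (or the Rserve process it spawns) running while Qlik Sense is using the plugin.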
4. Communication from Qlik Sense to R is handled using gRPC. R is not a language supported by gRPC by default.
A possible solution is therefore to develop a connector in one of the languages supported by gRPC. Qlik provides an open-source connector developed in C#, which in turn accesses Rserve to run R scripts.
qlik-oss/sse-r-plugin
Once you have built the connector, start SSEtoRserve.exe (ideally on the Rserve server itself).
Note: Qlik Support does not support this plugin directly. Inquiries should be submitted via GitHub under sse-r-plugin - Issues
5. Now we will have to configure the plugin:
Add the following line in the settings.ini file:
SSEPlugin=R,localhost:50051
The settings.ini is located in this location:
a. In the QMC, add a new Analytic Connection.
b. Restart the Qlik Sense Engine service.
Please refer to the screenshot below for creating a new connection.
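As a sketch only (field labels can vary slightly between Qlik Sense versions, so verify against the QMC help), the analytic connection for this example would typically use values along these lines:
Name: R
Host: localhost (or the machine where SSEtoRserve.exe runs)
Port: 50051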
Note: If the R plugin (SSEtoRserve.exe) was installed on the R server (where Rserve runs) or on another machine, point to that machine name instead of 'localhost'. Also, in multi-node environments with multiple Qlik Sense Engines, even if the plugin was installed on the Central node, add the Central node's hostname instead of 'localhost', as the Engine services on the rim nodes need the correct DNS/NetBIOS name to reach the plugin.
6. Now open a Qlik Sense app and add a KPI object to the sheet. The app can be one of the apps included with the plugin itself under <storage path>\sse-r-plugin-master\sense_apps
Note that the example apps also need data connections created to the data files included with these apps in the above location.
7. Otherwise, a new app can be created and any data may be loaded for the SSE example below.
8. For the measure, add the following expression which contains an R-script:
R.ScriptEvalStr('paste(q$firstWord, q$secondWord);', 'Hello' as firstWord, 'World' as secondWord)
9. If everything is configured properly, the R script in the expression above is executed and the KPI displays "Hello World".
When using fields loaded into the app instead of string literals, the equivalent expression is:
R.ScriptEvalStr('paste(q$firstWord, q$secondWord);', Only([First Word]) as firstWord, Only([Second Word]) as secondWord)
Eight script functions are automatically added to the functionality of the plugin. On the plugin side, what is needed to fulfill this functionality is to implement the EvaluateScript rpc function.
The syntax of these functions is:
<EngineSSEName>.<FunctionName>(Script [,Parameter...])
where Script is the R script to be evaluated and Parameter is the data sent from Qlik.
Here we use the ScriptEvalStr function, which accepts arguments of type String and returns a String. The 'paste' function in R concatenates vectors after converting them to character. We pass two string fields from Qlik (First Word and Second Word). The R script references these fields through the q data frame (a structure prepared automatically on the R side) as q$firstWord and q$secondWord. The script finally returns a String back to Qlik Sense.
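As a stand-alone illustration (plain R, run outside Qlik, only to show what the plugin evaluates), the parameters sent from Qlik arrive in a data frame named q:

# Hypothetical, manually built equivalent of the data sent by the expression above
q <- data.frame(firstWord = "Hello", secondWord = "World", stringsAsFactors = FALSE)
paste(q$firstWord, q$secondWord)   # returns "Hello World", which is what Qlik Sense displays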
Setting up Qlik Web Connectors may require you to provide administrator approval and consent for the respective apps (such as Azure Storage, Office 365 SharePoint, and similar). This article aims to provide information on what needs to be requested.
Personal Account: Once the connector has authorization to access the Microsoft account, a corresponding entry will appear in "My apps": https://myapps.microsoft.com/
Example:
Enterprise Account: These require additional configuration.
If a basic (non-admin) user attempts to authenticate, the following message would appear in Azure:
To fix this, the administrator needs to grant access to the users.
Until the administrator grants access, the user will see the warning:
Need admin approval
At this stage, you will need to request your administrator to enable the following API permissions for your user:
Azure Storage
MSCRM V2
Office365 SharePoint
OneDrive V2
Outlook 365
Some connectors require an encryption key before you create or edit a connection. Failing to generate a key will result in:
Error retrieving the URL to authenticate: ENCRYPTION_KEY_MISSING - you must manually set an encryption key before creating new connections.
Qlik Sense Desktop February 2022 and onwards
Qlik Sense Enterprise on Windows February 2022 and onwards
all Qlik Web Storage Provider Connectors
Google Drive and Spreadsheets Metadata
PowerShell demo on how to generate a key:
# Generates a 32 character base 64 encoded string based on a random 24 byte encryption key
function Get-Base64EncodedEncryptionKey {
$bytes = new-object 'System.Byte[]' (24)
(new-object System.Security.Cryptography.RNGCryptoServiceProvider).GetBytes($bytes)
[System.Convert]::ToBase64String($bytes)
}
$key = Get-Base64EncodedEncryptionKey
Write-Output "Get-Base64EncodedEncryptionKey: ""${key}"", Length: $($key.Length)"
Example output:
Get-Base64EncodedEncryptionKey: "muICTp4TwWZnQNCmM6CEj4gzASoA+7xB", Length: 32
This command must be run by the same user that is running the Qlik Sense Engine Service (Engine.exe). For Qlik Sense Desktop, this should be the currently logged-in user.
Do the following:
Open a command prompt and navigate to the directory containing the connector .exe file. For example:
"cd C:\Program Files\Common Files\Qlik\Custom Data\QvWebStorageProviderConnectorPackage"
Run the following command:
QvWebStorageProviderConnectorPackage.exe /key {key}
Where {key} is the key you generated. For example, if you used the OpenSSL command, your key might look like: QvWebStorageProviderConnectorPackage.exe /key zmn72XnySfDjqUMXa9ScHaeJcaKRZYF9w3P6yYRr
You will receive a confirmation message:
Info: Set key. New key id=qseow_prm_custom.
Info: key set successfully!
The {sense service user} must be the name of the Windows account which is running your Qlik Sense Engine Service. You can see this in the Windows Services manager. In this example, the user is: MYCOMPANY\senseserver.
Do the following:
Open a command prompt and run:
runas /user:{sense service user} cmd
For example: runas /user:MYCOMPANY\senseserver cmd
Run the following two commands to switch to the directory containing the connectors and then set the key:
"cd C:\Program Files\Common Files\Qlik\Custom Data\QvWebStorageProviderConnectorPackage"
QvWebStorageProviderConnectorPackage.exe /key {key}
Where {key} is the key you generated. For example, if you used the OpenSSL command, your key might look like: QvWebStorageProviderConnectorPackage.exe /key zmn72XnySfDjqUMXa9ScHaeJcaKRZYF9w3P6yYRr
You should repeat this step, using the same key, on each node in the multinode environment.
Encryption keys will be stored in: "C:\Users\{sense service user}\AppData\Roaming\Qlik\QwcKeys\"
For example, encryption keys will be stored in "C:\Users\QvService\AppData\Roaming\Qlik\QwcKeys\"
Always run the command prompt while logged in with the Qlik Sense Service Account which is running your Qlik Sense Engine Service and which has access to all the required folders and files.
This security requirement came into effect in February 2022. Old connections made before then will still work, but you will not be able to edit them. If you try to create or edit a connection that needs a key, you will receive an error message: Error retrieving the URL to authenticate: ENCRYPTION_KEY_MISSING - you must manually set an encryption key before creating new connections.
Instead of pulling the data from the Primary node, which can potentially negatively affect performance, we can force Qlik Replicate to connect to a secondary read-only node during Full Load.
To do so:
The Data Load Editor in Qlik Sense Enterprise on Windows 2025 experiences noticeable performance issues.
The issue is caused by defect SUPPORT-6006. Qlik is actively working on a fix.
A fix is planned for an upcoming patch. Review the Release Notes for SUPPORT-6006.
A workaround is available. It is viable as long as the Qlik SAP Connector is not in use.
No service restart is required.
SUPPORT-6006
This article outlines recommended retention policies for Oracle redo logs and provides detailed recovery steps in scenarios where redo logs are needed. It includes guidance on preserving logs, identifying stream positions, verifying log availability, and using SCN-based recovery with SQL_REDO extraction for issue resolution.
Generally, Qlik recommends a 24-hour retention period for redo logs, but this can vary depending on the storage capacity and backup policies of each organization, as they have different requirements.
When customers back up their redo logs, they typically retain backups for at least 2 weeks to 1 month. This allows them to restore the logs if necessary. The specific policy regarding the retention of redo logs in backup media should be clarified with the customer.
SELECT name, sequence#, thread#, first_change#
FROM v$archived_log
WHERE sequence# = [sequence#] AND thread# = [thread#];
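Where SCN-based recovery with SQL_REDO extraction is needed, a hedged Oracle LogMiner sketch (log file name, SCN range, and schema are placeholders) could look like this:

-- Add the archived log, mine a specific SCN range, then read the SQL_REDO statements
EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/arch/1_188055_1122334455.arc', OPTIONS => DBMS_LOGMNR.NEW);
EXECUTE DBMS_LOGMNR.START_LOGMNR(STARTSCN => 1234567, ENDSCN => 1234999, OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
SELECT scn, operation, sql_redo FROM v$logmnr_contents WHERE seg_owner = 'MYSCHEMA';
EXECUTE DBMS_LOGMNR.END_LOGMNR;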
By following these steps, you can ensure proper recovery from redo logs and address any issues related to missing or inconsistent data.
During a CICD artifact update or redeployment, the custom context variables are lost.
CICD only publishes a new version of the artifact to the Qlik Talend Management Console. When creating a task via the Qlik Talend Management Console API call, if the user includes the following parameters in their JSON request (as demonstrated in the Resolution section), it will result in updating the task with the newly published artifact and the default artifact context values.
When creating a task via the Qlik Talend Management Console API for the first time, please ensure that the "overrideWithDefaultParameters" parameter is set to false in the "autoUpgradeInfo" section of the JSON body. For example:
"autoUpgradeInfo": {
"autoUpgradable": true,
"overrideWithDefaultParameters": false
}
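As a broader, hypothetical sketch of such a request body (only autoUpgradeInfo comes from this article; all other field names and values are illustrative and should be checked against the Qlik Talend Management Console API documentation):

{
  "name": "my-task",
  "workspaceId": "<workspace-id>",
  "artifact": {
    "id": "<artifact-id>",
    "version": "<artifact-version>"
  },
  "parameters": {
    "my_custom_context_var": "custom-value"
  },
  "autoUpgradeInfo": {
    "autoUpgradable": true,
    "overrideWithDefaultParameters": false
  }
}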
By setting overrideWithDefaultParameters to false during task creation, you can prevent the task from being updated with default context values for future artifact deployments via CICD.
After upgrading Qlik Replicate from 2023.11 to 2024.11 or 2025.5, testing connections and other UI endpoint operations fail whenever you are using an ADDON (such as UDF) that retrieves the password from the Vault.
The task works correctly. It is only the connection test that fails.
The following messages are from the task logs:
Error: SYS-E-HTTPFAIL, SYS-E-HTTPFAIL, Failed in prepare imp for Snowflake.. SYS,GENERAL_EXCEPTION,SYS-E-HTTPFAIL, Failed in prepare imp for Snowflake.,SYS,GENERAL_EXCEPTION,Failed in prepare imp for Snowflake,Failed to deobfuscate password
[INFRASTRUCTURE ]W: Password provider is not registered (at_secure.c:2454) (child_proc_exec.c:36)
[SERVER ]E: Failed to deobfuscate password [1003200] (cloud_imp.c:3700) (child_proc_exec.c:36)
[INFRASTRUCTURE ]T: Process id 396 : 00005244: 2025-06-26T12:42:26 [SERVER ]E: Failed in prepare imp for Snowflake [1003200] (cloud_imp.c:4770) (child_proc_exec.c:36)
This has been identified as a defect (ID RECOB-10010).
The fix is scheduled for the Qlik Replicate release in November 2025. See the Release Notes for details.
The child REPCTL process does not identify the ADDON.
RECOB-10010
A Qlik Replicate task with MSSQL as a source and Oracle as a target fails during Full Load and CDC.
The error logged:
Failure in resolving table name for objid %
Failure in resolving table name for objid is commonly linked to temporary tables. When applications create temporary tables, they are logged in the transaction log. Qlik Replicate attempts to resolve the object ID from the log, but may fail if the object is unknown or transient.
To investigate further and to identify if there is data loss:
Run the following query to identify the table name:
SELECT name, object_id FROM sys.objects WHERE type = 'U' AND object_id LIKE '%' -- Replace % with the actual objid
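As an alternative sketch, SQL Server's built-in OBJECT_NAME() function resolves an object ID directly (the numeric value below is a placeholder for the objid from the Replicate log):

SELECT OBJECT_NAME(123456789) AS table_name; -- returns NULL if the object no longer exists (e.g. a dropped temporary table)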
This error is often associated with temporary tables. When an application creates temporary tables, they are written to the transaction log.
Qlik Replicate then attempts to read an unknown object ID from this log and fails to resolve it to a table name.
NPrinting has a library of APIs that can be used to customize many native NPrinting functions outside the NPrinting Web Console.
Two examples of the more common capabilities available via the NPrinting APIs are as follows:
These and many other public NPrinting APIs can be found here: Qlik NPrinting API
In the data load editor of your Qlik Sense app, two REST connections are required. (These two REST connectors must also be configured in the QlikView Desktop application's load script where the APIs are used. See NPrinting REST API Connection through QlikView Desktop.)
Requirements of REST user account:
Creating REST "GET" connections
Note: Replace QlikServer3.domain.local with the name and port of your NPrinting Server
NOTE: replace domain\administrator with the domain and user name of your NPrinting service user account
Creating REST "POST" connections
Note: Replace QlikServer3.domain.local with the name and port of your NPrinting Server
NOTE: replace domain\administrator with the domain and user name of your NPrinting service user account
Make sure to enter the 'Name' (Origin) and 'Value' (the Qlik Sense or QlikView server address) in your POST REST connection only.
Replace https://qlikserver1.domain.local with your Qlik Sense (or QlikView) server address.
Ensure that the 'Origin' Qlik Sense or QlikView server is added as a 'Trusted Origin' on the NPrinting Server computer
NOTE: The information in this article is provided as-is and is to be used at your own discretion. NPrinting API usage requires developer expertise and represents significant customization outside the turnkey NPrinting Web Console functionality. Depending on the tool(s) used, the customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
Scenario: A task with Sybase ASE as the Source and Confluent Kafka as the Destination.
Requirement: Create a new column with a unique identifier "UUID" to identify the message in Confluent Kafka.
Is there any customization in Qlik Replicate that allows the creation of a unique sequencer for each replicated record per table?
The following transformation is provided as is. For more detailed customization assistance, post your requirement in the Qlik Replicate forum, where your knowledgeable Qlik peers and our active Support Agents can help. If you need direct assistance, contact Qlik's Consulting Services.
You can add a new column and hard-code it to a unique (random) value by using the Transform tab in Table Settings.
Example Expression Value:
substr(lower(hex(randomblob(16))),1,8) || '-' || substr(lower(hex(randomblob(16))),9,4) || '-4' || substr(lower(hex(randomblob(16))),13,3) || '-' || substr('89ab', abs(random()) % 4 + 1, 1) || substr(lower(hex(randomblob(16))),17,3) || '-' || substr(lower(hex(randomblob(16))),21,12)
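For readability only, the same expression can be read piece by piece (SQLite syntax, as used by the Replicate expression builder; each randomblob(16) call produces an independent random value, and the comments are illustrative, not part of the expression):

substr(lower(hex(randomblob(16))),1,8)              -- 8 hex characters
|| '-' || substr(lower(hex(randomblob(16))),9,4)    -- 4 hex characters
|| '-4' || substr(lower(hex(randomblob(16))),13,3)  -- literal '4' (UUID version) plus 3 hex characters
|| '-' || substr('89ab', abs(random()) % 4 + 1, 1)  -- variant character: 8, 9, a or b
       || substr(lower(hex(randomblob(16))),17,3)   -- plus 3 hex characters
|| '-' || substr(lower(hex(randomblob(16))),21,12)  -- 12 hex characters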
For more information on transformations, see Defining transformations for a single table/view.
Loading MS Access DB data via Qlik Data Gateway fails when using the Data Manager. The following error is thrown:
Failed to add data
Data could not be added to Data manager. Please verify that all data sources connected to the app are working and try adding the data again.
Loading data through the Data Load Editor works as expected.
Reviewing the ODBC log for additional details highlights the error:
System.Exception: ODBC Wrapper: Unable to execute SQLForeignKeys: [Microsoft][ODBC Driver Manager] Driver does not support this function
This is a driver-related issue and a known limitation.
Use the Data Load Editor instead of the Data Manager.
MS Access is a legacy technology dating back to the 1990s; loading data from a file-based Access database via ODBC into a modern BI tool is considered an outdated solution. Moreover, with larger files, this approach is likely to result in significant performance degradation during data loading.
The MS Access ODBC driver does not support the SQLForeignKeys ODBC function.
You may find that you are unable to connect to Azure SQL Database after enabling TLS v1.2 on the database side, and an error is raised when executing the task on the Talend Management Console side.
java.sql.SQLException: Reason: Login failed due to client TLS version being less than minimal TLS version allowed by the server.
When using the MSSQL components in Talend Studio, Qlik Talend components support two kinds of driver (an explicit choice must be made): the open-source jTDS driver or the official Microsoft JDBC driver.
Using the official MSSQL driver is recommended to ensure greater compatibility. The jTDS driver is not recommended; it has a "deprecated" status but is still supported for users who want to use it and for whom it remains compatible with their databases.
There are two solutions for this issue, as follows.
Solution 1
The jTDS driver prior to 1.2 does not support TLS v1.2. Use the official JDBC driver from Microsoft (switch to the official MSSQL driver to ensure greater compatibility).
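A hedged example of an equivalent connection URL for the Microsoft JDBC driver (host and database names are placeholders; verify the exact properties against the Microsoft documentation):

jdbc:sqlserver://my-example-instance.database.windows.net:1433;databaseName=mydb;encrypt=true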
Solution 2
Add ssl=require to the end of the JDBC connection URL (for example: jdbc:jtds:sqlserver://my-example-instance.c656df8582985.database.windows.net:1433;ssl=require), or add it in the "Additional JDBC parameters" field of the MSSQL components, then re-publish the job and run it on Talend Management Console.
The open-source jTDS driver, specifically earlier versions such as 1.2, does not natively support TLS 1.2. This can lead to connection failures when connecting to SQL Server instances that enforce TLS 1.2, such as Azure SQL Managed Instances or on-premises servers configured to require TLS 1.2 for security reasons.
Microsoft will deprecate Change Data Capture (CDC) components by Attunity. See SQL Server Integration Services (SSIS) Change Data Capture Attunity feature deprecations | microsoft.com for details.
Will this affect Qlik Replicate?
This announcement does not affect Qlik Replicate. It is only relevant to the product "Change Data Capture (CDC) components by Attunity".
Microsoft distributes and provides primary support for this product. Qlik Replicate's functionality will remain the same.
Qlik supports connecting to the SAP HANA database using Qlik's QvOdbcConnectorPackage. This assumes that the SAP HDBODBC driver has been installed on the Qlik system.
To connect to the HANA Database using ODBC drivers:
If the drivers are missing, install SAP HANA CLIENT (see SAP HANA AND QLIK VIEW/SENSE | community.sap.com for details)
From ODBC Data Sources > USER DSN, select HDBODBC and complete your configuration
In Qlik Sense, use the created ODBC DSN to create a new connection
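Once the DSN and the data connection exist, a minimal load script sketch could look like this (the connection, schema, and table names are placeholders):

// Minimal sketch; replace the connection, schema, and table names with your own
LIB CONNECT TO 'HANA_ODBC';          // data connection created from the HDBODBC DSN
MyData:
SQL SELECT * FROM "MYSCHEMA"."MYTABLE";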
Even if the archived redo logs are not deleted, the following error may still occur, causing task failure:
2025-06-19T15:20:29 [SOURCE_CAPTURE ]E: Archived Redo log with the sequence '188055' does not exist, thread 5 [1022318] (oradcdc_thread.c:711)
Two DEST_IDs exist.
ARCHIVED Log Status – Query Execution Results
SEQUENCE#,THREAD#,NAME,DEST_ID,FIRST_TIME,NEXT_TIME,DELETED,COMPLETION_TIME
188055,5,,1,2025-06-19 15:18:43,2025-06-19 15:20:28,YES,2025-06-19 15:20:28
188055,5,dg_osaka,2,2025-06-19 15:18:43,2025-06-19 15:20:28,NO,2025-06-19 15:20:28
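The results above come from v$archived_log; a hedged version of the query for this case (sequence and thread values taken from the error message) is:

SELECT sequence#, thread#, name, dest_id, first_time, next_time, deleted, completion_time
FROM v$archived_log
WHERE sequence# = 188055 AND thread# = 5;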
If multiple DEST_IDs exist, have the Oracle DBA check the configuration and set the correct values for the Archived and Alternate Redo Logs Destination IDs.
If multiple DEST_IDs exist and neither the archived redo logs location identifier nor the alternate archived redo logs destination ID is specified, the Qlik Replicate server will use the lowest existing DEST_ID.
As a result, Qlik Replicate will only use DEST_ID=1, which causes the issue.
The Save button is greyed out when enabling or disabling the Process Isolation setting in Data Gateway. This issue only occurs when the Gateway has been upgraded from older versions.
This must be done on all connectors, even if they are not actively used.
DG-444
The new SAP HANA trigger-based backend DB endpoint, used with the SAP Application DB endpoint, fails when using the new Full Record Mode (V4) triggers.
The logged error:
RetCode: SQL_ERROR SqlState: S1000 NativeError: 362 Message: [SAP AG][LIBODBCHDB DLL][HDBODBC] General error;362 invalid schema name: R4S - SCHEMA_NAME: line 1 col 15 (at pos 14) [1022502]
Failed to get table R4S - SCHEMA.TABLE_NAME definition [1022500]
Upgrade to Qlik Replicate 2024.11 SP05 or higher.
An update to Qlik Replicate was necessary due to how the table list is captured in the SAP source endpoint.
RECOB-10089
The purpose of this article is to address an extraction error that occurs with the Facebook Ads integration when replicating data from the ads table. Users may encounter a 500 error when using the Facebook Ads integration in Stitch.
The error message states:
tap - Status: 500
tap - Response:
tap - {
tap - "error": {
"code": 1,
"message": "Please reduce the amount of data you're asking for, then retry your request"
This error occurs when the request to Facebook's API exceeds its data limit threshold. Facebook imposes this limitation to manage server load and prevent excessive data requests. If Stitch requests too many fields for this table, the request fails with this error. In short, the 500 error you are encountering with the Facebook Ads integration is caused by requesting too much data from Facebook's API.
To resolve this issue, reduce the number of fields selected for the Ads table. This limitation is imposed by Facebook's API, and Stitch has no control over it. The error code 1 in the log stack is a general error code used by Facebook, but in this case the accompanying message provides more specific information: "Please reduce the amount of data you're asking for, then retry your request" clearly indicates that the problem is requesting too much data for the table.
Keep in mind that some 500 errors and code 1 errors from the Facebook Ads API can be intermittent and may resolve on their own during the next extraction attempt. In this specific case, however, reducing the amount of data requested by de-selecting fields for the affected table is the recommended solution.
To address the problem:
This approach should allow the integration to progress further. Currently, this is the only available option to resolve the issue. There is no estimated time for a fix from Facebook's side, so adjusting your field selection is the best course of action.
If the problem persists after reducing the fields, you may need to keep adjusting the field selections until the API accepts the request.
You may encounter an error in which CSV files cannot be imported into Talend Data Preparation datasets; the import fails with an Import Error and an OAuth authentication failed message.
Log
2025-08-13 14:04:21.403 ERROR [user ] 1678370 --- [nio-9999-exec-3] o.t.d.exception.TDPExceptionController : An error occurred. Error : org.talend.dataprep.exception.TDPExceptionFlowControl: Sorry, an unexpected error occurred and we could not complete your last operation. You can continue to use Data Preparation | Context : {}
2025-08-13 14:04:21.416 ERROR [user ] 1678370 --- [nio-9999-exec-5] o.t.d.exception.TDPExceptionController : An error occurred : {cause=null}
org.talend.dataprep.exception.TDPException: 500 Internal Server Error from GET http://1.2.3.4:9999/datasets/9d69c531-b630-4c5e-b2ca-b9cb73f7910d/content
at org.talend.dataprep.processor.ExceptionsConfiguration$ExceptionsConversions.lambda$doWith$1(ExceptionsConfiguration.java:109)
at org.talend.dataprep.conversions.BeanConversionService.convert(BeanConversionService.java:189)
at org.talend.dataprep.command.GenericCommand$ErrorHandler.getTDPExceptionFromDTO(GenericCommand.java:546)
at org.talend.dataprep.command.GenericCommand$ErrorHandler.apply(GenericCommand.java:519)
at org.talend.dataprep.command.GenericCommand$ErrorHandler.apply(GenericCommand.java:495)
at org.talend.dataprep.command.GenericCommand.executeCall(GenericCommand.java:253)
at org.talend.dataprep.command.GenericCommand.execute(GenericCommand.java:265)
at org.talend.dataprep.dataset.adapter.DataInventoryClient.getDataSetContent(DataInventoryClient.java:263)
2025-08-13 14:04:38.385 ERROR [user ] 1678370 --- [nio-9999-exec-4] o.t.d.exception.TDPExceptionController : An error occurred. Error : org.talend.dataprep.exception.TDPExceptionFlowControl: ACL_NOT_FOUND.MESSAGE | Context : {entityId=689c39066827013844aa11d1}
2025-08-13 14:04:38.393 WARN [user ] 1678370 --- [or-http-epoll-8] o.t.dataprep.sharing.SharingWebClient : Sharing client error with : 404 Not Found from GET http://1.2.3.4:9999/sharing/v1/sharingset/preparation_folder/689c39066827013844aa11d1
2025-08-13 14:04:38.621 WARN [user ] 1678370 --- [nio-9999-exec-2] o.t.d.cache.MultiTenantKeyGenerator : No Current tenant in the context we can't add the tenantId into the cache key
2025-08-13 14:04:38.991 ERROR [user ] 1678370 --- [io-9999-exec-10] o.t.d.exception.TDPExceptionController : An error occurred. Error : org.talend.dataprep.exception.TDPExceptionFlowControl: ACL_NOT_FOUND.MESSAGE | Context : {entityId=689c39066827013844aa11d1}
2025-08-13 14:04:38.993 WARN [user ] 1678370 --- [or-http-epoll-8] o.t.dataprep.sharing.SharingWebClient : Sharing client error with : 404 Not Found from GET http://1.2.3.4:9999/sharing/v1/sharingset/preparation_folder/689c39066827013844aa11d1
2025-08-13 14:04:39.826 WARN [user ] 1678370 --- [nio-9999-exec-1] o.t.d.cache.MultiTenantKeyGenerator : No Current tenant in the context we can't add the tenantId into the cache key
2025-08-13 14:05:00.316 ERROR [user ] 1678370 --- [nio-9999-exec-4] o.t.d.exception.TDPExceptionController : An error occurred. Error : org.talend.dataprep.exception.TDPExceptionFlowControl: ACL_NOT_FOUND.MESSAGE | Context : {entityId=3168a305-05d1-41ee-9c9e-b43f18c49c96}
2025-08-13 14:05:00.319 WARN [user ] 1678370 --- [or-http-epoll-9] o.t.dataprep.sharing.SharingWebClient : Sharing client error with : 404 Not Found from GET http://1.2.3.4:9999/sharing/v1/sharingset/dataset/3168a305-05d1-41ee-9c9e-b43f18c49c96
2025-08-13 14:05:00.513 ERROR [user ] 1678370 --- [io-9999-exec-10] o.t.d.exception.TDPExceptionController : An error occurred. Error : org.talend.dataprep.exception.TDPExceptionFlowControl: ACL_NOT_FOUND.MESSAGE | Context : {entityId=3168a305-05d1-41ee-9c9e-b43f18c49c96}
2025-08-13 14:05:00.516 WARN [user ] 1678370 --- [or-http-epoll-9] o.t.dataprep.sharing.SharingWebClient : Sharing client error with : 404 Not Found from GET http://1.2.3.4:9999/sharing/v1/sharingset/dataset/3168a305-05d1-41ee-9c9e-b43f18c49c96
2025-08-13 14:05:01.684 ERROR [user ] 1678370 --- [nio-9999-exec-6] o.t.d.d.s.a.synchronous.SchemaAnalysis : Unable to analyse schema for dataset 3168a305-05d1-41ee-9c9e-b43f18c49c96.
org.springframework.web.client.HttpClientErrorException$Unauthorized: 401 on GET request for "http://localhost:8187/artifacts/t_default/lastVersion": "{"status":401,"title":"Unauthorized"}"
at org.talend.tsd.maven.publisher.controller.ApiClient.invokeAPI(ApiClient.java:579)
at org.talend.tsd.maven.publisher.controller.api.ArtifactsApi.lastVersionWithHttpInfo(ArtifactsApi.java:95)
at org.talend.tsd.maven.publisher.controller.api.ArtifactsApi.lastVersion(ArtifactsApi.java:57)
at org.talend.tsd.dictionary.provider.service.IndexVersionsCache.getLastVersionFromProducer(IndexVersionsCache.java:43)
2025-08-13 14:05:01.701 ERROR [user ] 1678370 --- [nio-9999-exec-6] o.t.d.exception.TDPExceptionController : An error occurred : {}
org.talend.dataprep.exception.TDPException: Sorry, an unexpected error occurred and we could not complete your last operation. You can continue to use Data Preparation
at org.talend.dataprep.exception.TDPException.rethrowOrWrap(TDPException.java:63)
at org.talend.dataprep.dataset.service.analysis.synchronous.SchemaAnalysis.analyze(SchemaAnalysis.java:109)
at org.talend.dataprep.dataset.service.BaseDataSetService.analyzeDataSet(BaseDataSetService.java:112)
at org.talend.dataprep.dataset.service.DataSetService.create(DataSetService.java:443)
Caused by: org.springframework.web.client.HttpClientErrorException$Unauthorized: 401 on GET request for "http://localhost:8187/artifacts/t_default/lastVersion": "{"status":401,"title":"Unauthorized"}"
at org.talend.tsd.maven.publisher.controller.ApiClient.invokeAPI(ApiClient.java:579)
at org.talend.tsd.maven.publisher.controller.api.ArtifactsApi.lastVersionWithHttpInfo(ArtifactsApi.java:95)
at org.talend.tsd.maven.publisher.controller.api.ArtifactsApi.lastVersion(ArtifactsApi.java:57)
at org.talend.tsd.dictionary.provider.service.IndexVersionsCache.getLastVersionFromProducer(IndexVersionsCache.java:43)
at org.talend.tsd.dictionary.provider.service.DictionaryProviderFacade.getBy(DictionaryProviderFacade.java:17)
at org.talend.dataprep.configuration.ExtendedDictionarySnapshotProviderImpl$ExtendedDictionarySnapshotImpl.<init>(ExtendedDictionarySnapshotProviderImpl.java:58)
at org.talend.dataprep.configuration.ExtendedDictionarySnapshotProviderImpl.get(ExtendedDictionarySnapshotProviderImpl.java:47)
at org.talend.dataprep.dataset.service.analysis.synchronous.SchemaAnalysis.analyze(SchemaAnalysis.java:84)
2025-08-13 14:05:01.711 ERROR [user ] 1678370 --- [or-http-epoll-8] o.t.d.d.adapter.DataInventoryClient : Error when create dataSet Purview_ApplicationService-sample from content
org.springframework.web.reactive.function.client.WebClientResponseException$InternalServerError: 500 Internal Server Error from POST http://1.2.3.4:9999/dataset/v1/datasets/
at org.springframework.web.reactive.function.client.WebClientResponseException.create(WebClientResponseException.java:332)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
*__checkpoint ⇢ 500 INTERNAL_SERVER_ERROR from POST http://1.2.3.4:9999/dataset/v1/datasets/ [DefaultWebClient]
Original Stack Trace:
2025-08-13 14:05:01.716 ERROR [user ] 1678370 --- [nio-9999-exec-3] o.t.d.exception.TDPExceptionController : An error occurred : {}
org.talend.dataprep.exception.TDPException: An error has occurred during the import.
at org.talend.dataprep.dataset.adapter.DataInventoryClient.lambda$createDatasetFromContent$4(DataInventoryClient.java:292)
Suppressed: [CIRCULAR REFERENCE: org.springframework.web.reactive.function.client.WebClientResponseException$InternalServerError: 500 Internal Server Error from POST http://1.2.3.4:9999/dataset/v1/datasets/]
Caused by: org.springframework.web.reactive.function.client.WebClientResponseException$InternalServerError: 500 Internal Server Error from POST http://1.2.3.4:9999/dataset/v1/datasets/
at org.springframework.web.reactive.function.client.WebClientResponseException.create(WebClientResponseException.java:332)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
*__checkpoint ⇢ 500 INTERNAL_SERVER_ERROR from POST http://1.2.3.4:9999/dataset/v1/datasets/ [DefaultWebClient]
OAuth Authentication Failed
org.springframework.web.client.HttpClientErrorException$Unauthorized: 401 on GET request for "http://localhost:8187/artifacts/t_default/lastVersion": "{"status":401,"title":"Unauthorized"}"
oidc.semanticservice.secret=5yjRGCv-f5e15374fae358e1-Lp3Q
Since Talend Data Preparation is connected to the dq-dict service for the schema analysis module, this issue is caused by the DQ Dictionary IAM setting: iam/config/clients/dqdict-client.json is configured improperly with the wrong client_secret
{"client_name" : "DQ Dict API Server", "application_type" : "web", "client_id" : "FqtrjyZu7hTsoQ", "client_secret" : "/5TG+15LvXy7DE49z+UdmSPdBhwVBUt8ptVSF2m0Dx5YN4+81PUjfBTv90TG4OSZezR98o5L/byb", "grant_types" : [ "password", "authorization_code" ] }
which is not aligned with the oidc.semanticservice.secret in dq_dict/config/data-quality.properties
oidc.semanticservice.secret=5yjRGCv-f5e15374fae358e1-Lp3Q
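A hedged way to compare the two settings side by side (using the file paths quoted above, on a Linux installation; the goal is simply to confirm whether the values match):

# Show the client secret configured for the IAM dqdict client and the one used by the semantic service
grep '"client_secret"' iam/config/clients/dqdict-client.json
grep 'oidc.semanticservice.secret' dq_dict/config/data-quality.properties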