Loading data from Oracle may fail on a full load with the error:
ORA-01555: snapshot too old: rollback segment number string with name "string" too small
This is an Oracle configuration issue which must be resolved for the task to be able to continue.
In Automatic Undo Management mode, increase the setting of UNDO_RETENTION. Otherwise, use larger rollback segments.
You can verify your current settings:
SHOW PARAMETER UNDO;
SELECT SUM(BYTES)/1024/1024 "MB", TABLESPACE_NAME FROM DBA_FREE_SPACE GROUP BY TABLESPACE_NAME;
Verify how large the problematic table is and what the current settings are. Then increase the sizes as per your findings.
Oracle references:
http://www.dba-oracle.com/t_ora_01555_snapshot_old.htm
https://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:275215756923
http://www.dba-oracle.com/oracle_tips_rollback_segments.htm
The error is caused by rollback records needed by a reader being overwritten by other writers.
A log stream from a SQL source was being loaded into a Microsoft Azure Databricks Delta target, but the process failed. The error occurred during a full load of a table:
00007080: 2023-02-20T19:32:08 [TARGET_LOAD ]E: Failed (retcode -1) to execute statement: 'COPY INTO `raw_iso_les`.`text_translation` FROM(SELECT cast(_c0 as INT) as `TextID`, cast(_c1 as INT) as `LanguageID`, _c2 as `Micro`, _c3 as `Short`, _c4 as `Medium`, _c5 as `Extended`, _c6 as `Text`, _c7 as `MicroAudioURL`, _c8 as `ShortAudioURL`, _c9 as `MediumAudioURL`, _c10 as `LongAudioURL`, _c11 as `TextAudioURL`, _c12 as `SmallIconURL`, _c13 as `MediumIconURL`, _c14 as `LargeIconURL`, cast(_c15 as INT) as `OrgChecksum`, cast(_c16 as INT) as `CustChecksum`, cast(_c17 as INT) as `ReferenceID`, cast(_c18 as TIMESTAMP) as `LastUpdateOn`, _c19 as `LastUpdatedBy`, cast(_c20 as TIMESTAMP) as `CreatedOn`, _c21 as `CreatedBy`, cast(_c22 as BOOLEAN) as `Active`, cast(_c23 as TIMESTAMP) as `LastDeleteOn`, _c24 as `LastDeletedBy`, cast(_c25 as TIMESTAMP) as `LastReactivateOn`, _c26 as `LastReactivatedBy`, cast(_c27 as INT) as `ArchiveID`, cast(_c28 as TIMESTAMP) as `LastArchiveOn`, _c29 as `LastArchivedBy`, cast(_c30 as TIMESTAMP) as `LastRestoreOn`, _c31 as `LastRestoredBy`, cast(_c32 as INT) as `RowVersionStamp` from 'abfss://XXXX') FILEFORMAT = CSV FILES = ('/X/X/X/X/12/X.csv.gz') FORMAT_OPTIONS('nullValue' = 'attrep_null', 'multiLine'='true') COPY_OPTIONS('force' = 'true')' [1022502] (ar_odbc_stmt.c:4906)
00007080: 2023-02-20T19:32:08 [TARGET_LOAD ]E: RetCode: SQL_ERROR SqlState: HY000 NativeError: 35 Message: [Simba][Hardy] (35) Error from server: error code: '0' error message: 'org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3805190.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3805190.0 (TID 32173534) (10.139.64.145 executor 2621): com.univocity.parsers.common.TextParsingException: java.lang.ArrayIndexOutOfBoundsException - 20480
Identified line separator characters in the parsed content. This may be the cause of the error. The line separator in your parser settings is set to '[lf]'. Parsed content: ... [1022502] (ar_odbc_stmt.c:4913)
00007080: 2023-02-20T19:32:08 [TARGET_LOAD ]E: Failed to copy data of file x\x.csv to database [1022502] (cloud_imp.c:6497)
To resolve the issue, upgrade Qlik Replicate to the November 2022 (2022.11) release and reload the table. This resolves the current failure and prevents the table from failing again in the future.
Qlik Replicate
Microsoft Azure Databricks Delta Target
To connect to Snowflake using ODBC with keypair authentication, you can use the following connection string format:
Driver={SnowflakeDSIIDriver};Server=<account_name>.<region_name>.snowflakecomputing.com;User=<username>;PrivateKey=<path_to_private_key>;Db=<database_name>;Schema=<schema_name>
Each part of the connection string represents:
Note that you may also need to include additional connection parameters, such as Warehouse, Role, and SSL. Here's an example connection string that includes these parameters:
Driver={SnowflakeDSIIDriver};Server=<account_name>.<region_name>.snowflakecomputing.com;User=<username>;PrivateKey=<path_to_private_key>;Db=<database_name>;Schema=<schema_name>;Warehouse=<warehouse_name>;Role=<role_name>;SSL=on
Replace the placeholders in the connection string with the appropriate values for your Snowflake account and keypair authentication setup.
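As an illustration of how the pieces fit together, the string can also be assembled programmatically. The following Python sketch is hypothetical (the account, key path, warehouse, and role values are placeholders, not real settings); an ODBC connection string is simply key=value pairs joined with semicolons:

```python
# Hypothetical sketch: assembling a Snowflake ODBC connection string from its
# parts. All values below are placeholders, not real account details.
def build_conn_str(params):
    """Join key=value pairs with ';' to form an ODBC connection string."""
    return ";".join(f"{k}={v}" for k, v in params.items())

conn_str = build_conn_str({
    "Driver": "{SnowflakeDSIIDriver}",
    "Server": "myaccount.eu-west-1.snowflakecomputing.com",
    "User": "replicate_user",
    "PrivateKey": "C:/keys/rsa_key.p8",
    "Db": "MYDB",
    "Schema": "PUBLIC",
    "Warehouse": "MYWH",
    "Role": "SYSADMIN",
    "SSL": "on",
})
print(conn_str)
```

A driver manager such as pyodbc could then be handed this string (for example, pyodbc.connect(conn_str)); that usage is an assumption of this sketch, not a Qlik requirement.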
Qlik Replicate
Snowflake Target
Due to a product defect, users deleted in mid-March 2023 might still have a license assigned to them.
The issue was solved on March 21st, and deleting users correctly deallocates licenses.
Qlik is working on finding a solution for removing licenses still allocated to users that were deleted during this period for both Qlik Sense Enterprise and Qlik Sense Business subscriptions.
To speed things up, customers on Qlik Sense Enterprise can import and run the attached automation.
Qlik Replicate tasks error out, and tables are suspended.
RetCode: SQL_ERROR SqlState: 22007 NativeError: 100035 Message: Timestamp '0000-00-00 00:00:00.000000' is not recognized Failed (retcode -1) to execute statement: 'INSERT INTO "XXXXX_CTRL"."public_refer__ct"
Qlik Replicate
Snowflake Endpoint
'0000-00-00 00:00:00' is not a valid timestamp in Snowflake. A transformation will need to be added to change this to your desired value.
Here is an example, in this case we are replacing the timestamp to '1970-01-01 00:00:00':
coalesce(
CASE WHEN substr($AR_M_SOURCE_COLUMN_DATA,1,4) = '0000'
THEN '1970-01-01 00:00:00'
WHEN substr($AR_M_SOURCE_COLUMN_DATA,1,4) = ''
THEN '1970-01-01 00:00:00'
ELSE
$AR_M_SOURCE_COLUMN_DATA
END
,$AR_M_SOURCE_COLUMN_DATA)
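For clarity, here is a hypothetical Python sketch of the same logic: any value whose first four characters are '0000' (or that is empty) is replaced with '1970-01-01 00:00:00', and anything else passes through unchanged.

```python
# Hypothetical Python equivalent of the transformation above: zero or empty
# timestamps become '1970-01-01 00:00:00'; valid values pass through unchanged.
def fix_timestamp(value):
    if value is None:
        return None  # mirrors coalesce(): a NULL input stays NULL
    if value[:4] in ("0000", ""):
        return "1970-01-01 00:00:00"
    return value
```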
Update operations on NVARCHAR(MAX) columns cause the values to become NULL when the target is Microsoft Azure SQL Database.
Qlik Replicate 2021.11.0.165
Microsoft Azure SQL Database
The behavior was resolved in Replicate 2022.11.0.475
Upgrade to Qlik Replicate sp03-2022.11.0.475 or later.
Case 00072614
Working with an Amazon RDS for Oracle endpoint as a source, we might encounter an issue where CDC doesn't start and fails with the following error:
Cannot create Oracle directory name 'ATTUREP_XXXXXX_XXX' with path '/XXXXX/log/arch'
This is caused by an access requirement that is not yet documented in the User Guide. Execute the following grant statements to give the Qlik user read-only access to the online and archive redo logs:
grant read on directory ONLINELOG_DIR to qlik;
grant read on directory ARCHIVELOG_DIR to qlik;
or
exec rdsadmin.rdsadmin_util.grant_sys_object('ONLINELOG_DIR','QLIK','READ');
exec rdsadmin.rdsadmin_util.grant_sys_object('ARCHIVELOG_DIR','QLIK','READ');
Where qlik/QLIK is the user specified in the source endpoint connection.
The Apply Exceptions table is a Qlik Replicate Control Table created on the target endpoint when the corresponding table is selected in the Control Tables tab.
You may notice that you cannot turn off the Apply Exceptions in the Qlik Replicate control table settings. The option to do so is greyed out and cannot be unchecked. See img 01.
img 1
Change Processing errors are recorded in the attrep_apply_exceptions table. The data in this table is never deleted. For more information on the Apply Exceptions table, please review attrep_apply_exceptions.
Additional tables and data take up space and resources in the database, so some people prefer to opt out of having this table for a Replicate task. This can be done with some additional configuration: export the task JSON and, under "common_settings", add "exception_table_enabled": false. The first block below shows the default exported settings; the second shows the same section with the flag added.
"common_settings": {
"support_lobs": false,
"change_table_settings": {
"handle_ddl": false,
"header_columns_settings": {
}
},
"full_load_enabled": false,
"audit_table_settings": {
},
"dr_settings": {
},
"statistics_table_settings": {
},
"bidi_table_settings": {
},
"task_uuid": "c24fe2e3-9d04-334c-80c4-35f4b349f9e4",
"status_table_settings": {
},
"status_table_enabled": true,
"suspended_tables_table_settings": {
},
"suspended_tables_table_enabled": true,
"history_table_settings": {
},
"history_table_enabled": true,
"exception_table_settings": {
},
"recovery_table_settings": {
},
"data_batching_settings": {
},
"data_batching_table_settings": {
},
"log_stream_settings_depricated": {
},
"ddl_history_table_settings": {
},
"customized_charset_settings": {
"validation": {
"sub_char": 0
}
}
}
},
"common_settings": {
"exception_table_enabled": false,
"support_lobs": false,
"change_table_settings": {
"handle_ddl": false,
"header_columns_settings": {
}
},
"full_load_enabled": false,
"audit_table_settings": {
},
"dr_settings": {
},
"statistics_table_settings": {
},
"bidi_table_settings": {
},
"task_uuid": "c24fe2e3-9d04-334c-80c4-35f4b349f9e4",
"status_table_settings": {
},
"status_table_enabled": true,
"suspended_tables_table_settings": {
},
"suspended_tables_table_enabled": true,
"history_table_settings": {
},
"history_table_enabled": true,
"exception_table_settings": {
},
"recovery_table_settings": {
},
"data_batching_settings": {
},
"data_batching_table_settings": {
},
"log_stream_settings_depricated": {
},
"ddl_history_table_settings": {
},
"customized_charset_settings": {
"validation": {
"sub_char": 0
}
}
}
},
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
The Qlik Replicate server service in Windows used to be named Attunity Replicate Server.
This name has changed to Qlik Replicate Server. The change was introduced in the 2022.11 release.
If you are running a Windows Cluster and are monitoring Qlik Replicate through it, or if you have deployed third-party applications monitoring the service uptime, these applications must be modified to look for the correct service name.
2021.11 and later
In this article, we address best practices through a questionnaire for Qlik Replicate administrators and architects before an environment goes live in production. Having answers to these questions is helpful for a successful deployment and sustainable implementation.
Content:
How To Get Started with Qlik Replicate
This Techspert Talks session addresses:
Resources:
Optimizing Performance for Qlik Sense Enterprise
Qlik Sense Enterprise deployment examples
Grafana Loki | Grafana Loki documentation
Portainer for Docker management
Selenium IDE for user session replay
When using Qlik Cloud with a custom IdP and a user is removed from the IdP, will that change be propagated into Qlik Cloud?
The answer depends on the IdP used and whether provisioning has been enabled.
If your current IdP does not support SCIM, changes made on the IdP side will not be propagated into Qlik Cloud, so the user will not be flagged as inactive or removed on your tenant.
If you are using Azure AD* and the provisioning has been enabled, then the changes will be propagated as explained in Provisioning users and groups using SCIM | Qlik Cloud Help.
SCIM provides a standardized way for IT systems to communicate and exchange user identity data. With SCIM, you can:
Create a new user in one system and automatically provision the same user in another system.
Update user information in one system and reflect that change in all other systems.
Delete a user in one system and automatically deprovision the user in all other systems.
By standardizing the user management process, SCIM makes managing users across different IT systems easier and reduces the likelihood of errors or discrepancies in user information.
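As a hypothetical illustration of the deprovisioning step (the endpoint layout and token are placeholders, not Qlik's actual SCIM implementation), a SCIM 2.0 deactivation is typically a PATCH request that sets the user's active attribute to false:

```python
# Hypothetical sketch of a SCIM 2.0 deactivation request; the base URL,
# bearer-token placeholder, and user id are illustrative only.
import json

def build_scim_deactivate(base_url, user_id):
    """Return (url, headers, body) for a SCIM PATCH setting active=false."""
    url = f"{base_url}/scim/v2/Users/{user_id}"
    headers = {
        "Content-Type": "application/scim+json",
        "Authorization": "Bearer <api-token>",  # placeholder token
    }
    body = json.dumps({
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "replace", "value": {"active": False}}],
    })
    return url, headers, body
```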
Provisioning users and groups using SCIM - Qlik | Help
SCIM: System for Cross-domain Identity Management (simplecloud.info)
SCIM synchronization with Azure Active Directory - Microsoft Entra | Microsoft Learn
*SCIM connectors that work with Qlik Cloud (as of the date of creation of this article - 15th March 2023) :
A newly designed Replication task (with default settings) reported errors in the first run:
2023-03-15T21:02:28 [STREAM_COMPONEN ]E: Replication task cannot support lobs, since log stream staging task does not [1020480] (ar_cdc_channel.c:261)
2023-03-15T21:02:28 [METADATA_MANAGE ]E: Allocating utility stream component failed [1020480] (metadatamanager.c:837)
2023-03-15T21:02:28 [METADATA_MANAGE ]E: Cannot create the source utility component [1020480] (metadatamanager.c:753)
2023-03-15T21:02:28 [TASK_MANAGER ]E: Creating Metadata Manager's utility components failed [1020480] (replicationtask.c:3893)
In general, this error occurs when "Replicate LOB columns" is deselected in the Log Stream Staging task:
RECOB-6036, #00075812, #00054094
Replicate 2022.5 (eg PR02, PR03, PR04)
Replicate 2022.11 (eg PR03)
When a data mart has referenced dimensions with one or more local dimensions, generating instructions for the data mart fails with the below error message:
SYS,GENERAL_EXCEPTION,Object reference not set to an instance of an object
Information provided on this defect is given as-is at the time of documenting. For up-to-date information, please review the most recent Release Notes, with RECOB-6639 for reference.
Environment
Fix Version:
2022.05-SP08 (build 2022.5.583)
Cause
Jira issue: RECOB-6639
Qlik Compose will return the below error when deleting a workflow or task that was created using a non-printable character or one of the following characters: /\,&"#$@=^*+'`~?|<>:;[]{} as well as the percent sign.
REPO-E-OBJNOTFND, Object not found. name <TASK NAME>, type FlowDto.
Error handling in Qlik Compose version 2021.8.0.569 differs from how it is handled in version 2021.8.0.734 and higher.
In version 2021.8.0.569, attempting to create a workflow with a special character succeeds, but Qlik Compose displays the message below. As the message says, the workflow cannot be edited or deleted once created.
"REPO-E-OBJNOTFND, Object not found" and workflow will be created, the workflow cannot be edited or deleted
However, in version 2021.8.0.734 and higher, attempting to create a workflow with a special character will fail with the below error message and the workflow will NOT be created:
"COMPOSE-E-TASKINV, Invalid task name “Workflow/1”. Task names cannot contain the following characters: /\,&"#$@=^*+'`~?|<>:;[]{} as well as percentage sign and all non-printable characters (below 0x20)"
So workflows that were created in version 2021.8.0.569 with a special character can never be deleted from the UI, even after upgrading to later versions. The fix for this issue is to delete the special character in the SQLite file manually. See below for the workaround.
For more information on the unsupported characters please check this link: Qlik User guide.
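The validation rule quoted in the error message can be sketched as follows; the character set is taken directly from that message, and ord(c) < 0x20 covers the non-printable characters. This is an illustrative check, not Compose's actual implementation.

```python
# Sketch of the task-name validation rule from the error message above:
# the listed characters, the percent sign, and non-printables are rejected.
FORBIDDEN = set('/\\,&"#$@=^*+\'`~?|<>:;[]{}%')

def is_valid_name(name):
    """Return True if the name contains no forbidden or non-printable chars."""
    return bool(name) and not any(c in FORBIDDEN or ord(c) < 0x20 for c in name)
```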
Qlik Compose 2021.8.0.569
From Qlik Sense February 2023 onwards, apps listed in the Qlik Sense Management Console (Apps view) are now represented with clickable links. Clicking them will open the app directly on the hub.
This means the app names are no longer plain grey text but are now formatted as links (blue, underline). See Apps for details.
Example:
Qlik Sense Enterprise on Windows February 2023 and above
External program tasks in Qlik Sense are simply another task type. With external program tasks, you can trigger external processes, such as scripts or .exe files. Task chaining is supported and you can combine reload tasks with external program tasks.
The functionality is exposed in the Qlik Sense Management Console beginning with the May 2021 release. See Creating and editing external program tasks.
In versions prior to May 2021, you can create external program tasks through the Qlik Repository Service (QRS) API, as shown below. The task runs in a command shell process behind the scenes, which in turn means you can run more or less anything you can run on the command prompt. Once you define your command and create a new task, you can both trigger and chain these types of tasks in the QMC.
In this example, we will be using a PowerShell script, which means that powershell.exe needs to be in your PATH variable (by default C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe). Again, you can run any command that you can run in the Windows command prompt, so these examples are only for demo purposes.
Before we get started, we need to understand that IF you are calling scripts in your command, the script needs to be accessible by the user running the Qlik services. In this example, there’s a folder on the root C: drive called ‘externalTasksExample’, in this folder, there is a script.ps1 file which simply creates a new file in the same folder.
The file looks like:
Start-Transcript C:\externalTasksExample\transcript.log
$date = Get-Date
Write-Host $date
Stop-Transcript
You can run the file from the command prompt by doing the following:
C:\externalTasksExample>C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File script.ps1
When the command runs successfully, you should have a new file created in the externalTasksExample folder called “transcript.log”.
Now we will delete this transcript.log and try to run that same PowerShell file using an external program task in Qlik Sense. The first thing we need to do is actually create the task.
The endpoint is:
POST /qrs/externalprogramtask
With the body being:
{
  "path": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
  "parameters": "C:\\externalTasksExample\\Script.ps1",
  "name": "Task Name",
  "taskType": 1,
  "enabled": true,
  "taskSessionTimeout": 1440,
  "maxRetries": 0,
  "privileges": null,
  "schemaPath": "ExternalProgramTask"
}
Here is a sample script:
$hdrs = @{}
$hdrs.Add("X-Qlik-xrfkey","12345678qwertyui")
$hdrs.Add("X-Qlik-User","UserDirectory=DOMAIN;UserId=Administrator")
$hdrs.Add("content-type","application/json; charset=UTF-8")
$body = '{
"path": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
"parameters": "C:\\externalTasksExample\\Script.ps1",
"name": "Task Name",
"taskType": 1,
"enabled": true,
"taskSessionTimeout": 1440,
"maxRetries": 0,
"privileges": null,
"schemaPath": "ExternalProgramTask"
}'
$cert = Get-ChildItem -Path "Cert:\CurrentUser\My" | Where {$_.Subject -like '*QlikClient*'}
$url = "https://qlikserver1.domain.local:4242/qrs/externalprogramtask?xrfkey=12345678qwertyui"
Invoke-RestMethod -Uri $url -Method Post -Headers $hdrs -Body $body -Certificate $cert
This generates a response like:
id : 1353eb7c-44c7-422b-9e73-c2667011a1b0
createdDate : 2022-06-13T19:35:40.382Z
modifiedDate : 2022-06-13T19:35:40.382Z
modifiedByUserName : DOMAIN\administrator
customProperties : {}
path : C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
parameters : C:\externalTasksExample\Script.ps1
qlikUser :
operational : @{id=5ececc2a-dc6c-4769-b424-c325faddf74c; lastExecutionResult=; nextExecution=1753-01-01T00:00:00.000Z; privileges=}
name : Task Name
taskType : 1
enabled : True
taskSessionTimeout : 1440
maxRetries : 0
tags : {}
privileges :
schemaPath : ExternalProgramTask
Once you post this into Qlik, you should be able to go to the QMC and see the following task in the task list.
You should be able to select the task, click 'Start', and the same file should appear in 'C:\externalTasksExample'. You are now running an external task. Remember, you don't have to use powershell.exe; you can trigger anything you want as long as it can be run from the command line.
How to configure Postman (desktop app) to connect to Qlik Sense
An automation will not automatically rerun or retry if it fails. You can, however, rerun a failed automation by using an additional automation.
Caution should be taken when implementing this solution to prevent running endless loops or reruns.
Before a solution can be implemented you must decide if the rerun should:
The block Retry Automation Run can be used to retry a specific run of the automation using the same inputs (if any). This block is useful if the monitored automation has its run mode set to Webhook or Triggered. These runs can also be re-run manually using the Retry button in the automation's run history.
The block Run Automation can be used to initiate a new run of the automation. This block is useful if the automation that fails has the run mode set to Scheduled, and the inputs of the failed run wouldn’t be unique.
The following example shows how a failed automation can be retried using the Retry Automation Run. Depending on the specifics of the automation you want to rerun, you may want to use the Run Automation block instead.
The value for stopTime must be transformed using the Date formula so that the value may be used for comparisons. The output format must be changed to Unix format (U):
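As a hypothetical Python illustration of that transformation, an ISO 8601 stopTime is parsed and converted to Unix seconds so two run times can be compared numerically:

```python
# Hypothetical sketch of the Date-to-Unix transformation: parse an ISO 8601
# stopTime and return Unix seconds, suitable for numeric comparison.
from datetime import datetime, timezone

def to_unix(stop_time):
    """Convert e.g. '2023-03-15T21:02:28Z' to Unix seconds."""
    dt = datetime.fromisoformat(stop_time.replace("Z", "+00:00"))
    return int(dt.astimezone(timezone.utc).timestamp())
```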