May 15, 2022 1:08:46 PM
The following release notes cover the versions of Qlik Compose released in May 2022.
Skipping versions: Customers who are not upgrading directly from the previous version are strongly encouraged to review the release notes for all versions higher than their currently installed version.
This section describes various upgrade scenarios and considerations.
Compose upgrade path
Direct upgrade is supported from Compose May 2021 or Compose August 2021 only. Customers upgrading from earlier Compose versions need to first upgrade to one of the aforementioned versions and then to Compose May 2022.
Compose for Data Warehouses upgrade path
Compose for Data Warehouses has been superseded by Qlik Compose. Existing Compose for Data Warehouses customers can upgrade to Qlik Compose as described below.
For information on the procedure for upgrading from Compose for Data Warehouses to Compose February 2021, see the Qlik Compose February 2021 release notes.
Upgrading from Compose for Data Warehouses 6.6.1 (September 2020) or 7.0 (November 2020):
Upgrade to Compose February 2021.
Upgrade to Compose May 2021.
Upgrade to Compose May 2022.
Upgrading from unsupported Compose for Data Warehouses versions
Customers upgrading from Compose for Data Warehouses 6.5 or 6.6:
Upgrade to Compose for Data Warehouses 6.6.1.
Upgrade to Compose February 2021.
Upgrade to Compose May 2021.
Upgrade to Compose May 2022 (including SRs).
Customers upgrading from Compose for Data Warehouses 6.3 or 6.4:
Upgrade to Compose for Data Warehouses 6.5.
Upgrade to Compose for Data Warehouses 6.6.1.
Upgrade to Compose February 2021.
Upgrade to Compose May 2021.
Upgrade to Compose May 2022 (including SRs).
Customers upgrading from Compose for Data Warehouses 3.1 should contact Qlik Support.
Compose for Data Lakes upgrade path
For information on upgrading from Compose for Data Lakes, see Migrating from Compose for Data Lakes (page 5).
ETL script enhancements
After upgrading, you need to regenerate your tasks in order to benefit from the latest enhancements to the task ETL scripts.
Upgrade scripts
After upgrading, depending on the version from which you upgraded, you might need to generate upgrade scripts and run them in your databases.
Upgrade script 1
Should be run only if upgrading from versions earlier than Compose August 2021.
Various performance enhancements require modifications to the internal Compose tables in the following data warehouses:
If you have Data Warehouse projects configured to use any of the above databases, you need to generate an upgrade script and then run it in each of the relevant databases.
Running the script in Google Cloud BigQuery and Amazon Redshift databases will delete historical monitoring metadata.
Upgrade script 2
Should be run only if upgrading from versions earlier than Compose August 2021 Service Release 02.
This upgrade script must be run after upgrading, as the database structure has been slightly modified to correctly report the error mart for each source (as part of the Uniform source consolidation (page 9) feature).
Upgrade script 3
Should be run only if upgrading from versions earlier than Compose August 2021 SP12, and only if you have projects with a Microsoft Azure Synapse Analytics data warehouse (or intend to create such projects in the future).
Generating and running the upgrade scripts
From the Start menu, open the Compose Command Line console and run the following command:
ComposeCli.exe connect
Run the following command:
ComposeCli.exe generate_upgrade_scripts
For each of your projects, the CLI output will tell you the name of the script and its location. Each script has a different name, consisting of the script identifier, the project name, and a timestamp.
Example of Upgrade script 1:
C:\Program Files\Qlik\Compose\data\projects\Project_1\ddl-scripts\ComposeUpgradeFrom2021_5To2021_8Project_1__210714142110.sql
Example of Upgrade script 2:
C:\Program Files\Qlik\Compose\data\projects\Project_2\ddl-scripts\ComposeUpgradeFrom2021_8SP4To2021_8SP10Project_2__220114142110.sql
Example of Upgrade script 3:
C:\Program Files\Qlik\Compose\data\projects\Project_3\ddl-scripts\ComposeUpgradeFrom2021_8SP10To2021_8SP12Project_3__220518142110.sql
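The naming convention described above can be unpacked mechanically. The following minimal shell sketch (assuming a POSIX shell; the filename is copied from the Upgrade script 1 example, and the parsing logic is purely illustrative, not part of Compose) separates the trailing timestamp from the identifier-plus-project prefix:

```shell
#!/bin/sh
# Example generated script name (taken from the Upgrade script 1 example above).
name="ComposeUpgradeFrom2021_5To2021_8Project_1__210714142110.sql"

# The timestamp is the run of digits between the double underscore
# and the .sql extension.
base="${name%.sql}"        # strip the extension
timestamp="${base##*__}"   # text after the last double underscore
prefix="${base%__*}"       # script identifier + project name

echo "timestamp: $timestamp"
echo "identifier+project: $prefix"
```

Note that the script identifier and the project name are concatenated without a delimiter, so they can only be separated if one of them is already known.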
Access each of your databases using SQL Workbench or a similar tool and run the script(s).
When the script(s) complete successfully, generate and run your tasks in Compose.
Existing Compose for Data Warehouses customers who want to create and manage Data Warehouse projects only in Qlik Compose can use their existing license. Similarly, existing Compose for Data Lakes customers who want to create and manage Data Lake projects only in Qlik Compose can use their existing license.
Customers migrating from Qlik Compose for Data Warehouses or Qlik Compose for Data Lakes, and who want to create and manage both Data Warehouse projects and Data Lakes projects in Qlik Compose, will need to obtain a new license. Customers upgrading from Compose February 2021 can continue using their existing license.
Note that the license is enforced only when trying to generate, run, or schedule a task (via the UI or API). Other operations, such as Test Connection, may succeed even if you do not have an appropriate license.
Compose for Data Lakes has been superseded by Qlik Compose. Existing Compose for Data Lakes customers can migrate their projects from Qlik Compose for Data Lakes to Qlik Compose. You can migrate both your project definitions and your data, although the latter is only required if you need to migrate production data.
For migration instructions, see Qlik Compose August 2021 Release notes.
Migration can be performed from Compose for Data Lakes 6.6 only.
Relevant to Compose May 2022 SR1 only. Requires Replicate November 2022 or later.
From Compose May 2022 SR1, if you use Replicate November 2022 or later to land data in Databricks, only the Replicate Databricks (Cloud Storage) target endpoint can be used. If you are using Replicate May 2022, you can continue using the existing Databricks target endpoints.
Qlik Replicate is required for landing data into the data warehouse or storage while Qlik Enterprise Manager allows you to monitor and control Compose tasks running on different servers. This section lists the supported versions for each of these products.
Compose May 2022 Initial Release
Compose May 2022 Initial Release is compatible with the following Replicate and Enterprise Manager versions:
Compose May 2022 Service Release 1
Compose May 2022 Service Release 01 is compatible with the following Replicate and Enterprise Manager versions:
The following section describes the enhancements and new features introduced in Qlik Compose May 2022.
The "What's new?" section is cumulative, meaning that it also describes features that were already released as part of Compose August 2021 service/patch releases. This is because customers upgrading from initial release versions might not be aware of features that were released in interim service releases.
The following section describes the enhancements and new features introduced in Qlik Compose Data Warehouse projects.
Keeping changes in the Change Tables
This version introduces a new Keep in Change Tables option in the landing zone connection settings:
When you select the Keep in Change Tables option, the changes are kept in the Change Tables after they are applied (instead of being deleted or archived). This is useful as it allows you to:
Referenced dimensions
This version introduces support for referencing dimensions. To facilitate this new functionality, a new Reference selected dimensions option has been added to the Import Dimensions dialog which, together with the toolbar button, has been renamed to Import and Reference Dimensions.
The ability to reference dimensions improves data mart design efficiency and execution flexibility by facilitating the reuse of data sets. Reuse of dimension tables across data marts allows you to break up fact tables into smaller units of work for both design and data loading, while ensuring consistency of data for analytics.
Data mart enhancements
Data mart adjust
This version introduces the following enhancements:
Data mart reloading
This version introduces the ability to reload the data mart or parts of the data mart without dropping and recreating it, thereby eliminating costly and lengthy reloading of the data mart while maximizing data availability. Such operations should usually be performed after a column with history has been added by the automatic adjust operation.
To facilitate this, a new mark_reload_datamart_on_next_run CLI command has been introduced. It allows users to mark dimensions and facts to be reloaded on the next data mart run. These can be either specific dimensions and facts or multiple dimensions and facts (from either the same data mart or different data marts) specified using a CSV file.
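As a sketch of how such a CSV might be prepared: the column layout, the object names, and the --csv parameter below are illustrative assumptions only (consult the Compose online help for the documented format); the command is echoed rather than executed, since ComposeCli.exe is only available on a Compose server.

```shell
#!/bin/sh
# Hypothetical CSV listing the data mart objects to reload on the next run.
# The column names and values are assumptions for illustration, not the
# documented Compose format.
cat > reload_list.csv <<'EOF'
data_mart,object_type,object_name
DM_Sales,dimension,Customer
DM_Sales,fact,Orders
DM_Finance,dimension,Account
EOF

# The invocation below is shown as an echoed command only; the --csv
# parameter name is an assumption for illustration.
echo ComposeCli.exe mark_reload_datamart_on_next_run --project MyProject --csv reload_list.csv
```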
Microsoft Azure Synapse Analytics Enhancements
A number of changes related to statistics have been implemented. In addition, several statements are now tagged with an identifier label for troubleshooting 'problem queries' and identifying possible ways to optimize database settings. Moreover, the addition of labels to ELT queries enables fine-grained workload management and workload isolation via Synapse WORKLOAD GROUPS and CLASSIFIERS.
The identifier labels are as follows:
| Table type | Tag |
| Hubs | CMPS_HubIns |
| Satellites | CMPS_SatIns |
| Type 1 dimensions | CMPS_<data mart name>_DimT1_Init / CMPS_<data mart name>_DimT1_Incr |
| Type 2 dimensions | CMPS_<data mart name>_DimT2_Init / CMPS_<data mart name>_DimT2_Incr |
| Transactional facts | CMPS_<data mart name>_FctTra_Init / CMPS_<data mart name>_FctTra_Incr |
| State-oriented facts | CMPS_<data mart name>_FctStO_Init |
| Aggregated facts | CMPS_<data mart name>_FctAgg_Init |
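These labels can be consumed by Synapse workload management. As a hedged sketch (the workload group name, resource percentages, and user name below are illustrative assumptions; WLM_LABEL, WORKLOAD_GROUP, and MEMBERNAME are standard Synapse dedicated SQL pool DDL parameters), a classifier could route the incremental Type 2 dimension loads of a data mart named DM1 into a dedicated workload group:

```sql
-- Illustrative workload group; the name and resource percentages are assumptions.
CREATE WORKLOAD GROUP wg_compose_elt
WITH (
    MIN_PERCENTAGE_RESOURCE            = 20,
    CAP_PERCENTAGE_RESOURCE            = 60,
    REQUEST_MIN_RESOURCE_GRANT_PERCENT = 5
);

-- Route statements tagged with the Compose label for DM1's incremental
-- Type 2 dimension loads (see the tag table above) to that group.
-- 'compose_user' is an assumed login name.
CREATE WORKLOAD CLASSIFIER wc_compose_dimt2_incr
WITH (
    WORKLOAD_GROUP = 'wg_compose_elt',
    MEMBERNAME     = 'compose_user',
    WLM_LABEL      = 'CMPS_DM1_DimT2_Incr'
);
```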
Uniform source consolidation
Uniform source consolidation, as its name suggests, allows you to ingest data from multiple sources into a single, consolidated entity.
To enable uniform source consolidation configuration, a new Consolidation tab has been added to the data warehouse task settings.
When the Consolidate uniform sources option is enabled, Compose will read from the selected data sources and write the data to one consolidated entity. This is especially useful if your source data is managed across several databases with the same structure, as instead of having to define multiple data warehouse tasks (one for each source), you only need to define a single task that consolidates the data from the selected data sources.
Consolidation tab showing selected data sources
Environment variables
Environment variables allow developers to build more portable expressions, custom ETLs, and Compose configurations, which is especially useful when working with several environments such as DTAP (Development, Testing, Acceptance and Production). Different environments (for example, development and production) often have environment-specific settings such as database names, schema names, and Replicate task names. Variables allow you to easily move projects between different environments without needing to manually configure the settings for each environment. This is especially useful if many settings are different between environments. For each project, you can use the predefined environment variables or create your own environment variables.
Excluding environment variables from export operations
An option has been added to replace environment-specific settings with the defaults when exporting projects (CLI) or creating deployment packages.
To facilitate this functionality, the --without_environment_specifics parameter was added to the export_project_repository CLI and an Exclude environment variable values option was added to the Create Deployment Package dialog.
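Putting the two CLI pieces named above together, an export might look like the following sketch. The export_project_repository command and the --without_environment_specifics parameter come from these release notes; "MyProject" is an illustrative project name, and the command is echoed rather than executed since ComposeCli.exe is only available on a Compose server.

```shell
#!/bin/sh
# Sketch: export a project's repository with environment-specific values
# replaced by defaults. "MyProject" is an illustrative project name.
cmd='ComposeCli.exe export_project_repository --project MyProject --without_environment_specifics'

# Printed here; on a Compose server you would run it directly from the
# Compose Command Line console.
echo "$cmd"
```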
You can now configure data profiling and data quality rules when using Google Cloud BigQuery as a data warehouse.
In previous versions, attempting to create several attributes with the same name but a different case would result in a duplication error. Now, such attributes will be created with an integer suffix that increases incrementally for each attribute added with the same name. For example: Sales, SALES_01, and Sales_02.
You can now associate a Replicate task that writes to a Hadoop target with the Compose landing.
This version provides the following performance improvements:
Relevant from Compose May 2022 SR1 only.
Supported from Compose May 2022 SR1 only.
Customers who want to leverage this support need to create Redshift Spectrum external tables and discover them. Additionally, when running a CDC task, the new Keep in Change Tables option described above needs to be turned on.
The Data Mart Dimensions tree and the Star Schema Fact tab were redesigned to provide a better user experience.
The following section describes the enhancements and new features introduced in Qlik Compose Data Lake projects.
A Deleted records in ODS views section has been added to the General tab of the project settings, with the following options:
As this was the default behavior in previous versions, you might need to select this option to maintain backward compatibility.
Supported from Compose May 2022 SR1 only.
In previous versions, HDS resolution was one second. This was sometimes problematic, as multiple changes to a Primary Key within a single second resulted in only the last change appearing in the HDS. To view all the history, customers were forced to review the landing data.
From this version, all changes (history) will be shown in the HDS, facilitating better support for auditing.
You can now associate a Replicate task that writes to a Hortonworks Data Platform target with the Compose landing connection (in a Cloudera Data Platform (CDP) Compose project).
New Databricks versions
Databricks 10.4 LTS is supported from Compose May 2022 SR1 only.
Supported from Compose May 2022 SR1 only.
Compose May 2022 SR1 introduces support for SQL Warehouse compute. To benefit from this support, customers need to use the new Replicate Databricks (Cloud Storage) target endpoint, which is available from Replicate November 2022. SQL Warehouse compute offers a lower cost alternative to clusters while also allowing Parquet file format to be used in the Landing Zone.
A new Project title setting has been added to the Environment tab of the project settings. The project title will be shown in the console banner. If both an Environment Title and a Project Title are defined, the project title will be displayed to the right of the environment title. Unlike the Environment title and Environment type, which are unique for each environment, the project title is environment independent. This means that the project title will always be retained, even when deploying to a different environment.
The following image shows the banner with both an Environment title and a Project title:
The banner text is shown without the Environment title and Project title console labels. This provides greater flexibility, as it allows you to add any banner text you like, regardless of the actual label name. For example, specifying Project owner: Mike Smith in the Project title field will display that text in the banner.
This version introduces support for accessing the Compose console using Microsoft Edge.
Windows Server 2022 support is available from Compose May 2022 SR1.
For security reasons, command tasks are now blocked by default. To be able to run command tasks, a Compose administrator needs to turn on this capability using the Compose CLI. For more information, see the Compose online help.
This functionality only applies to command tasks created after a clean installation. If you upgrade to this version, command tasks will continue to work as previously. This feature is available from Compose May 2022 SR1 only.
You can set and update user and group roles using the Compose CLI. You can also remove users and groups from a role in one of the available scopes (for example, Admin in All Projects). This is especially useful if you need to automate project deployment.
This section provides information about End of Life versions, End of Support features, and deprecated features.
End of support for Databricks 7.3 is applicable to Compose May 2022 SR1 only.
The following section lists the defects and enhancements resolved since Compose August 2021 Initial Release.
Defects
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-4808 | 2271788 | Environment variables in data mart | After the data mart database name was applied as an environment variable, Compose would not clear the cache automatically, resulting in the old cache object not being reset. |
| RECOB-4806 | 26263 | UI | Selecting a Replicate task would not be possible when using a Hortonworks Data Platform endpoint in a Cloudera Data Platform Compose project. |
| RECOB-4822 | 25044 | Project deployment | The following error would sometimes be encountered when deploying a project: Invalid Configuration file the database <name> Landing does not exist |
| RECOB-4861 | 26682 | Test connection | When the schema name was *, testing the connection for the landing database would return the following error: Object reference not set to an instance of an object |
| RECOB-4854 | 7550 | Lineage | When importing data marts using the Composecli import_csv command, the "Show lineage" option for corresponding domain attributes would be disabled. |
| RECOB-4876 | 27847 | Project Deployment | When a landing connection was removed from the target project, project deployment would fail with the following error: REPO-E-ITMNTFND, Invalid configuration file. The 'Database' 'Landing4' does not exist. REPO,CONFIGURATION_ITEM_NOT_FOUND,Database,Landing4 |
| RECOB-4809 | 22405 | Data Marts | Hub tables would sometimes be updated unnecessarily which would result in unnecessary updates of the related dimensions. |
| RECOB-4836 | N/A | Data Marts | Failed to set a filter on a dimension or a fact. |
| RECOB-4779 | 24471 | Data Marts | Filters and expressions on dimensions would not work as expected. |
| RECOB-4882 | 27704 | Data Marts | When a data mart contained an entity with multiple satellites, the query would sometimes be generated incorrectly. |
| RECOB-4864 | 24810 | Filters and expressions | Tasks with filters or expressions would end with errors. |
| RECOB-4913 | 27960 | Compare CSV CLI | The Compare CSV CLI would sometimes not complete successfully. |
| RECOB-4917 | 28209 | Expression Editor | An error would sometimes occur when opening the Expression Editor. |
| RECOB-4959 | 20574 | Data Warehouse Tasks - Snowflake | Records in the data warehouse would not be updated with a NULL value, even though the data warehouse task was set to "Set the target value to null". |
| RECOB-4928 | 27075 | Metadata validation in Data Lakes projects | Validating the metadata would fail with an error that "ID" is a reserved word. |
| RECOB-4722 | 2271788 | Project documentation | In the generated project documentation, the domain name would be shown in the attribute name field. |
| RECOB-4739 | 22780 | Databricks | After upgrading to 2021.08 SP08, Databricks connection issues would be encountered when a token was revoked. |
| RECOB-4707 | N/A | Data Marts - Oracle | The following Oracle syntax error would be encountered during the initial load task command: ORA-01400: cannot insert NULL into |
| RECOB-4675 | 15882 | Facts | State oriented facts would not reflect changes that were made to the Type 2 relation or changes that were made to the dimension table. |
| RECOB-4771 | 24505 | Project deployment | Users with the "Designer" role were not able to deploy project deployment packages. |
| RECOB-4785 | 10094 | Import CSV | After running the import_csv CLI command to import tasks, the generated task statements would contain a syntax error. |
| RECOB-4776 | 23553 | Data mart editing | When working with large models, it would not be possible to edit a dimension or fact. |
| RECOB-4656 | 21696 | CSV Import - Microsoft Azure Synapse Analytics Data Warehouse | Importing a CSV file to a project with a Microsoft Azure Synapse Analytics data warehouse would fail if the CSV contained an NVARCHAR attribute. |
| RECOB-4666 | 19667 | Security | Resolved security vulnerabilities discovered in Compose 2021.8.0.365. |
| RECOB-4699 | 23508 | Upgrade Script | Running the generate_upgrade_script command would fail after upgrading to 2021.8.0.425. |
| RECOB-4045 | 10967 | Generate project CLI | Running the generate_project CLI command with the --database_already_adjusted parameter would drop the Qlik table "TPIL_DMA_RUNNO". |
| RECOB-3999 | 9804 | Generate project CLI | Running the generate_project CLI command with the --database_already_adjusted parameter would fail with the following error: SQL compilation error: <p>Object does not exist, or operation cannot be performed. |
| RECOB-4057 | N/A | Data Marts | Creating a denormalized new dimension would create the root dimension only. |
| RECOB-3990 | 2264064 | Workflows | In rare cases, it would not be possible to create, edit, or duplicate workflows. |
| RECOB-3937, RECOB-3859 | 2236402, 5136 | Upgrade | After migrating to 2021.5, projects containing two domain attributes with the same name but a different case (e.g. abc and Abc) would fail to load with the following error: SYS,GENERAL_EXCEPTION, An item with the same key has already been added. |
| RECOB-3987 | N/A | Project Deployment | It would not be possible to open a project after deployment if one schema was missing. |
| RECOB-4043 | 9043 | Data Marts | Fact tables would contain obsolete VIDs from dimensions, resulting in orphaned records. |
| RECOB-4033 | 9805 | Data Marts | Data mart loading tasks would sometimes fail with the following error: Cannot write value for process parameter twice: 1265: Duplicate write to param DimCnt_Tot |
| RECOB-3204 | 2214622 | Loading data mart dimensions into Snowflake and Microsoft Azure Synapse Analytics | When a data mart ETL task failed, the next task would sometimes load duplicate rows into dimensions. |
| RECOB-3957 | 2231873 | Data Marts | Adding data mart dimensions would sometimes fail without a clear error. |
| RECOB-3954 | 8634 | Data warehouse validation | The following error would occur when validating the data warehouse: Index was out of range. Must be non-negative and less than the size of the collection |
| RECOB-3902 | 7392 | Snowflake | The data warehouse ETL would fail to create a transient table with a "already exists" error. |
| RECOB-3934 | 8399 | CLI | Importing a project repository to a new project that does not exist would fail with the following error: Project: 'Project_name' does not exist. |
| RECOB-3636 | 2248515 | Backdating | Backdated data in the Data Warehouse would not get updated in the Data Mart. |
| RECOB-3703 | 2240557 | Backdating | Migrating a project from an older version would disable the backdating options. The issue was resolved by adding a new CLI command line that sets the "Add actual data row and a precursor row" option for all entities as well as in the project settings. composecli set_backdating_options --project project_name After running the command, refresh the browser to see the changes. |
| RECOB-3719 | 2260256 | Discovery from Snowflake | When a landing table had a foreign key, discovering the table would result in the following error (excerpt): Specified argument was out of the range of valid values. |
| RECOB-3799 | 2264057 | Validation and Schema Evolution | Validation of Databricks storage and Snowflake data warehouse would be excessively long. The slow Databricks validation would also impact schema evolution. |
| RECOB-4528 | 17678 | Pivot table - Google BigQuery | In Google BigQuery projects, the data mart pivot table would display a "no data" error even when there was data in the tables. |
| RECOB-4529 | 17465 | Data profiler - Google BigQuery | In Google BigQuery projects, the following error would be encountered when using the data profiler: "SYS,GENERAL_EXCEPTION,Sequence contains no elements" |
| RECOB-4535 | 16513 | OID and VID Columns | The OID and VID column names would include the entire path from the fact source to the dimension instead of just the dimension name. |
| RECOB-4555 | 2260638 | MySQL source | When setting up a MySQL source connection, testing the connection would return the following error: "Object reference not set to an instance of an object". |
| RECOB-4557 | 19777 | Export CLI | After deleting an entity, export of projects using the CLI would sometimes fail. |
| RECOB-4584 | 19673 | Data mart loading | When a dimension contained more than 10 entities, loading of the data mart would fail with the following error: "Case expressions may only be nested to level 10.Operation cancelled by user" |
| RECOB-4595 | 20256 | Data mart task generation | Data mart task generation would fail when attributes of the same entity were assigned to different satellite tables. |
| RECOB-4633 | 20347 | Bulk Operations | Generating Bulk Operations would not include the last data mart in the list. |
| RECOB-4636 | 20746 | Data mart loading | Some projects could not be opened after upgrading. |
| RECOB-4464 | 14522 | CLI | Running the "generate_project" command with the "database_already_adjusted" parameter would reset the data mart to the "Create Tables" state. |
| RECOB-3917 | 2256585 | Data mart dimensions | Sometimes, rows in dimensions would incorrectly be marked as obsolete. |
| RECOB-4459 | 17328 | CLI - Export CSV | Running the export_csv command would cause ETL Set generation to fail for lookups with the following error: SYS,GENERAL_EXCEPTION,startIndex cannot be larger than length of string.<p>Parameter name: startIndex |
| RECOB-4481 | 17567 | Data Marts | Data Mart creation would sometimes fail with the following error "Sequence contains no matching element". |
| RECOB-4482 | 17567 | Data Marts | An error would sometimes be encountered when trying to delete a star schema. |
| RECOB-4390 | 12810 | ETLs | The ETL for handling data mart dimensions would use the non-optimized approach for one of the statements. |
| RECOB-4386 | 14640 | Snowflake | After four hours of inactivity, a "Snowflake Authentication token has expired" error would be shown. |
| RECOB-4500 | 5008 | ETLs | Verification of unused and/or outdated column mapping expressions would lead to redundant errors. |
| RECOB-4501 | 17659 | Data Marts | Validation of Type 2 dimensions would sometimes fail with an error that no Type 2 columns were detected (and that the dimension should be created as Type 1), even though Type 2 relationships existed in the dimension. |
| RECOB-4370 | N/A | Security | Fixed critical vulnerabilities (CVE-2021-45105, CVE-2021-45046, CVE-2021-44228) that could allow an attacker to perform remote code execution by exploiting the insecure JNDI lookups feature exposed by the logging library log4j. The fix replaces the vulnerable log4j library with version 2.16. |
| RECOB-4293 | 15341 | UI | Editing a data mart entity after creating the data mart would result in all of the fields being reordered alphabetically. |
| RECOB-4199 | 12178 | Project settings - Snowflake only | Enabling the Write metadata to the TDWM tables in the data warehouse option in the project settings would have no effect. |
| RECOB-4320 | 2160919 | Deployment packages | The source schema connection would not be updated after deploying a deployment package. |
| RECOB-4258 | 13575 | Data Marts | Data mart creation would fail when there were more than 500 relationships. |
| RECOB-4330 | 13852 | Amazon Redshift | An error would occur when trying to connect to Amazon Redshift using SSL. |
| RECOB-4351 | 16688 | Data Marts | When there was a 3-tier relationship - for example, Entity_A→Entity_B→Entity_C - and the Fact table contained columns from Entity_A and Entity_C, changes in the relationship values in Entity_B (which should have updated columns from Entity_C in the Fact) would not be updated in the Fact table. |
| RECOB-4071 | 5258 | Live Views | Reading from live views would take an excessively long time. |
| RECOB-4387 | 16511 | Microsoft Azure Synapse Analytics | Columns with numeric(n,n) data types would not be retrieved from the Landing Zone. |
| RECOB-4339 | 5276 | Import | The following error would sometimes be encountered when importing a data mart: SYS,GENERAL_EXCEPTION,Sequence contains no matching element |
| RECOB-4388 | 14522 | ComposeCLI Project Generation | Generating the project would truncate the data mart tables when running the following command: ComposeCli.exe generate_project --project <project name> --database_already_adjusted After generating the project, you need to clear the cache by running the following command: ComposeCli.exe clear_cache --project <project_name> --type storage |
| RECOB-4316 | N/A | Data Mart Tasks | When loading dimensions, a column would sometimes be used twice, causing the data mart task to fail. |
| RECOB-4235 | 13170 | Data Mart Tasks | A runtime parameter ("MutCnt_8323" or similar) was incorrectly initialized, causing the data mart task to fail. |
| RECOB-4109 | 10247 | Diagnostics | Diagnostic packages would contain the server name of the customer environment, which would sometimes result in users being locked out when the package was deployed in our internal testing environment. Now, the diagnostic packages will be generated without the server name. |
| RECOB-4113 | 2222648 | Project Documentation | The project documentation for Multi-Table ETLs and Post-Loading ETLs was generated without contents. |
| RECOB-3928 | 7892 | Post-ETL Error Reporting | Errors in Post-ETL stored procedures run on Microsoft Azure Synapse Analytics would not be reported. |
| RECOB-4149 | 2218407 | ETLs on Snowflake | While working with Snowflake via the private link configuration, the engine task would sometimes stop unexpectedly. |
| RECOB-5239 | 33030 | Data Mart Adjustment | When dropping a relationship to a lookup-table in the Model, adjusting the data mart would fail with the following error: Object reference not set to an instance of an object |
| RECOB-5210 | 33745 | Data Mart Task Generation | The following errors would sometimes be encountered when generating ETLs after data mart validation: "Sequence contains no matching elements" or "SYS,GENERAL_EXCEPTION,Input string was not in a correct format" |
| RECOB-5217 | 30618 | Data Mart Tasks | Data mart tasks would sometimes fail with the following error: Invalid object name dbo.TPIL_RUNS |
| RECOB-5081 | 26461 | Satellite Loading Performance | Performance issues would sometimes be encountered when loading data warehouse satellite tables. |
| RECOB-5064 | 29989 | Project documentation | When generating project documentation, the following error would sometimes occur: System.OutOfMemoryException |
| RECOB-5137 | 30948 | Adding dimensions | Adding a dimension without the "dummy" row would result in incomplete loading on the next task run. |
Enhancements
| Jira Issue | Salesforce case | Component/Process | Description |
| CMPS-625 | N/A | Environment variables in export | An option has been added to remove environment information when exporting projects (CLI) or creating deployment packages. To facilitate this functionality, the --without_environment_specifics parameter was added to the CLI and a Replace environment specifics with defaults option was added to the Create Deployment Package window. |
| RECOB-4802 | 2218782 | Project Settings | A new Project title field has been added to the project settings' General tab. The value of the field will be included in the project deployment. |
| RECOB-4104 | 2160919 | Microsoft Azure Synapse Analytics Performance | Performance was improved by adding indexes to Transactional and State Oriented fact tables. |
| RECOB-4105 | 2160919 | Microsoft Azure Synapse Analytics Performance | Performance was improved by creating the TEMP table as a HEAP table instead of a HASH table. |
| RECOB-4106 | 2160919 | Microsoft Azure Synapse Analytics Performance | Performance was improved by updating the statistics after each incremental load of the dimensions. |
| RECOB-4126 | 10967 | Microsoft Azure Synapse Analytics Performance | Performance was improved for data mart ETL tasks by adding indexes (over columns used for join clauses) to intermediate tables. |
| RECOB-4142 | 10996 | Compose CLI Timeouts | A "session expired" error would sometimes occur during CLI commands that took a long time to complete (e.g. import_csv). To resolve such timeouts, users can now add the "--timeout <seconds>" parameter to the command. Setting "--timeout -1" will run the command without it timing out. |
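The new timeout parameter described above might be used as in the following illustrative fragment; the project name is a placeholder and the remaining import_csv parameters (omitted here) should be taken from the CLI help for your installed version.

```
REM Allow a long-running import up to one hour (3600 seconds)
ComposeCli.exe import_csv --project MyProject [other parameters] --timeout 3600

REM Run the same command with no timeout at all
ComposeCli.exe import_csv --project MyProject [other parameters] --timeout -1
```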
| RECOB-4929 | N/A | Data Lakes Project - Real-Time Views | Subquery HIVE errors would sometimes be encountered when creating and reading from the real-time view. The issue was resolved by updating the latest applied partition during runtime. |
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-5366 | 35823 | Filters | The fact table would not use the filter of the dimension table it was related to. |
| RECOB-5618 | 39793 | Data Marts - Relationships | Relationship prefixes would be ignored when adding dimensions to existing facts. |
| RECOB-6232 | 57837 | Data Mart Loading - SQL Server | Loading the data mart would sometimes fail with an "Invalid column name" error. |
| RECOB-5410 | 31391 | Data Mart Tasks | The SQL server TempDB system database would reach capacity during Data Mart task execution. |
| RECOB-5418, RECOB-5555 | 37420 | Data Mart Tasks - Performance | Data mart tasks would take an excessively long time to complete. |
| RECOB-5425 | 51127 | Data Mart Generation | When there were multiple relationships to the same table, issues would be encountered when generating the data mart task. |
| RECOB-5450 | 20156 | Fact Table Statistics | The UPDATE STATS command would only update the stats on some of the fact tables, instead of all of them. |
| RECOB-5463 | 38236 | Data Mart Performance | When running Full Load ETL statements, records would be loaded directly into the indexed data mart table using CTE (Common Table Expression). These inserts would take an excessively long time to complete. |
| RECOB-5616 | 21675 | Data Marts | When an entity had a self-referencing relationship, data mismatches would sometimes occur between the data warehouse and data mart hierarchies. |
| RECOB-5645 | 38753 | Data Mart Tasks | The OBSOLETE__INDICATION = 0 rows indicator would be temporarily missing from the data mart while the task was running. |
| RECOB-5655 | 43588 | Data Mart Tasks | A task with five or more relationships would take an excessively long time to complete. |
| RECOB-5865 | 50151 | Filters in Data Mart Tasks | When defining a multi-column filter condition on a data mart dimension, where one column was from a Satellite table and the other column was from a Hub table, the condition would not be processed correctly. |
| RECOB-5895 | 38277 | Data Mart Tasks | The following error would sometimes occur after running the data mart task: duplicate alias 'E04' |
| RECOB-6191 | 57012 | State-oriented Fact Tables | The OPTION(FORCE ORDER) hint would not be added for state-oriented fact tables. |
| RECOB-6203 | N/A | Performance | Performance was improved by removing unnecessary queries. |
| RECOB-6189 | 53277 | Data Mart Tasks | An "ambiguous column" error would occur in the data mart after upgrading from Compose for Data Warehouses 7.0 (November 2020). |
| RECOB-6031 | 48268 | INSERT/UPDATE Operations | A join clause would be used for INSERT/UPDATE operations, even when flags were set. |
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-5729 | 45316 | Record status | Previously deleted records would still be shown as deleted after the source was reloaded. |
| RECOB-6089 | 54204 | ETL tasks | ETL tasks would try to connect to localhost instead of the configured DSN, and fail. |
| RECOB-6079 | N/A | Compose CLI | Added the ability to manage user and group roles using the Compose CLI. |
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-6005 | 51516 | Amazon Redshift | Added support for external (Spectrum) tables. |
| RECOB-6014 | 48481 | Amazon Redshift | The following error would occur when using the JDBC 4.2 driver: Java connection failed, error: 'SYS-E-GNRLERR, Required driver class not found: com.amazon.redshift.jdbc41.Driver. |
| RECOB-6003 | 48481 | Databricks | The following error would occur when attempting to connect using the latest Databricks JDBC driver: Test connection failed, Error: SYS-E-HTTPFAIL, Failed to add session connection: SYS-E-GNRLERR, Required driver class not found: com.simba.spark.jdbc.Driver. |
| RECOB-6078 | N/A | Databricks Cloud Storage | Added support for the new "Databricks (Cloud Storage)" Replicate endpoint. |
| RECOB-6041 | 51707 | Snowflake | Header columns would be case-sensitive in task statements. The issue was resolved by setting the "setIgnoreCaseFlag" flag. |
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-5582 | 37431 | Drop and Recreate tables | When using the Drop and Recreate > Tables Data Warehouse option, data would not be populated into the Date and Time hub tables. |
| RECOB-5809 | 44396 | Updating dimensions | Updating "ghost" references in the data warehouse would not add the records to the dimension. |
| RECOB-5742 | 46049 | Compose CLI | It would not be possible to run multiple instances of the Compose CLI. Therefore, it would not be possible to run multiple project workflows in parallel using the Compose CLI. |
| RECOB-5835 | 46762 | Data marts | MIN/MAX custom date functions in the data mart task statements would be dropped prematurely. |
| Jira Issue | Salesforce case | Component/Process | Description |
| RECOB-5288 | 33745 | Data mart ETL generation | When generating ETLs after data mart validation, the following errors would sometimes occur: Sequence contains no matching elements -OR- SYS,GENERAL_EXCEPTION,Input string was not in a correct format. |
| RECOB-5240 | 33030 | Deleting dimensions | Deleting a dimension would sometimes cause the following error: Object reference not set to an instance of an object |
| RECOB-5387 | 34759 | Installation | Some of the HTML files were missing after the installation. |
| RECOB-5454 | 32555 | Views | CDP view creation was modified for Apache Impala compatibility. |
| RECOB-5506 | 38079 | Upgrade | After upgrading from Compose November 2021 to Compose May 2022, the following error would occur: COMPOSE-E-DATAMARTMODELERROR, Datamart model error. |
| RECOB-5442 | 38277 | Data mart tasks | Data mart tasks would sometimes fail with the following error: Terminated: sqlstate 42601, errorcode 2027, message SQL compilation error: duplicate alias E04 |
This section describes the known issues for this release.
Jira issue: N/A
Salesforce case: N/A
Component/Process: Schema Evolution - New Columns
Description: When using Replicate to move source data to Compose, both the Full Load and Store Changes replication options must be enabled. As a result, when Replicate captures a new column, it is added to the Replicate Change Table only. In other words, the column is stored without being added to the actual target table (which, in Compose terms, is the table containing only the Full Load data, i.e. the landing table).
For example, let's assume the Employees source table contains the columns First Name and Last Name. Later, the column Middle Name is added to the source table as well. The Change Table will contain the new column while the Replicate Full Load target table (the Compose Landing table) will not.
In older versions of Compose for Data Warehouses, mappings relied on the Full Load tables (the Compose Landing tables), meaning that users were not able to see any new columns (i.e. Middle Name in the above example) until they were created in the Full Load tables via a reload.
From Compose May 2021, the Compose Discover and Mappings windows show changes to new columns that exist in both the Change Tables and the Replicate Full Load target tables. This allows Schema Evolution to suggest adding columns that exist in either of them.
Although this is a much better implementation, it may create another issue. If a Full Load or Reload occurs in Compose before the Replicate reload, Compose will try to read from columns that have not yet been propagated to the Landing tables (assuming they exist in the Change Tables only). In this case, the Compose task will fail with an error indicating that the columns are missing.
Should you encounter such a scenario, either execute a reload in Replicate or create an additional mapping without the new columns to allow Compose to perform a Full Load from the Landing tables.
Jira issue: N/A
Salesforce case: N/A
Component/Process: Referenced dimensions
Description: If a dimension being referenced is dropped and created, or reloaded for any reason (for example, the source data mart is fully rebuilt on each load), any facts to which the referenced dimension was added should be reloaded too. Currently, Compose does not handle this automatically.
Workaround: Run the data marts containing the referenced dimensions.
Jira issue: RECOB-5315
Salesforce case: 33522
Component/Process: Snowflake Data Warehouse Tasks
Description: When generating the data warehouse task, if any attribute with the JSON data type is defined as Type 2, the following error will occur:
SYS,GENERAL_EXCEPTION,invalid enum value. Parameter name: ACDataType
About Qlik
Qlik converts complex data landscapes into actionable insights, driving strategic business outcomes. Serving over 40,000 global customers, our portfolio provides advanced, enterprise-grade AI/ML, data integration, and analytics. Our AI/ML tools, both practical and scalable, lead to better decisions, faster. We excel in data integration and governance, offering comprehensive solutions that work with diverse data sources. Intuitive analytics from Qlik uncover hidden patterns, empowering teams to address complex challenges and seize new opportunities. As strategic partners, our platform-agnostic technology and expertise make our customers more competitive.