We are using Databricks Lakehouse (Delta) as our target destination. Since we have multiple sources, I created global rules that provide us with both 'createdtime' and 'updatedtime', along with a 'softdelete' flag. However, I cannot find a way to add primary key information to a new column.
Is it possible to do this? We also use Azure Data Lake Storage (ADLS) as a target, where we have the ability to create metadata files that include primary keys. The idea now is to mirror that ability directly to a table in Databricks Lakehouse.
Hello @toni_lajsner ,
Thanks for reaching out to Qlik Community!
... with both 'createdtime' and 'updatedtime', along with a 'softdelete' flag. However, I cannot find a way to add primary key information to a new column.
Do you intend to make 'createdtime' and/or 'updatedtime' the PK, or another new column? Any helpful information is welcome.
Is it possible to do this? Additionally, we also use Azure Data Lake Storage (ADLS) as a target, where we have the ability to create metadata files that include primary keys. The idea now is to mirror that ability directly to a table in Databricks Lakehouse.
We're not sure what the source DB type is, or whether the source tables have a PK (or Unique Index). We'd like to confirm this for you.
Thanks,
John.
Createdtime and updatedtime are columns that I am adding to every table as part of our own processes after data lands on Databricks Delta. They are not and will not be used as part of the primary key.
My question was only regarding a new column that will inherit information from the source and give us a hint of which columns the primary key is created from. The metadata created when using ADLS provides us with "primaryKeyPos":1, "primaryKeyPos":2, etc.
So the idea is to create a column that will give us the same information.
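For illustration, here is how the primary key columns could be recovered from one of those ADLS metadata files. The "primaryKeyPos" field is taken from the thread; the surrounding JSON structure ("dataInfo", "columns", the example column names) is an assumption for the sketch, not the exact file layout:

```python
import json

# Hypothetical excerpt of a Replicate ADLS metadata (.dfm) file.
# Only "primaryKeyPos" is known from the thread; the rest is illustrative.
metadata = json.loads("""
{
  "dataInfo": {
    "columns": [
      {"name": "EMPLOYEE_ID", "type": "INT4",   "primaryKeyPos": 1},
      {"name": "PERSON_ID",   "type": "INT4",   "primaryKeyPos": 2},
      {"name": "FULL_NAME",   "type": "STRING", "primaryKeyPos": 0}
    ]
  }
}
""")

def primary_key_columns(meta):
    """Return PK column names ordered by primaryKeyPos (0 = not part of the PK)."""
    cols = [c for c in meta["dataInfo"]["columns"] if c.get("primaryKeyPos", 0) > 0]
    return [c["name"] for c in sorted(cols, key=lambda c: c["primaryKeyPos"])]

print(primary_key_columns(metadata))  # ['EMPLOYEE_ID', 'PERSON_ID']
```

The ordered list produced this way is exactly the information I'd like to see mirrored into a column on the Databricks side.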
Hello @toni_lajsner ,
Thanks for the information.
In general, Qlik Replicate replicates the following index information from the source database to the target database (in order of precedence):
1- Primary Key (if the source table has a PK; any other Unique Indexes and Indexes are ignored)
2- Unique Index (if the source table has no PK but a single UI)
3- The first Unique Index, in alphabetical name order (if the source table has no PK but multiple UIs)
4- The column(s) you designate as PK, if you change the PK definition in a table transformation, or if the source table has neither PK nor UI and you want certain column(s) as the PK on the target side
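The precedence above can be sketched in plain Python (this is an illustration of the rules as listed, not Replicate's internal API; the index and column names are made up):

```python
def pick_target_key(pk, unique_indexes, transform_pk=None):
    """Pick the key carried to the target, following the four rules above.

    pk             -- list of PK column names from the source table, or None
    unique_indexes -- dict mapping unique-index name -> list of column names
    transform_pk   -- optional column list promoted to PK in a table
                      transformation (rule 4: overrides everything else)
    """
    if transform_pk:                 # rule 4: user-defined PK wins
        return transform_pk
    if pk:                           # rule 1: source PK; other indexes ignored
        return pk
    if unique_indexes:               # rules 2-3: single UI, or first UI by name
        first = min(unique_indexes)  # alphabetical order of index names
        return unique_indexes[first]
    return None                      # no PK, no UI, no transformation

# No PK, two unique indexes -> the alphabetically first one is used
print(pick_target_key(None, {"UX_B": ["COL_B"], "UX_A": ["COL_A"]}))  # ['COL_A']
```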
Below is a sample of Qlik Replicate creating two columns (EMPLOYEE_ID & PERSON_ID) as the PK of the Databricks Delta table MY_TRANSACTION_TESTTABLE1:
ALTER TABLE `default`.`MY_TRANSACTION_TESTTABLE1` ADD CONSTRAINT `default_MY_TRANSACTION_TESTTABLE1_1709129364786094_PK` PRIMARY KEY (`EMPLOYEE_ID`, `PERSON_ID`) DISABLE NOVALIDATE
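For reference, the statement follows a predictable shape. A minimal Python sketch that reconstructs it (the timestamp-based constraint naming pattern is inferred from this single sample, so treat it as an assumption rather than documented behavior):

```python
import time

def pk_constraint_ddl(schema, table, pk_cols, ts=None):
    """Build an ALTER TABLE ... ADD CONSTRAINT statement in the shape shown
    above. The `schema_table_timestamp_PK` name is inferred from the sample."""
    ts = ts if ts is not None else int(time.time() * 1_000_000)
    name = f"{schema}_{table}_{ts}_PK"
    cols = ", ".join(f"`{c}`" for c in pk_cols)
    return (f"ALTER TABLE `{schema}`.`{table}` "
            f"ADD CONSTRAINT `{name}` "
            f"PRIMARY KEY ({cols}) DISABLE NOVALIDATE")

print(pk_constraint_ddl("default", "MY_TRANSACTION_TESTTABLE1",
                        ["EMPLOYEE_ID", "PERSON_ID"], ts=1709129364786094))
```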
Feel free to let me know if you need any additional information.
John.
Hi again,
We did not receive an 'alter table' command. Looking at the logs, I only noticed 'create or replace table' plus 'copy into' during initialization.
Do we need to check some other setting in Qlik Replicate?
Hello @toni_lajsner ,
Please set TARGET_LOAD to Trace or Verbose mode, then rerun the task; you should see it then. Please make sure the target table does not exist before the task runs, or that the Target Table Preparation option is set to "DROP and CREATE table".
Regards,
John.
Hi @john_wang
I did all the steps, also using Verbose, but we are not getting ALTER TABLE, only 'create or replace table' and 'copy into'.
Hello @toni_lajsner ,
Would you please open a support ticket with the information below:
1- Set SOURCE_UNLOAD & TARGET_LOAD/TARGET_APPLY to Verbose mode, collect the Diagnostics Packages, and attach them to the case;
2- Please provide the source table creation DDL, including the PK/Unique Index information.
We'd like to investigate further for you.
Regards,
John.
Hi John, I have now created ticket 00148242. I hope you can investigate the issue further.
/Toni
Hi @toni_lajsner ,
Thanks for creating a ticket. The support team will handle it.
Regards,
Desmond