
Qlik Replicate - Data tokenization service integration with HashiCorp

skokanay
Contributor III

Problem statement for ideation: We currently have a demand to tokenize restricted columns from source tables by integrating with a third-party tool, in this case HashiCorp. It is extremely important for meeting security standards that sensitive PCI DSS and PII columns be tokenized through a centralised service platform (advanced data protection) from HashiCorp.

A few points to be considered for a solution:

1. Qlik Replicate (full load) should be able to integrate with a tokenization service, in this case HashiCorp, in batches: a configurable number of messages/chunks (for example, 10,000 account numbers) is generated as a batch input JSON to invoke the HashiCorp APIs, and the returned tokenized JSON batch response is then processed and flattened back into individual relational records for loading into the target (a sketch of this flow follows the list below).

2. Tokenization should be invoked only for the identified source columns.

3. The capability described in point 1 should support change processing as well, not just full load.
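Nothing like this exists in Replicate today, but a minimal sketch of the batch flow in point 1 might look like the following. It assumes HashiCorp Vault's Transform secrets engine as the tokenization backend; the Vault address, token, role and transformation names, and the chunk size are illustrative placeholders, and error handling/retries are omitted for brevity.

```python
import requests

VAULT_ADDR = "https://vault.example.com:8200"   # illustrative address
VAULT_TOKEN = "s.xxxxxxxx"                      # supplied by your secrets workflow
ROLE = "payments"                               # hypothetical Transform role
TRANSFORMATION = "account-no-tokenization"      # hypothetical transformation name
CHUNK_SIZE = 10_000                             # configurable batch size from point 1

def tokenize_batch(values):
    """Send one chunk of column values to Vault's Transform encode endpoint
    and return the tokenized values in the same order."""
    payload = {
        "batch_input": [
            {"value": v, "transformation": TRANSFORMATION} for v in values
        ]
    }
    resp = requests.post(
        f"{VAULT_ADDR}/v1/transform/encode/{ROLE}",
        headers={"X-Vault-Token": VAULT_TOKEN},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json()["data"]["batch_results"]
    return [r["encoded_value"] for r in results]

def tokenize_column(records, column):
    """Tokenize one identified column (point 2) across a full-load record
    set, chunking the calls so each request carries CHUNK_SIZE values."""
    for start in range(0, len(records), CHUNK_SIZE):
        chunk = records[start:start + CHUNK_SIZE]
        tokens = tokenize_batch([rec[column] for rec in chunk])
        # Flatten the batch response back onto the individual records
        # before they are loaded into the target.
        for rec, token in zip(chunk, tokens):
            rec[column] = token
    return records
```

Change processing (point 3) would reuse the same tokenize_batch call, just driven by the CDC stream rather than a full-load record set.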

2 Comments
Nulee_Massaro
Employee

While we do not plan to support direct integration with these third-party vendors, we have several customers who leverage the transformation feature in Replicate and use a UDF to call a third-party tool such as HashiCorp. In fact, this capability is quite commonly used to encrypt PII data. While this is an undocumented use case, you can reach out to our professional services team, who have more information on it.
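For context, the per-field call such a UDF might wrap could look roughly like the sketch below. This is not Replicate's UDF API (the mechanism is undocumented, as noted above); it only illustrates the single-value tokenization call, again assuming HashiCorp Vault's Transform secrets engine, with the role and transformation names as placeholders.

```python
import requests

VAULT_ADDR = "https://vault.example.com:8200"  # illustrative address
VAULT_TOKEN = "s.xxxxxxxx"                     # supplied by your secrets workflow

def tokenize_value(value, role="payments",
                   transformation="account-no-tokenization"):
    """Tokenize a single field value, one API round-trip per value.
    A Replicate UDF would invoke something like this per record/field."""
    resp = requests.post(
        f"{VAULT_ADDR}/v1/transform/encode/{role}",
        headers={"X-Vault-Token": VAULT_TOKEN},
        json={"value": value, "transformation": transformation},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["data"]["encoded_value"]
```

The trade-off versus the batch sketch above is one HTTP round-trip per field value, which is exactly the overhead a batch-input request amortizes.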

Status changed to: Closed - Already Available
Prabodh
Creator II

We currently use a third-party tokenization service for this very reason. We have written a custom add-on that tokenizes the records field by field.

While I would love the ability to tokenize in batches, the current solution is working for us.

We replicate millions of transactions per day through this addon and have not noticed any significant performance impact.