noobDE
Contributor II

Automate Transformation Across Many Tables

I have ~100 tables that I will use for data ingestion into a data warehouse. We have a transformation mapping for some of the columns in each table, something like:
table_1
   - encrypt(column_X)
table_2
   - encrypt(column_X)
   - encrypt(column_Y)
table_3
   - encrypt(column_Z)

Is there any way to automate this? My first attempt was to create a task that pulls data from all the tables and their columns, export the task, and then update the manipulations part of the exported JSON file.
But when I tried to import the updated JSON, it failed. I assume it is probably not that simple, so I am wondering if there is some kind of "official" way to do this.
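
For illustration, here is a minimal Python sketch of that approach, under the assumption that one manipulation entry is first created manually in the GUI and copied out of an export to serve as a template. The file names and the __TABLE__/__COLUMN__ placeholders are made up for the example; the location of the manipulations array may differ in your export.

    import json

    # Columns to encrypt per table (taken from the mapping above).
    ENCRYPT_MAP = {
        "table_1": ["column_X"],
        "table_2": ["column_X", "column_Y"],
        "table_3": ["column_Z"],
    }

    def patch_task(exported_path, template_path, output_path):
        with open(exported_path) as f:
            task = json.load(f)
        # One manipulation entry copied from a GUI-made export, with
        # __TABLE__ / __COLUMN__ placeholders, so the schema is never guessed.
        with open(template_path) as f:
            template_text = f.read()

        # Assumed location of the manipulations array; adjust to match your export.
        manipulations = task["cmd.replication_definition"].setdefault("manipulations", [])

        for table, columns in ENCRYPT_MAP.items():
            for column in columns:
                entry_text = template_text.replace("__TABLE__", table).replace("__COLUMN__", column)
                manipulations.append(json.loads(entry_text))

        with open(output_path, "w") as f:
            json.dump(task, f, indent=2)

    patch_task("exported_task.json", "manipulation_template.json", "patched_task.json")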

4 Replies
Dana_Baldwin
Support

Hi @noobDE 

There are certain transformations you can make at the task level. If the column names you're working with are shared between tables, you may be able to use this functionality:

Defining global rules | Qlik Replicate Help

This link is for the latest version. Please scroll to the top left of the page to filter the results by the version you're using.

Apart from that, editing the task JSON file is the only other option besides doing it manually in the console.

Thanks,

Dana

SushilKumar
Support

Hello @noobDE 

Could you share the error message, so that we can check whether it is a syntax error or a semantic error?

Regards,

Sushil Kumar

noobDE
Contributor II
Author

@Dana_Baldwin Thanks for the response. The column names are not shared, so I will retry editing the task JSON file.

@SushilKumar The error:

[screenshot attached: noobDE_0-1716340426757.png]


I only edited the manipulations field (inside cmd.replication_definition). What are the likely suspects here? Do I need to update something else? I can see there is also a task_settings field in the exported JSON file.

Also, I exported the initial task using Export Task in the GUI, not via repctl (not sure whether these are different).
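
A quick well-formedness check on the edited file narrows the failure down to a plain syntax error versus something Replicate-specific; for example, Python's standard json module reports the exact line and column where parsing fails. The file name below is just a placeholder.

    import json
    import sys

    path = sys.argv[1] if len(sys.argv) > 1 else "edited_task.json"  # placeholder file name

    try:
        with open(path) as f:
            json.load(f)
        print(f"{path}: well-formed JSON")
    except json.JSONDecodeError as exc:
        # Pinpoints the exact line/column of the first syntax problem.
        print(f"{path}: syntax error at line {exc.lineno}, column {exc.colno}: {exc.msg}")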

Heinvandenheuvel
Specialist III

1) Please just copy & paste the error text instead of some pixels which look like text.

2) Repeat the import with repctl and check REPCMD.LOG; it will have line/character information about the problem point in the JSON.

3) Manipulations JSON is tricky; errors are easily made. The error is on your side, for sure. To help us help you find it, you may need to attach the before and after JSON. To do that most effectively, consider reducing the task to the bare minimum: 1 table, 1 transformation. In doing so, I would not be surprised if you find the issue yourself (a quick way to diff the before/after files is sketched after this list).

4) Yes, you can automate adding manipulations to JSON files, but all too often it is hard to find a good 'rule'. Anyway, please find attached a (Perl) script I made to show tables and their manipulations in a task, which may be able to highlight the issue in the 'after' JSON. A second (PowerShell) script can automatically apply manipulations from an input control CSV file to a JSON file. It is unlikely to have exactly the right rule for your need, but perhaps it is a good starting point.
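
One generic way to compare the before and after exports, separate from the attached scripts, is to normalize both files and diff them with nothing but the Python standard library; the two file-path arguments below are placeholders.

    import difflib
    import json
    import sys

    def normalized(path):
        # Pretty-print with sorted keys so formatting differences
        # don't drown out the real changes.
        with open(path) as f:
            return json.dumps(json.load(f), indent=2, sort_keys=True).splitlines()

    # Usage: python diff_task_json.py exported_task.json edited_task.json
    before, after = sys.argv[1], sys.argv[2]
    for line in difflib.unified_diff(normalized(before), normalized(after),
                                     fromfile=before, tofile=after, lineterm=""):
        print(line)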

Good luck,

Hein.