DB2 tables are being replicated to S3 through a log stream. One of the tables has a BLOB column containing JSON. These JSON documents include special characters (e.g., grave accents), and Replicate is not recognizing/encoding them properly, so the target file in S3 contains bad characters.
What options are there to preserve the data content/characters?
Hello,
Replicate has options to do transformations and character substitutions.
However, since it's a LOB column, this is unfortunately a limitation.
The only supported transformation for LOB/CLOB data types is to drop the column on the target, and character substitution does not support LOB data types.
I suggest submitting a feature request through our ideation portal:
https://community.qlik.com/t5/Suggest-an-Idea/idb-p/qlik-ideas
Thanks
Lyka
Hello @narashan
What version of Replicate and which DB2 are you using (DB2 LUW, DB2 iSeries, or DB2 z/OS)?
Thanks
Barb
Hello @narashan , copy @lyka , @Barb_Fill21 ,
I agree with Lyka. In addition to her suggestion, I'd like to know which file format you are using in S3.
If it's CSV/JSON, some special characters can cause problems; for example, a carriage return in these files breaks a single record into multiple lines, corrupting the data. If you switch to Parquet format, the problem should be resolved.
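The carriage-return issue can be seen with a small Python sketch using only the standard library. The sample payload is hypothetical, and Replicate's actual CSV writer may behave differently; this just illustrates why an embedded line break corrupts a flat CSV record unless the field is properly quoted:

```python
import csv
import io

# Hypothetical JSON payload like the BLOB column might hold,
# with a grave accent and an embedded carriage return + newline.
payload = '{"name": "caf\u00e8", "note": "line1\r\nline2"}'

# Naive flat-file write (no quoting): the embedded line break
# splits the single record across multiple physical lines.
naive = "1," + payload + "\n"
broken_lines = naive.splitlines()
print(len(broken_lines))  # more than one line for one record

# Proper CSV quoting keeps the record intact on read-back.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["1", payload])
rows = list(csv.reader(io.StringIO(buf.getvalue())))
print(len(rows))          # exactly one row
print(rows[0][1] == payload)  # payload round-trips unchanged
```

A binary columnar format like Parquet sidesteps the problem entirely, since field values are length-prefixed rather than delimited by newlines.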
Hope this helps.
Regards,
John.