data_engineer_whop
Contributor II

Upload column as JSON to target (Store Changes)

I am trying to replicate with Qlik Replicate from MongoDB to Azure Data Lake Storage (the changes feed) as JSON or Parquet. I am loading the document part into a column _doc, but it keeps arriving as NCLOB, which Databricks reads as string/text.

Is there any way to save this document/column in a proper JSON format that is readable downstream by Databricks?

I would appreciate any tips on settings I can try.
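For context, the text that lands in _doc is still valid JSON, so downstream I can parse it manually. A minimal Python sketch (the payload below is made up for illustration):

```python
import json

# Made-up example of what arrives in the _doc column: the NCLOB is
# delivered as plain text, so it is still parseable JSON.
doc_text = '{"_id": "abc123", "name": "example", "qty": 2}'

doc = json.loads(doc_text)   # parse the string into a dict
print(doc["name"])           # individual fields become addressable
```

In Databricks the equivalent would be something like from_json(col("_doc"), schema) in PySpark, but I would prefer the target files to contain proper JSON in the first place.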

The second aspect is to compute the length of the column mentioned above, add it as the value of a new column, and write it to the target. length($_doc) as a global rule or at the table level doesn't work.
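What I am after is roughly this, done during replication rather than afterwards (a plain-Python sketch with a made-up payload; the column names are just for illustration):

```python
import json

doc_text = '{"_id": "abc123", "qty": 2}'

# New column derived from the document column, analogous to what a
# length($_doc) transformation rule would produce in Qlik Replicate.
row = {
    "_doc": doc_text,
    "_doc_length": len(doc_text),  # character length of the serialized doc
}
print(row["_doc_length"])
```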

Labels (2)
1 Reply
OritA
Support

Hi,

It seems that because MongoDB stores the JSON in a binary format (BSON), the column ends up with the NCLOB data type. I am not sure whether it can be changed to a JSON format. I suggest you open a Salesforce case and attach the task diagnostic package, the DDL of the table, and a description of the request, so we can investigate further and see if we can find a way to meet your requirement.

Thanks & regards,
Orit