Anonymous
Not applicable

Insert JSON into Kafka using tKafkaOutput

Hello,
I have a CSV file as follows:
id,first_name,last_name,email
1,x,x,x@x.com
2,y,y,y@y.com
I want to convert each row to a JSON object as follows:
{
  "id" : 1,
  "first_name" : "x",
  "last_name" : "x",
  "email" : "x@x.com"
}
{
  "id" : 2,
  "first_name" : "y",
  "last_name" : "y",
  "email" : "y@y.com"
}
and then insert each JSON object into Kafka.
Thanks.
5 Replies
vapukov
Master II

Hi,
Simple answer: see the attachments. Somewhere in the middle of the job you can add a CSV-to-JSON transformation
using tWriteJSONField, then convert the data to byte[] (in the example this is done with .getBytes()).
(screenshots attached)
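For reference, a minimal sketch of that byte[] conversion step in a tJavaRow, assuming the JSON string produced by tWriteJSONField arrives in a column named jsonField and the outgoing schema has a single byte[] column named serializedValue (the default single-column schema expected by tKafkaOutput in some Talend versions; adjust both names to your actual schemas):

// tJavaRow body (input_row / output_row are generated by Talend).
// The column names jsonField and serializedValue are assumptions;
// match them to your own schemas.
if (input_row.jsonField != null) {
    // Encode the JSON text as UTF-8 bytes for tKafkaOutput.
    output_row.serializedValue = input_row.jsonField.getBytes(java.nio.charset.StandardCharsets.UTF_8);
} else {
    output_row.serializedValue = new byte[0];
}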
Anonymous
Not applicable
Author

Hello,
Can you give me an example for tFileInputDelimited -> tWriteJSONField?
I tried it but I am getting an error.

I think I am configuring the JSON tree incorrectly.
(screenshot attached)
Anonymous
Not applicable
Author

Solved, thanks.
You saved me.
I used tJavaRow instead of tFixedFlowInput.
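The thread does not show the tJavaRow code that was used, but as one hypothetical sketch, a tJavaRow body that builds the JSON for each row directly from the delimited columns (names taken from the CSV in the question) and emits Kafka-ready bytes could look like this; the output column serializedValue is again an assumption based on tKafkaOutput's byte[] schema:

// Hypothetical tJavaRow body: build the JSON text from the delimited row
// (schema id, first_name, last_name, email) and pass it on as UTF-8 bytes.
// Note: if values may contain quotes or backslashes, use a real JSON
// library (e.g. Jackson) instead of string concatenation.
String json = "{"
    + "\"id\" : " + input_row.id + ", "
    + "\"first_name\" : \"" + input_row.first_name + "\", "
    + "\"last_name\" : \"" + input_row.last_name + "\", "
    + "\"email\" : \"" + input_row.email + "\""
    + "}";
output_row.serializedValue = json.getBytes(java.nio.charset.StandardCharsets.UTF_8);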
Anonymous
Not applicable
Author

Hello,
How can I pass the entire row from the file, as is, to Kafka?
I don't understand how to convert the entire row to bytes. This time I don't want it as JSON;
I want to send it to Kafka exactly as it appears in the raw file.
Thank you
vapukov
Master II

Use tFileInputFullRow with a schema containing a single column, "line".
(screenshot attached)
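In that setup the conversion step stays almost the same. A minimal sketch of the tJavaRow between tFileInputFullRow and tKafkaOutput, assuming the input column is the default "line" from tFileInputFullRow and the output column is tKafkaOutput's single byte[] column (assumed here to be named serializedValue):

// tJavaRow body: pass the raw line through unchanged, only encoding it
// to bytes so tKafkaOutput can publish it.
// "line" comes from tFileInputFullRow; "serializedValue" is assumed to be
// the byte[] column expected by tKafkaOutput -- check your component schema.
output_row.serializedValue = input_row.line == null
    ? new byte[0]
    : input_row.line.getBytes(java.nio.charset.StandardCharsets.UTF_8);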