
Avro messages to Kafka
Hi,
We need some help with sending Avro data to Kafka, because we are not seeing the data in the form we expect.
1. When we enable "Use logical data types for specific data types" on the Kafka target endpoint (screenshot below), the schema generated in the Schema Registry is the one in the attachment schema-cdc.aspect4.EG3DT.XLPLKDP1-value-v1.json, and the message shown in Confluent Cloud is the one in the attachment cdc.aspect4.EG3DT.XLPLKDP1_message-v1.json.
2. When we do NOT enable "Use logical data types for specific data types" on the Kafka target endpoint (screenshot below), the schema generated in the Schema Registry is the one in the attachment schema-cdc.aspect4.EG3DT.XLPLKDP1-value-v2.json, and the message shown in Confluent Cloud is the one in the attachment cdc.aspect4.EG3DT.XLPLKDP1_message-v2.json.
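To make the difference concrete, here is a minimal sketch of how the same field ends up declared in the two cases, assuming a hypothetical DECIMAL(10,2) source column named PRICE (these are not the actual attached schemas):

```python
# Hypothetical Avro field declarations for a DECIMAL(10,2) source column
# "PRICE", illustrating the two endpoint settings (not the attached schemas):

# Without "Use logical data types ...": everything becomes a plain string.
field_as_string = {"name": "PRICE", "type": ["null", "string"]}

# With "Use logical data types ...": decimals become bytes annotated with the
# Avro "decimal" logical type. Per the Avro spec, a decimal annotates a
# bytes/fixed field, so viewers that ignore logical types (e.g. a plain
# message browser) render a raw byte buffer rather than a number.
field_as_logical_decimal = {
    "name": "PRICE",
    "type": ["null", {
        "type": "bytes",
        "logicalType": "decimal",
        "precision": 10,
        "scale": 2,
    }],
}
```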
What we expected to see in the message is the payload shown below:
How can we achieve this result when using Avro?
Thank you.

Hello @ernesto_costa ,
If I understood correctly, you intend to exclude the Change Mask and Column Mask header columns, is that correct?
The setting is under the task settings --> Message Format.
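For reference, those header columns appear alongside "data" and "beforeData" in the message envelope, roughly like this (an illustrative sketch; all values are invented, and the exact field set may vary by version):

```python
# Rough, illustrative sketch of a Replicate-style Kafka message envelope;
# values below are invented, and the exact fields may vary by version.
message = {
    "headers": {
        "operation": "UPDATE",
        "changeSequence": "20240101000000000000001",
        "changeMask": "0F",    # the Change Mask header column
        "columnMask": "FF",    # the Column Mask header column
    },
    "data": {"PRICE": "123.45"},        # after-image of the row
    "beforeData": {"PRICE": "120.00"},  # before-image of the row
}
```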
Regards,
John.

Hi John,
No, the issue is the format of the data fields of the extracted tables, meaning the fields inside "data" and "beforeData" that correspond to the table structure: they do not match what we expected. We expected to see something similar to what is in the shared screenshot.
As I explained:
- If we DON'T choose "Use logical data types for specific data types", every data field inside "data" and "beforeData" is sent as a string.
- If we DO choose "Use logical data types for specific data types", the data fields are sent as bytes/buffers, which is also not what we expect.
In conclusion, what we expect is for the extracted table fields that are INT to arrive as INT, FLOAT as FLOAT, BOOLEAN as BOOLEAN, etc.
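For illustration, this is the kind of decoding a consumer currently has to do for the bytes-encoded decimals (a minimal sketch; decode_avro_decimal is a hypothetical helper, not part of any library):

```python
from decimal import Decimal

def decode_avro_decimal(raw: bytes, scale: int) -> Decimal:
    # Per the Avro spec, a "decimal" payload is a two's-complement,
    # big-endian signed integer, scaled by 10**-scale.
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# b"\x30\x39" is the unscaled integer 12345; with scale 2 it is 123.45.
print(decode_avro_decimal(b"\x30\x39", 2))  # Decimal('123.45')
```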
Let us know if you can help.
Thank you.

Hello @ernesto_costa ,
Please open a support case (and link this article to the case); the support team will help you further.
Regards,
John.

Hello @ernesto_costa,
Is this a new POC, or was it working fine earlier? In my experience, JSON is the default format, and switching to the Avro format requires PS (Professional Services) engagement.
Please get in touch with your AM (Account Manager) to arrange that.
Regards,
Sushil Kumar
