ernesto_costa
Contributor III

Avro messages to Kafka

Hi,

We need some help with sending Avro data to Kafka, because the data is not arriving in the format we expect.

1. When we configure the Kafka target endpoint with "Use logical data types for specific data types" enabled (screenshot below),

ernesto_costa_0-1709304490576.png

the schema generated in the Schema Registry is the one in the attachment schema-cdc.aspect4.EG3DT.XLPLKDP1-value-v1.json.

In Confluent Cloud, the message shown is the one in the attachment cdc.aspect4.EG3DT.XLPLKDP1_message-v1.json.

2. When we configure the Kafka target endpoint with "Use logical data types for specific data types" disabled (screenshot below),

ernesto_costa_5-1709305374254.png

the schema generated in the Schema Registry is the one in the attachment schema-cdc.aspect4.EG3DT.XLPLKDP1-value-v2.json.

In Confluent Cloud, the message shown is the one in the attachment cdc.aspect4.EG3DT.XLPLKDP1_message-v2.json.

 

What we expected to see in the message is the payload shown below:

ernesto_costa_6-1709305889354.png
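For illustration, the kind of value schema we are after would declare Avro's native primitive types directly, rather than strings or byte buffers. A hypothetical sketch (record and field names are invented, not taken from our actual table):

```json
{
  "type": "record",
  "name": "ExampleValue",
  "fields": [
    {"name": "int_col",   "type": ["null", "int"],     "default": null},
    {"name": "float_col", "type": ["null", "double"],  "default": null},
    {"name": "bool_col",  "type": ["null", "boolean"], "default": null}
  ]
}
```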

How can we achieve this result when using Avro?

Thank you.

 

4 Replies
john_wang
Support

Hello @ernesto_costa ,

If I understood correctly, you intend to exclude the Change Mask and Column Mask header columns, is that correct?

The setting is under Task Settings --> Message Format. For example:

john_wang_0-1709465151743.png

 

Regards,

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
ernesto_costa
Contributor III
Author

Hi John,

 

No, the issue is the format of the data fields for the tables being extracted: the fields inside "data" and "beforeData", which correspond to the table structure, do not match what we expect. We expected to see something similar to what is in the shared screenshot.

As I explained:

  • If we do NOT choose "Use logical data types for specific data types", every data field inside "data" and "beforeData" is sent as a string.
  • If we DO choose "Use logical data types for specific data types", the data fields are sent as bytes/buffer, which is also not what we expect.

In summary, we expect the extracted table fields to keep their native types: INT as INT, FLOAT as FLOAT, BOOLEAN as BOOLEAN, and so on.
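For context on the bytes/buffer behavior: when logical data types are enabled, numeric columns are typically emitted as Avro's decimal logical type, which is physically encoded as bytes holding the unscaled value in big-endian two's-complement (per the Avro specification). A minimal consumer-side decode sketch in Python, assuming the scale is known from the schema:

```python
from decimal import Decimal

def decode_avro_decimal(raw: bytes, scale: int) -> Decimal:
    """Decode Avro's 'decimal' logical type: the payload is the
    unscaled value as big-endian two's-complement bytes."""
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# 0x3039 is 12345 unscaled; with scale=2 this represents 123.45
print(decode_avro_decimal(b"\x30\x39", 2))
```

This only explains why the values look like byte buffers on the consumer side; it does not change what the endpoint produces.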

Let us know if you can help.

Thank you.

john_wang
Support

Hello @ernesto_costa ,

Please open a support case (and link this thread in the case); the support team will help you further.

Regards,

John.

SushilKumar
Support

Hello @ernesto_costa 

Is this a new POC, or was it working fine earlier? In my experience, JSON is the default format, and switching to the Avro format requires a PS engagement.

Please get in touch with your AM to arrange that.

Regards,

Sushil Kumar