mannanrehbari
Contributor

Kafka Avro Decoding

Hello,

I am facing an issue while decoding Avro-encoded messages.
Here is some information about my application and the problem:

  • I am using Kafka as a target endpoint.
  • The message format is Avro without compression.
  • The metadata messages are read from the metadata topic.
  • We use the Avro schemas to generate Java classes with the avro-maven-plugin, which allows easy mapping to POJOs using MapStruct for further processing within our app.
  • At application startup we instantiate a Schema and a SpecificDatumReader for each type of message to decode.
    When a message is received, its Schema is looked up and a SpecificDatumReader is instantiated; the SpecificDatumReaders could be cached once I have debugged the current issue (see the sketch after this list).
  • The decoder provided with the SDK works fine when the length of the byte[] array to be decoded is <= 8192.
  • I see that DecoderFactory provides a method, configureDecoderBufferSize(int size).
    When I use it to expand the buffer, decoding works fine on the first Kafka message received, but decoding of subsequent Kafka messages fails for a given schema.
  • The SDK that I have access to uses avro-tools v1.8.1.
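
For reference, here is a minimal sketch of the reader lookup and caching described above. The class name, the lookup key, and the register/decode methods are placeholders, and the decode call uses Avro's DecoderFactory directly purely for illustration; our real code goes through the SDK's decoder.

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificRecord;

// Placeholder sketch: readers are created once per schema and looked up by a
// key such as the source table name when a Kafka record arrives.
public final class AvroMessageDecoders {

    private final Map<String, SpecificDatumReader<SpecificRecord>> readers =
            new ConcurrentHashMap<>();

    // Called at application startup for every schema read from the metadata topic.
    public void register(String messageType, Schema schema) {
        readers.put(messageType, new SpecificDatumReader<>(schema));
    }

    // Called per Kafka record; messageBytes comes from record.value().
    public SpecificRecord decode(String messageType, byte[] messageBytes) throws IOException {
        SpecificDatumReader<SpecificRecord> reader = readers.get(messageType);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(messageBytes, null);
        return reader.read(null, decoder);
    }
}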

I would appreciate any comments/feedback on this issue. 

3 Replies
john_wang
Support

Hello @mannanrehbari ,

Welcome to the Qlik Community forum, and thanks for reaching out here!

You need to use the Qlik Replicate AvroDecoderSDK, for example QlikReplicate_2023.11.0.468_AvroDecoderSDK.zip for Replicate 2023.11.

Hope this helps.

John.

mannanrehbari
Contributor
Author

Hey @john_wang ,

Thank you for your response. That is the exact SDK we are using to decode the messages. As mentioned, it works fine until the length of record.value() exceeds 8192 bytes; it fails on larger messages. Is it expected to decode larger messages by default?

The SDK internally uses Avro's DecoderFactory, which has DEFAULT_BUFFER_SIZE = 8192:

BinaryDecoder messageDecoder = DecoderFactory.get().binaryDecoder(messageBytes, null);

This call then invokes the configureSource() method within the BinaryDecoder class with the hard-coded value above.
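
For completeness, here is a minimal sketch of the workaround I am experimenting with, assuming the raw payload from record.value() is decoded with a dedicated factory outside the SDK; the class name and the maxMessageBytes parameter are placeholders.

import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificRecord;

// Placeholder sketch: decode the raw payload with a dedicated DecoderFactory whose
// buffer is sized for the largest expected message, instead of the shared factory.
public final class LargeMessageDecoder {

    private final SpecificDatumReader<SpecificRecord> reader;
    private final DecoderFactory factory;

    public LargeMessageDecoder(Schema schema, int maxMessageBytes) {
        this.reader = new SpecificDatumReader<>(schema);
        // configureDecoderBufferSize() applies to decoders created by this factory
        // instance; the shared DecoderFactory.get() cannot be reconfigured.
        this.factory = new DecoderFactory().configureDecoderBufferSize(maxMessageBytes);
    }

    public SpecificRecord decode(byte[] messageBytes) throws IOException {
        // A fresh decoder per message keeps the sketch free of decoder-reuse state;
        // the previous decoder can be passed as the second argument to recycle buffers.
        BinaryDecoder decoder = factory.binaryDecoder(messageBytes, null);
        return reader.read(null, decoder);
    }
}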

john_wang
Support

Hello @mannanrehbari ,

Thank you so much for the detailed information. Please raise a support ticket and provide a code sample; our support team will be more than happy to help you further.

Regards,

John.
