This article explains the steps to create two Talend Data Integration (DI) jobs: one DI job that acts as a Kafka producer and another that acts as a Kafka consumer. Both jobs use the 'Schema Registry' option in the tKafka components.
According to the Confluent documentation, Schema Registry (or Confluent Schema Registry) "provides a centralized repository for managing and validating schemas for topic message data". Refer to the Confluent documentation for more information about Confluent Schema Registry.
Creating a Kafka topic using the Confluent Control Center UI
It is possible to create Kafka topics programmatically or from the command line.
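For reference, a minimal sketch of creating the topic programmatically with Kafka's Java AdminClient is shown below; the broker address, partition count, and replication factor are assumptions suited to a local test cluster.

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateDemoTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Broker address is an assumption; adjust it for your cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Topic 'demo' with 1 partition and replication factor 1.
            NewTopic demo = new NewTopic("demo", 1, (short) 1);
            admin.createTopics(Collections.singleton(demo)).all().get();
        }
    }
}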
In this article's example scenario, however, we use the Confluent Control Center UI to create the Kafka topic. Create a topic called 'demo' using the 'Add topic' option. Refer to the screenshot below.
Set a schema for 'value' within the topic 'demo'. Refer to the screenshot below.
Below is the sample schema used for this example topic; a sketch of registering the same schema programmatically follows it.
{
  "type": "record",
  "namespace": "com.mycorp.mynamespace",
  "name": "sampleRecord",
  "doc": "Sample schema to help you get started.",
  "fields": [
    {
      "name": "id",
      "type": "string"
    },
    {
      "name": "amount",
      "type": "string"
    }
  ]
}
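If you prefer not to use the UI, the same value schema could also be registered with Confluent's Schema Registry Java client. This sketch assumes the registry runs at http://localhost:8081 and that the default TopicNameStrategy is in effect, so the subject for the topic's value schema is 'demo-value'.

import io.confluent.kafka.schemaregistry.avro.AvroSchema;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class RegisterDemoSchema {
    public static void main(String[] args) throws Exception {
        // Registry URL is an assumption; point this at your Schema Registry.
        SchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // The sample schema shown above, as a JSON string.
        String schemaJson =
                "{\"type\": \"record\", \"namespace\": \"com.mycorp.mynamespace\","
              + " \"name\": \"sampleRecord\","
              + " \"doc\": \"Sample schema to help you get started.\","
              + " \"fields\": [{\"name\": \"id\", \"type\": \"string\"},"
              + " {\"name\": \"amount\", \"type\": \"string\"}]}";

        // Register under the subject <topic>-value (default TopicNameStrategy).
        client.register("demo-value", new AvroSchema(schemaJson));
    }
}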
After creating the topic, let's create a Talend DI producer job that publishes messages to the 'demo' topic.
The sample producer job includes a tFixedFlowInput component that defines the values for the ProducerRecord object, a tJavaRow component that creates the ProducerRecord from those values, and finally a tKafkaOutput component that publishes the ProducerRecord to the topic 'demo'. Refer to the screenshots below.
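For illustration, the tJavaRow body could look roughly like the sketch below. It assumes the incoming flow from tFixedFlowInput has String columns 'id' and 'amount', and that the outgoing flow has a single Object column (here called 'record') carrying the ProducerRecord to tKafkaOutput; fully qualified class names are used so no extra imports are needed in the component.

// Parse the value schema of the 'demo' topic (doc field omitted for brevity).
org.apache.avro.Schema valueSchema = new org.apache.avro.Schema.Parser().parse(
      "{\"type\": \"record\", \"namespace\": \"com.mycorp.mynamespace\","
    + " \"name\": \"sampleRecord\","
    + " \"fields\": [{\"name\": \"id\", \"type\": \"string\"},"
    + " {\"name\": \"amount\", \"type\": \"string\"}]}");

// Build an Avro record from the values supplied by tFixedFlowInput.
org.apache.avro.generic.GenericRecord value =
        new org.apache.avro.generic.GenericData.Record(valueSchema);
value.put("id", input_row.id);
value.put("amount", input_row.amount);

// Wrap it in a ProducerRecord targeting the 'demo' topic; tKafkaOutput
// publishes this object when the 'Schema Registry' option is enabled.
output_row.record =
        new org.apache.kafka.clients.producer.ProducerRecord<String, Object>("demo", value);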