Hi,
I'm trying to consume a Kafka feed that outputs AVRO.
I get the following error when using the attached .avsc file (eventtaggs-event-v6.txt, attached as .txt) with tKafkaInputAVRO:
java.lang.UnsupportedOperationException: MAP not supported ...
Similarly if I use tKafkaInput -> tHMapRecord I get:
java.lang.IllegalArgumentException: Unsupported type for node {"type":"map","values":"long"}
at org.talend.transform.avro.AvroUtils.toAvroValue(AvroUtils.java:273)
at org.talend.transform.avro.AvroUtils.toAvroValue(AvroUtils.java:207)
I think the feed provider uses Confluent to serialize/deserialize their data, with zstd
compression - any suggestions?
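For reference, my understanding is that a plain Confluent Avro consumer would be configured roughly like this (the broker, group id, and registry URL below are placeholders, not the real values) - but I don't know where, or whether, these properties can be set in the tKafkaInput advanced settings:

```properties
bootstrap.servers=broker1:9092
group.id=my-consumer-group
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
schema.registry.url=http://schema-registry:8081
# false -> deserialize to GenericRecord rather than a generated class
specific.avro.reader=false
```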
I've attached the example JSON output that I should be reading (example_Event_aggs_output.txt). I only want to pull a few specific fields out of each Avro record from the Kafka stream and load them into a SQL Server database.
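To show the kind of extraction I mean, here is a rough sketch (in Python just for illustration - in Talend this would presumably be a tJavaRow or tMap step). The field names below are made up, not taken from my real schema; the point is flattening the problematic "map" field into plain columns:

```python
import json

# Hypothetical record shaped like the decoded Avro output: scalar fields
# plus a "map" field (string -> long), which is what tKafkaInputAVRO
# rejects. These names are illustrative only.
record = json.loads("""
{
  "eventId": "abc-123",
  "eventTime": 1610000000000,
  "tags": {"clicks": 42, "views": 7}
}
""")

# Keep only the specific fields needed for the SQL Server load,
# flattening known map entries into ordinary columns.
row = {
    "event_id": record["eventId"],
    "event_time": record["eventTime"],
    "clicks": record["tags"].get("clicks", 0),
    "views": record["tags"].get("views", 0),
}
print(row)
```

If I could get the raw records out of Kafka in any usable form, extraction like this would be enough for my needs.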
Talend Studio
Version: 7.1.1
Build id: 20201210_0723-patch
Regards,
John
I saw these pages about using the Confluent deserializer: https://docs.confluent.io/platform/current/schema-registry/serdes-develop/index.html and https://docs.confluent.io/platform/current/schema-registry/serdes-develop/serdes-avro.html
but I'm not sure how to use it within Talend Studio.