sensiva
Contributor

tKafkaInput Advanced settings and error handling

Hello,

 

I am currently working with the tKafka components. Everything works in the success scenario, but I am facing a lot of issues with error handling. For example, in the Advanced settings I set ack=all and retries=10, but neither seems to be taken into account. I tested a scenario where I brought down a broker with ack=all; instead of failing the transaction, the Kafka component simply continued with the next message and finished successfully, and I lost messages during that transaction.
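For reference, here is a minimal sketch (not the Talend-generated code) of how these two settings behave with the plain Kafka Java producer API, assuming Kafka 0.9+; the broker addresses and topic name are hypothetical placeholders, and note that acks and retries are producer-side settings in Kafka itself:

// Minimal sketch: plain Kafka Java producer with acks=all and retries=10.
// With acks=all a send only fails when fewer than min.insync.replicas
// replicas acknowledge it, so losing one broker of a replicated topic
// can still succeed silently unless min.insync.replicas is raised.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AcksAllSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092"); // placeholder
        props.put(ProducerConfig.ACKS_CONFIG, "all");    // wait for all in-sync replicas
        props.put(ProducerConfig.RETRIES_CONFIG, "10");  // retry transient send failures
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Blocking on get() surfaces a failed acknowledgement as an exception
            // instead of letting the job move on to the next message.
            producer.send(new ProducerRecord<>("my_topic", "key", "value")).get(); // "my_topic" is hypothetical
        }
    }
}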

 

Is this a bug, or are there other ways of configuring these additional properties?

 

Some pointers on this topic would be really helpful.

 

Thanks

5 Replies
Anonymous
Not applicable

Hello,

For further information about the properties you can define in the Kafka consumer configuration, please see the consumer configuration section of the Kafka documentation: http://kafka.apache.org/documentation.html#consumerconfigs.
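As an illustration, here is a minimal sketch of how the keys listed under "Consumer Configs" in that documentation map onto the 0.9+ Java consumer API; the broker address, group id and topic name are hypothetical placeholders:

// Minimal sketch: standalone 0.9+ Kafka consumer using properties from the
// "Consumer Configs" section of the Kafka documentation.
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");  // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my_group");               // placeholder
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");        // commit only after processing
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my_topic"));       // hypothetical topic
            ConsumerRecords<String, String> records = consumer.poll(1000);   // fetch one batch
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.value());
            }
            consumer.commitSync();  // acknowledge offsets only once records are handled
        }
    }
}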

Best regards

Sabrina

 
Anonymous
Not applicable

Hi,

 

I am getting the following error: "Exception in component tKafkaInput_1
kafka.common.MessageSizeTooLargeException: Found a message larger than the maximum fetch size of this consumer on topic"

 

I have increased my fetch size as shown in the attached screenshot. Please assist.
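For reference, a minimal sketch assuming the old 0.8 high-level consumer (the API that throws kafka.common.MessageSizeTooLargeException); the ZooKeeper address and group id are placeholders, and fetch.message.max.bytes must be at least as large as the biggest message the broker accepts (message.max.bytes on the broker, or max.message.bytes on the topic):

// Minimal sketch: 0.8 high-level consumer with an enlarged fetch size.
import java.util.Properties;
import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.javaapi.consumer.ConsumerConnector;

public class FetchSizeSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "zkhost:2181");     // placeholder ZooKeeper address
        props.put("group.id", "my_group");                 // placeholder consumer group
        props.put("fetch.message.max.bytes", "10485760");  // 10 MB per-partition fetch limit

        ConsumerConnector connector =
            Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
        // ... create message streams and iterate over them as usual ...
        connector.shutdown();
    }
}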

 


kafka.PNG
sensiva
Contributor
Author

I see the properties being defined on the Kafka node. Are they in line with the properties defined on the Kafka server? Maybe that would be a point to look at.
Anonymous
Not applicable

Hello @jibe

We see that this topic is set as resolved.

https://community.talend.com/t5/Design-and-Development/tkafkaInput-Advanced-settings/m-p/142543

Is this solution, "clear your additional Kafka properties as in your attached config and try using the fetch.message.max.bytes parameter", OK with you?

Best regards

Sabrina

Anonymous
Not applicable

Hello @sensiva

Which Kafka API version are you using? The Kafka API changed between 0.8 and 0.9.
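For example, a compact sketch of the property-name difference between the two consumer APIs (all values are placeholders): the 0.8 high-level consumer uses fetch.message.max.bytes, while the 0.9+ consumer uses max.partition.fetch.bytes for the same purpose.

// Minimal sketch: the same fetch-size intent expressed for each API generation.
import java.util.Properties;

public class ConsumerApiDiffSketch {
    public static void main(String[] args) {
        // 0.8 high-level consumer: ZooKeeper-based, per-message fetch limit.
        Properties oldApi = new Properties();
        oldApi.put("zookeeper.connect", "zkhost:2181");      // placeholder
        oldApi.put("group.id", "my_group");                  // placeholder
        oldApi.put("fetch.message.max.bytes", "10485760");   // 10 MB

        // 0.9+ consumer: broker-based, per-partition fetch limit.
        Properties newApi = new Properties();
        newApi.put("bootstrap.servers", "broker1:9092");     // placeholder
        newApi.put("group.id", "my_group");
        newApi.put("max.partition.fetch.bytes", "10485760"); // 10 MB

        System.out.println("0.8 keys:  " + oldApi.keySet());
        System.out.println("0.9+ keys: " + newApi.keySet());
    }
}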

Best regards

Sabrina