reapeleg
Contributor
Extracting a Kafka "stream" with TOS Big Data

Hi,

I need to implement continuous consumption (extraction) of Kafka events into a target DB.

The event rate is low and should not exceed a few to a few dozen per second (at peaks).

But this steady flow of events will continue indefinitely.

Would you attempt this with TOS Big Data at all, or go straight to a licensed streaming big-data or ESB solution?

The latter two feel to me like overcomplication for such a modest flow of events...

There are two earlier discussions about doing this with TOS Big Data:

One poster seems to have successfully implemented tLoop over tKafkaInput here:

https://community.talend.com/s/question/0D53p00007vCsxRCAS/problem-with-running-tkafkainput-in-a-loo...

The other seems to have struggled with tInfiniteLoop over tKafkaInput and was advised to move the Kafka extraction into a child job to make it work:

https://community.talend.com/s/question/0D53p00007vCnPJCA0/how-to-rerun-tkafkainput-component-in-a-j...
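For what it's worth, the pattern both threads describe (an outer loop that repeatedly runs a Kafka read and writes the batch to the target DB) can be sketched outside Talend. This is a minimal illustration only, not the TOS implementation: the `StubConsumer` class is a hypothetical stand-in for a real Kafka consumer (e.g. one returned by a client library), and sqlite3 stands in for the target DB.

```python
# Sketch of the loop that tInfiniteLoop + tKafkaInput (in a child job)
# approximates: each iteration polls one batch and commits it to the DB.
import sqlite3
from collections import deque

class StubConsumer:
    """Hypothetical stand-in for a Kafka consumer; poll() returns a batch."""
    def __init__(self, events):
        self._events = deque(events)

    def poll(self, max_records=10):
        batch = []
        while self._events and len(batch) < max_records:
            batch.append(self._events.popleft())
        return batch

def run_extraction(consumer, db, max_iterations=None):
    """Continuous consumption loop: one iteration = one 'child job' run.

    In a real deployment max_iterations would be None (run forever);
    it is bounded here so the sketch terminates.
    """
    db.execute("CREATE TABLE IF NOT EXISTS events (payload TEXT)")
    iterations = 0
    while max_iterations is None or iterations < max_iterations:
        batch = consumer.poll()
        db.executemany("INSERT INTO events (payload) VALUES (?)",
                       [(e,) for e in batch])
        db.commit()  # commit per batch, like a child job completing
        iterations += 1

# Demo: 25 events drained over 5 poll iterations (10 + 10 + 5 + 0 + 0).
consumer = StubConsumer([f"event-{i}" for i in range(25)])
db = sqlite3.connect(":memory:")
run_extraction(consumer, db, max_iterations=5)
count = db.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 25
```

The child-job advice in the second thread maps to the body of the loop here: keeping the poll-and-write step as its own unit means each iteration starts from a clean state, which is presumably why it worked where the inline tInfiniteLoop did not.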

Appreciate your thoughts on this!

TIA

Rea
