Hi eb
I need to implement continuous consumption (extraction) of Kafka events into a target DB.
The event rate is low and should not exceed a few to a few dozen per second (at peaks).
This steady flow of events will, however, continue indefinitely.
Would you attempt this at all with TOS for Big Data, or go straight to a licensed streaming big data or ESB solution?
The latter two feel to me like over-complication for such a modest flow of events...
In these two earlier discussions about doing this with TOS Big Data:
One user seems to have successfully implemented tLoop over tKafkaInput here:
The other seems to have struggled with tInfiniteLoop over tKafkaInput and was advised to put the Kafka extraction in a child job in order to make it work??
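For context, the pattern both discussions are circling (poll in a loop, batch-insert into the DB, commit per batch) is simple enough to sketch outside Talend. Below is a minimal, hypothetical illustration: `fake_poll` stands in for a real Kafka client's poll call, and sqlite3 stands in for the target DB; the function and table names are my own inventions, not anything from the threads above.

```python
import sqlite3

def fake_poll(batches):
    """Stand-in for a Kafka client's poll(): yields lists of (key, value) records."""
    for batch in batches:
        yield batch

def consume_into_db(conn, poll_iter):
    """Continuously drain batches of records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (key TEXT PRIMARY KEY, value TEXT)")
    total = 0
    for records in poll_iter:
        # Idempotent upsert so redelivered messages don't create duplicate rows.
        conn.executemany(
            "INSERT OR REPLACE INTO events (key, value) VALUES (?, ?)", records)
        conn.commit()  # with a real consumer, commit Kafka offsets here as well
        total += len(records)
    return total

conn = sqlite3.connect(":memory:")
n = consume_into_db(
    conn, fake_poll([[("a", "1"), ("b", "2")], [("a", "1x")]]))
```

With a real client the outer `for` would be an infinite poll loop (which is essentially what tInfiniteLoop + tKafkaInput emulate), so a plain consumer at a few dozen events/second is well within reach without a licensed streaming platform.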
Appreciate your thoughts on this!
TIA
Rea