Anonymous
Not applicable

How to write to multiple files within same big data sparkstreaming job

My use case is to stream from either Kafka or Kinesis (we are on AWS), buffer windows of data at one-minute intervals, and then write the results of each one-minute buffer to either HDFS or S3. The catch is that each buffer must be written to a separate file.
Is this possible with a Talend Spark big data streaming job?
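For reference, native Spark Streaming handles this case with `DStream.saveAsTextFiles(prefix)`, which writes each micro-batch to its own timestamped output directory. The per-window naming pattern itself can be sketched in plain Python (this is an illustration only, not the Talend or Spark API; the function names and file layout here are hypothetical):

```python
import os
import tempfile
from collections import defaultdict
from datetime import datetime, timezone

def window_start(epoch_seconds, window_seconds=60):
    """Align a timestamp to the start of its one-minute window."""
    return epoch_seconds - (epoch_seconds % window_seconds)

def write_windows(records, out_dir, window_seconds=60):
    """Group (timestamp, payload) records into fixed windows and
    write one file per window, named after the window's start time.

    Returns the list of files written, in window order.
    """
    buffers = defaultdict(list)
    for ts, payload in records:
        buffers[window_start(ts, window_seconds)].append(payload)

    written = []
    for start in sorted(buffers):
        stamp = datetime.fromtimestamp(start, tz=timezone.utc).strftime("%Y%m%d-%H%M%S")
        path = os.path.join(out_dir, f"window-{stamp}.txt")
        with open(path, "w") as f:
            f.write("\n".join(buffers[start]))
        written.append(path)
    return written

# Example: three records spanning two one-minute windows -> two files.
records = [(60, "a"), (90, "b"), (120, "c")]
out_dir = tempfile.mkdtemp()
files = write_windows(records, out_dir)
```

With S3 as the sink, each window would land under its own timestamped key prefix instead of a local path, which is exactly the "one file (or directory) per buffer" behavior asked about.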
1 Reply
Anonymous
Not applicable
Author

Hi,
Sorry for the delay!
We have forwarded your issue to the Talend Big Data experts and will get back to you as soon as we can.
Best regards
Sabrina