Hi,
In my use case, I am receiving data from Kafka 24/7 in EBCDIC. I read the data as a byte array from tKafkaInput and parse it using tHMap. The output of tHMap goes to tJavaRow as a byte array, and the output of tJavaRow is stored via tHDFSOutput. An initial tJava component creates the filename with a date-time stamp.
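For reference, the filename logic in that initial tJava presumably looks something like the sketch below (the context variable hdfsDir and the globalMap key outFile are my assumptions; tHDFSOutput's File Name field would then be set to (String) globalMap.get("outFile")):

```java
// tJava (runs once, before the flow starts): build a timestamped
// target path and share it with the rest of the subjob via globalMap.
// context.hdfsDir and the "outFile" key are assumed names.
String stamp = new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
        .format(new java.util.Date());
globalMap.put("outFile", context.hdfsDir + "/data_" + stamp + ".dat");
```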
Here are my challenges:
1) My data is getting stored as a single file in HDFS. I need it split into smaller files, because I may need to look at the data from morning to afternoon (or up to any point in time). With everything in one file, I am unable to fetch just that slice.
2) I tried updating the filename in tJavaRow (that is the reason for inserting that component), but I am unable to change the file name.
I need some suggestions to get this done.
Below is the flow, recreated for clarity.
Just a question: if the only reason for splitting is access to time-ranged data, why not query over the HDFS files instead (see the Hive sketch after this list)?
- SQL
- Hive
- Drill
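For the Hive option, here is a rough sketch from plain Java over JDBC, assuming an external table named events has been declared over the HDFS directory with an event_time column (the table, column, host, and credentials are all placeholders, not from the original thread):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveTimeRangeQuery {
    public static void main(String[] args) throws Exception {
        // hive-jdbc must be on the classpath for this driver class.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://namenode:10000/default", "hdfs", "");
             Statement stmt = conn.createStatement();
             // Fetch only the morning-to-noon slice; no file splitting needed.
             ResultSet rs = stmt.executeQuery(
                 "SELECT * FROM events WHERE event_time "
               + "BETWEEN '2024-01-15 06:00:00' AND '2024-01-15 12:00:00'")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```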
I am expecting around 100 GB of data to be received per day, so I am storing the data in HDFS with partitions. But only the file created initially is being used to store the data, despite the date partition being created at run time in the tJavaRow component.
Any suggestions?
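For context, the per-row attempt in tJavaRow was presumably something like the sketch below (column and variable names are assumptions). The catch is that tHDFSOutput resolves its File Name expression once, when it opens the stream, so updating the globalMap value per row has no effect on the already-open file:

```java
// tJavaRow (runs once per record): computes a new hourly file name,
// but too late - tHDFSOutput has already opened its file.
String hour = new java.text.SimpleDateFormat("yyyyMMdd_HH")
        .format(new java.util.Date());
globalMap.put("outFile", context.hdfsDir + "/data_" + hour + ".dat");
output_row.content = input_row.content; // pass the bytes through unchanged
```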
Looking at your original screenshot, you have something wrong in two components.
My test is not against HDFS, but it works like this:
You must define the variable before the component starts working; in my case that is tJavaFlex3, and the result will be:
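A minimal sketch of what that tJavaFlex could contain (the row names row1/row2, the context variable outputDir, and the globalMap key fileName are assumptions; the downstream output component's File Name field would be set to (String) globalMap.get("fileName")):

```java
// tJavaFlex "Start code" (runs before the first row, i.e. before the
// output component opens its file): define the variable here.
String stamp = new java.text.SimpleDateFormat("yyyyMMdd_HHmmss")
        .format(new java.util.Date());
globalMap.put("fileName", context.outputDir + "/data_" + stamp + ".dat");

// tJavaFlex "Main code" (runs once per row): just pass the data through.
row2.content = row1.content;

// tJavaFlex "End code": nothing needed for this example.
```

Note that the variable is read only once, when the output component opens, so to get one file per time window the whole subjob has to be re-run per window (for example from a parent job or a scheduler).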