
Anonymous
Not applicable

Storing Data from Kafka Input in smaller files

Hi,

 

In my use case, I am receiving data in Kafka 24/7 in EBCDIC format. I read the data as a byte array from tKafkaInput and parse it with tHMap. The output of tHMap goes to a tJavaRow as a byte array, and the output of the tJavaRow is stored via tHDFSOutput. An initial tJava component creates the filename with a date-timestamp.

 

Here are my challenges

1) My data is being stored as a single file in HDFS. I need it split into smaller files, because I may need to read the data from morning to afternoon (or up to any point in time). With everything in a single file, I am unable to fetch just the range I need.

2) I tried updating the filename in the tJavaRow (that is the reason for inserting the component there), but I am unable to change the file name.
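One common way to get smaller files is to derive the file name from a time bucket instead of a single job-start timestamp, so every hour (or whatever granularity you choose) maps to its own HDFS path. A minimal sketch in plain Java, assuming a hypothetical base directory and naming scheme (`baseDir`, `ebcdic_` prefix are illustrative, not from the original job):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class RollingFileName {
    // One bucket per hour; change the pattern for finer/coarser splits.
    private static final DateTimeFormatter BUCKET =
        DateTimeFormatter.ofPattern("yyyyMMdd_HH");

    // Every timestamp inside the same hour yields the same file name,
    // so data rolls to a new file only when the hour changes.
    public static String forTime(String baseDir, LocalDateTime ts) {
        return baseDir + "/ebcdic_" + ts.format(BUCKET) + ".dat";
    }

    public static void main(String[] args) {
        LocalDateTime t = LocalDateTime.of(2024, 1, 15, 9, 42, 0);
        System.out.println(forTime("/data/kafka", t));
    }
}
```

In a Talend job this value would typically be put into `globalMap` and referenced from the tHDFSOutput "File Name" field; note, however, that most output components resolve the file name once when they open the file, which is why changing it per-row inside tJavaRow has no effect.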

 

I need some suggestions on how to get this done.

 

Below is the flow, recreated for clarity.

0683p000009Lupt.png


4 Replies
vapukov
Master II

Just a question: if the only reason for splitting is timed access to the data, why not query over the HDFS files directly?

- SQL

- Hive

- Drill


Anonymous
Not applicable
Author

I am expecting around 100 GB of data per day, so I am storing the data in HDFS with a partition. But only the file created initially is used to store the data, despite the date partition being created at runtime in the tJavaRow component.

Anonymous
Not applicable
Author

Any suggestions?

vapukov
Master II

Looking at your original screenshot, something is wrong in two of the components.

 

This example is not for HDFS, but it works the same way:

0683p000009Lu6W.png

 

You must define the variable before the component starts working; in my case that is tJavaFlex3, and the result will be:

0683p000009LuaL.png
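The pattern described above can be sketched in plain Java. The key point is ordering: the file name is computed and published (via `globalMap` in Talend) in the *begin* part of a tJavaFlex, i.e. once, before any rows flow and before the output component opens its file. The map and the key name `"currentFile"` are stand-ins for illustration, not names from the original job:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;

public class TJavaFlexPattern {
    // Stand-in for Talend's job-level globalMap.
    static Map<String, Object> globalMap = new HashMap<>();

    public static void main(String[] args) {
        // --- tJavaFlex "Start code": runs once, before rows flow ---
        String stamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
        globalMap.put("currentFile", "/data/kafka/out_" + stamp + ".dat");

        // --- downstream output component's "File Name" field would then
        //     reference: (String) globalMap.get("currentFile") ---
        String fileName = (String) globalMap.get("currentFile");
        System.out.println(fileName);
    }
}
```

If the variable is set in the main (per-row) part instead, the output component has already resolved its file name, which matches the behavior reported in the question.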