STelkar1613587356
Contributor

Kafka - Talend - Teradata Huge data processing

I am trying to read a huge amount of data, about 1 million+ messages, from a message streaming service (Kafka). My current methodology is tKafkaInput > tExtractJsonFields > tMap > tTeradataOutput.

When I run my job, I get the following error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

Could someone please suggest what can be done to avoid this issue and handle unexpectedly huge volumes of 5 million messages and upwards?
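A heap-space error in this kind of job usually means records are being accumulated in memory faster than they are written out. Raising the JVM heap (the -Xmx argument under the job's Advanced settings) only postpones the problem when volumes are unpredictable; a bounded-memory design flushes records in fixed-size batches so the buffer never grows with the topic size. Below is a minimal, hypothetical sketch of that batching idea in plain Java (the class name, batch size, and the placeholder flush are assumptions, not Talend's actual generated code; in a real job the flush would be a batched insert or the tTeradataOutput commit):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: flush records to the sink in fixed-size batches
// instead of accumulating the whole topic in memory.
public class BatchedSink {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private int flushedBatches = 0;

    public BatchedSink(int batchSize) {
        this.batchSize = batchSize;
    }

    // Called once per Kafka message (e.g. from the consumer poll loop).
    public void accept(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Placeholder for the real database write (batched INSERT / commit).
    public void flush() {
        if (buffer.isEmpty()) {
            return;
        }
        flushedBatches++;   // stands in for the actual DB round-trip
        buffer.clear();     // frees heap before the next batch arrives
    }

    public int getFlushedBatches() {
        return flushedBatches;
    }
}
```

With a batch size of, say, 10,000, peak heap usage stays constant whether the topic holds 1 million or 5 million messages.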

1 Reply
sagu
Contributor

First write the data to a file, and then bulk-load the file into the table.
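The suggestion above replaces row-by-row inserts with a two-step pipeline: stream the messages to a delimited staging file, then hand that file to a bulk loader (in Talend, for example, tFileOutputDelimited followed by a Teradata bulk-load component). Streaming to a file uses a small, constant amount of heap regardless of volume. A minimal sketch of the staging step, assuming a simple pipe-delimited layout (the class and method names here are illustrative, not part of any Talend API):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch of the staging step: stream rows to a delimited
// file so a bulk loader can ingest it, instead of per-row JDBC inserts.
public class StageToFile {
    // Writes each row on its own line; returns the number of rows written.
    // Rows are written one at a time, so memory use does not grow with input size.
    public static long writeStagingFile(Path target, Iterable<String> rows)
            throws IOException {
        long count = 0;
        try (BufferedWriter out = Files.newBufferedWriter(target)) {
            for (String row : rows) {
                out.write(row);
                out.newLine();
                count++;
            }
        }
        return count;
    }
}
```

Once the file exists, the database's bulk-load utility moves the data into the table far faster than individual inserts, which is why this pattern scales to the 5 million+ range.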