How to efficiently load 30 million rows from Kafka into PostgreSQL with TOS?

Hi Team,
I have a scenario where I want to load 30 million rows of data from Kafka into PostgreSQL. The job is going to be quite straightforward, with a few filters and a tMap.
I would like to know:
1. What are the best TOS configuration settings for this scenario?
2. How can I reduce CPU/memory consumption?
3. What is the most efficient way to load this much data with TOS? Should I do it in one go or in batches?
4. What would be a typical time to load this volume of data with TOS?
Thanks in advance.
Rera
1 Reply

Hi Rera,
You wrote: "I have a scenario where I want to load 30 million rows of data from Kafka into PostgreSQL. My job is going to be quite straightforward with a few filters and a tMap."

tMap is a caching component and can consume a lot of memory. For a large data set, configure tMap to store its temporary data on disk instead of in memory.
There is also a "Use Batch Size" option in tPostgresqlOutput that activates batch mode, so rows are written to PostgreSQL in batches rather than one at a time.
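Not a Talend-specific example, but to illustrate the pattern that batch mode enables, here is a minimal sketch of batched inserts from a Kafka topic into PostgreSQL using the plain Kafka consumer and JDBC APIs. The broker address, topic, table and column names are placeholder assumptions, and a Talend job generates its own code, so treat this only as a sketch of the idea:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaToPostgresBatchSketch {
    public static void main(String[] args) throws Exception {
        // Kafka consumer setup; broker, group id and topic are placeholders.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "kafka-to-postgres-demo");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");      // acknowledge Kafka only after the DB commit
        props.put("max.poll.records", "10000");        // cap how many records are held in memory per poll

        // Requires the PostgreSQL JDBC driver on the classpath; URL and credentials are placeholders.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {

            consumer.subscribe(Collections.singletonList("my_topic"));
            conn.setAutoCommit(false);                 // commit once per batch, not once per row

            // Hypothetical target table with two text columns.
            String sql = "INSERT INTO target_table (msg_key, payload) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                final int batchSize = 10000;           // plays the role of "Use Batch Size"
                int pending = 0;

                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> rec : records) {
                        ps.setString(1, rec.key());
                        ps.setString(2, rec.value());
                        ps.addBatch();
                        if (++pending >= batchSize) {
                            ps.executeBatch();         // one round trip for the whole batch
                            conn.commit();
                            consumer.commitSync();     // offsets committed only after the rows are safe
                            pending = 0;
                        }
                    }
                    if (records.isEmpty() && pending > 0) {
                        ps.executeBatch();             // flush a partial batch when the topic goes quiet
                        conn.commit();
                        consumer.commitSync();
                        pending = 0;
                    }
                }
            }
        }
    }
}

The point is simply that a batch size in the thousands keeps memory bounded while avoiding a commit per row; the same trade-off applies when you tune "Use Batch Size" in the Talend component.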
What does your filter look like? What is your current row rate (rows/s)? Does that look like a normal speed?
Please also take a look at the documentation: TalendHelpCenter: Exception outOfMemory.
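If you do hit OutOfMemory errors, the usual first step is to give the job's JVM more heap. In Talend Open Studio this can be set in the Run view under Advanced settings (JVM arguments); the values below are only a starting point and should be adjusted to your machine and data volume:

-Xms1024m
-Xmx4096m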
Best regards
Sabrina