Anonymous
Not applicable

Migration from MySQL to PostgreSQL - enabling stream causes the table not to be created

Hi!

I'm migrating a huge table (460,000 records, with attachments), which (expectedly) causes OOM errors when using batch mode.

The problem is that if I activate the "Enable stream" option in the tMySqlInput advanced settings, Talend starts executing inserts before creating the table in PostgreSQL, which I don't expect.

My job has only 3 objects:

 

tMySqlInput => tConvertType => tDbOutput

 

All schemas are correctly configured, and tDbOutput has the "Create table" and "Insert" actions set.
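
To make the expected ordering concrete, here is roughly what I assume the "Create table" + "Insert" configuration should do, written as a plain JDBC sketch (connection details, table, and column names are made up for illustration; this is not Talend's actual generated code):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class ExpectedOrder {
    public static void main(String[] args) throws Exception {
        // Hypothetical target connection, for illustration only.
        try (Connection pg = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/target", "user", "pass")) {
            // Step 1: the "Create table" action - DDL should run once, up front.
            try (Statement ddl = pg.createStatement()) {
                ddl.execute("CREATE TABLE IF NOT EXISTS my_table (id INT, attachment BYTEA)");
            }
            // Step 2: the "Insert" action - rows should only start flowing afterwards.
            try (PreparedStatement ins = pg.prepareStatement(
                    "INSERT INTO my_table (id, attachment) VALUES (?, ?)")) {
                ins.setInt(1, 1);
                ins.setBytes(2, new byte[0]);
                ins.executeUpdate();
            }
        }
    }
}
```

With "Enable stream" on, the inserts appear to start before step 1 has run.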

 

How should I deal with this behavior?

 

Thanks,

 

Edson


2 Replies
manodwhb
Champion II

@ecerichter, check the link below. Also, which version of MySQL are you using?

Select this check box to enable streaming over buffering, which allows the code to read from a large table without consuming a large amount of memory, in order to optimize performance.

This check box is available only when MySQL 4 or MySQL 5 is selected from the DB Version drop-down list.

 

https://help.talend.com/reader/NNO~fmVQU4rlkF9Depfdxw/3Fqzte0UWXDOcBxURdu9gg
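
For context, that option corresponds to the MySQL Connector/J row-streaming mode at the JDBC level. A rough sketch of what a streaming read looks like in plain JDBC (connection details, query, and table name are placeholders, not Talend's actual generated code):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingRead {
    public static void main(String[] args) throws Exception {
        try (Connection my = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/source", "user", "pass")) {
            // By default the driver buffers the entire result set in client
            // memory, which is what blows up on a 460,000-row table with
            // attachments. Forward-only + read-only + fetch size
            // Integer.MIN_VALUE is the Connector/J-specific signal to stream
            // rows from the server one at a time instead.
            try (Statement stmt = my.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                stmt.setFetchSize(Integer.MIN_VALUE);
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT id, attachment FROM big_table")) {
                    while (rs.next()) {
                        // Process each row as it arrives; nothing is buffered client-side.
                    }
                }
            }
        }
    }
}
```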

Anonymous
Not applicable
Author

Yes, I've read the docs.

The problem is not where to activate the option; it's that when the option is activated, the table is not created before the inserts start...

 

But I've now discovered a workaround: I had to disable the "batch" option in the PostgreSQL output component. So, for now, I'm answering my own question :-).
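
In plain JDBC terms, my understanding of the difference is roughly this (a sketch with made-up connection details and table name; the real behavior lives in Talend's generated code):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchVsRowByRow {
    public static void main(String[] args) throws Exception {
        try (Connection pg = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/target", "user", "pass");
             PreparedStatement ins = pg.prepareStatement(
                     "INSERT INTO my_table (id, attachment) VALUES (?, ?)")) {
            boolean useBatch = false;  // the workaround: batch disabled
            for (int id = 1; id <= 3; id++) {
                ins.setInt(1, id);
                ins.setBytes(2, new byte[0]);
                if (useBatch) {
                    ins.addBatch();       // rows are queued client-side...
                } else {
                    ins.executeUpdate();  // ...versus written immediately
                }
            }
            if (useBatch) {
                ins.executeBatch();       // queued rows flushed in one round trip
            }
        }
    }
}
```

With batching off, every statement executes in order as it is issued, which seems to be why the table exists before the first row arrives.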