Hi!
I'm migrating a huge table (460,000 records with attachments), which (expectedly) causes OOM errors when using batch mode.
The problem is that if I activate the "Enable Stream" option in the tMysqlInput advanced settings, Talend starts executing inserts before creating the table in PostgreSQL, which I don't expect.
My job has only 3 objects:
tMySqlInput => tConvertType => tDbOutput
All schemas are correctly configured, and tDbOutput has the "Create table" and "Insert" options set.
How can I deal with this "behavior"?
Thanks,
Edson
@ecerichter, check the link below. Also, which version of MySQL are you using?
Select this check box to enable streaming over buffering, which allows the code to read from a large table without consuming a large amount of memory, in order to optimize performance.
This check box is available only when MySQL 4 or MySQL 5 is selected from the DB Version drop-down list.
https://help.talend.com/reader/NNO~fmVQU4rlkF9Depfdxw/3Fqzte0UWXDOcBxURdu9gg
Yes, I've read the docs.
The problem is not where to activate the option. The problem is that when the option is activated, the table is not created before the inserts start...
But I've now discovered a workaround: I had to disable the "batch" operation in the PostgreSQL output. So, for now, I'm answering my own question :-).
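For anyone hitting the same two issues, the underlying pattern can be sketched outside Talend. This is a minimal Python/sqlite3 illustration, not the Java that Talend actually generates: the table names, row counts, and the 100-row commit interval are made-up assumptions. It shows the two things the job needs in order: (1) create and commit the target table before any insert runs, and (2) iterate the source cursor row by row instead of loading everything into memory, which is what "Enable Stream" does for MySQL.

```python
import sqlite3

# In-memory databases stand in for the MySQL source and PostgreSQL target.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

# Hypothetical source table with 1,000 rows (the real job has ~460,000).
src.execute("CREATE TABLE docs (id INTEGER, body TEXT)")
src.executemany("INSERT INTO docs VALUES (?, ?)",
                [(i, "attachment-%d" % i) for i in range(1000)])
src.commit()

# Step 1: create the target table and commit BEFORE any insert runs,
# mirroring the "Create table" action in tDbOutput.
dst.execute("CREATE TABLE docs (id INTEGER, body TEXT)")
dst.commit()

# Step 2: stream rows instead of fetching them all at once.
# Iterating the cursor pulls rows incrementally (like "Enable Stream"),
# avoiding the fetchall() that would hold the whole table in memory.
cur = src.execute("SELECT id, body FROM docs")
batch = []
for row in cur:
    batch.append(row)
    if len(batch) >= 100:  # small, assumed commit interval
        dst.executemany("INSERT INTO docs VALUES (?, ?)", batch)
        dst.commit()
        batch.clear()
if batch:  # flush the final partial batch
    dst.executemany("INSERT INTO docs VALUES (?, ?)", batch)
    dst.commit()

count = dst.execute("SELECT COUNT(*) FROM docs").fetchone()[0]
print(count)  # 1000
```

The ordering bug in the original job corresponds to step 1 landing after step 2 begins; disabling batch mode in the output forces each statement (including the CREATE TABLE) to execute in order, which is why the workaround above fixes it.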