Hi all,
I am trying to read a PostgreSQL database (version 9.6) through the tJDBCInput component (I am using TOS Data Integration 6.3).
The table I am trying to read has approximately 5 million rows.
When I use the native tPostgresqlInput component, I have no problem reading my table and inserting the rows into another table (tJDBCOutput linked to a PostgreSQL DB).
However, when I use a tJDBCInput component, I cannot read the same table: the job stays in "Starting" and after a few minutes I get an OutOfMemoryError (increasing Xms and Xmx doesn't solve the problem).
In both Input components I ticked "Use cursor" (cursor size = 10000 rows).
It seems that the cursor is not taken into account by the tJDBCInput component.
Any idea what can cause the problem?
Thank you in advance for your help!
The class name and driver are correct in my JDBC component (the same as in the PostgreSQL component). I tried adding setAutoCommit(false) to my JDBC URL, but it doesn't work.
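For reference, the PostgreSQL JDBC driver only streams results through a server-side cursor when autoCommit is disabled on the Connection object (it is a method call, not a URL parameter) and a positive fetch size is set on the statement; otherwise the whole result set is buffered in memory, which matches the OutOfMemoryError symptom. A minimal sketch of what the generated code would need to do (the host, database, credentials, and table name below are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CursorReadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust host, database, credentials.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");

        // Cursor-based fetching requires autoCommit off; setting it in the
        // JDBC URL has no effect with the PostgreSQL driver.
        conn.setAutoCommit(false);

        Statement st = conn.createStatement();
        st.setFetchSize(10000); // fetch 10000 rows per round trip

        ResultSet rs = st.executeQuery("SELECT * FROM my_big_table");
        while (rs.next()) {
            // Process one row at a time instead of holding
            // all 5 million rows in memory.
        }
        rs.close();
        st.close();
        conn.commit();
        conn.close();
    }
}
```

If the generic tJDBCInput component never calls setAutoCommit(false) on its connection, that would explain why the "Use cursor" option works with tPostgresqlInput but not here.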
In fact, I normally use contexts in my JDBC components so that I can easily change the database type (one of my clients uses PostgreSQL, another will use HyperFileSQL) without rewriting all my jobs.
I should point out that if I fetch only the first 20000 rows, for example, it works fine.
Can someone help me?
It seems the problem comes from the "huge" dataset I am reading...
Thank you!