Anonymous
Not applicable

Problem switching from tPostgresqlInput to tJDBCInput

Hi all,

 

I am trying to read a PostgreSQL database (version 9.6) through the tJDBCInput component (I am using TOS Data Integration 6.3).

The table I am trying to read has approximately 5 million rows.

 

When I use the native tPostgresqlInput component, I have no problem reading my table and inserting the rows into another table (via a tJDBCOutput linked to the PostgreSQL DB):

0683p000009LzDf.png

 

However, when I use a tJDBCInput component I cannot read the same table: the job stays in "Starting" and after a few minutes I get an OutOfMemoryError (increasing Xms and Xmx doesn't solve the problem):

0683p000009LzDk.png

 

In both of my Input components I ticked "Use cursor" (cursor size = 10000 rows).

It seems that the cursor is not taken into account by the tJDBCInput component.
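For reference, here is a minimal sketch of what the PostgreSQL JDBC driver (pgjdbc) requires before it will stream rows through a server-side cursor: autocommit disabled, a forward-only result set, and a positive fetch size. If any of these is missing, the driver buffers the entire result set in memory, which would match the OutOfMemoryError above. The class, table, and connection details below are placeholders, not taken from the job in question:

```java
import java.sql.*;

public class PgCursorRead {
    // defaultRowFetchSize is a documented pgjdbc URL parameter that sets the
    // default fetch size for every statement on the connection.
    // The base URL passed in is a placeholder, not a real server.
    static String cursorUrl(String baseUrl, int fetchSize) {
        return baseUrl + "?defaultRowFetchSize=" + fetchSize;
    }

    // Sketch of the plain-JDBC cursor pattern. Requires a live connection,
    // so it is illustrative only: autocommit OFF, forward-only ResultSet,
    // positive fetch size. Otherwise pgjdbc loads all rows into memory.
    static void readBigTable(Connection conn) throws SQLException {
        conn.setAutoCommit(false);                    // required for cursor mode
        try (Statement st = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            st.setFetchSize(10_000);                  // rows fetched per round trip
            try (ResultSet rs = st.executeQuery("SELECT * FROM big_table")) {
                while (rs.next()) {
                    // process one row at a time instead of materializing all rows
                }
            }
        }
    }

    public static void main(String[] args) {
        // No database needed here: just show the URL the sketch would use.
        System.out.println(cursorUrl("jdbc:postgresql://localhost:5432/mydb", 10_000));
    }
}
```

If the generic tJDBCInput code path does not disable autocommit, ticking "Use cursor" would have no effect with this driver, which could explain the difference from tPostgresqlInput.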

 

Any idea what could be causing the problem?

 

Thank you in advance for your help!

 

4 Replies
Jesperrekuh
Specialist

The tPostgresqlInput component probably sends the correct additional parameters (parameter names), and I assume the generic JDBC connection parameters are different. Check the PostgreSQL driver's parameter options and add them to your generic JDBC connection URL. You might also run into data-type issues with PostgreSQL.

You are still using a PostgreSQL driver (which is absolutely fine), so why do you want to switch? Could you elaborate, please?
Anonymous
Not applicable
Author

The class name and driver are correct in my JDBC component (the same as in the tPostgresqlInput component). I tried adding setAutoCommit(false) to my JDBC URL, but it doesn't work.
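One note on that attempt: as far as I know, setAutoCommit is a method on java.sql.Connection, not a pgjdbc URL parameter, so appending it to the URL is simply ignored by the driver. The fetch size, by contrast, can be set in the URL with the documented defaultRowFetchSize parameter, along the lines of (host and database names are placeholders):

```
jdbc:postgresql://host:5432/mydb?defaultRowFetchSize=10000
```

Even with that parameter, cursor-based fetching still requires autocommit to be disabled on the connection itself.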

 

In fact, I normally use contexts in my JDBC components so that I can easily change the database type (one of my clients uses PostgreSQL, another will use HyperFileSQL) without rewriting all my jobs.

 

 

Anonymous
Not applicable
Author

I should point out that if I fetch only the first 20000 rows, for example, it works fine.

Anonymous
Not applicable
Author

Can someone help me?

It seems the problem comes from the "huge" dataset I am reading...

Thank you!