soowork
Contributor III

[resolved] ArrayIndexOutOfBoundsException error when writing data to Oracle

We have a Talend job that has been running nightly without issue for six months. It reads data from a Vertica database and loads it into an Oracle table (straight inserts).
We are using Enterprise Big Data 5.4.1.r.111943 and Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
The error was:
Exception in component tOracleOutput_1
java.lang.ArrayIndexOutOfBoundsException: -32043
at oracle.jdbc.driver.OraclePreparedStatement.setupBindBuffers(OraclePreparedStatement.java:2677)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:9270)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:210)
at pdwserver.j_lcp_dmart_pbssummary_refund_0_1.j_LCP_DMART_PBSSUMMARY_REFUND.tVerticaInput_2Process(j_LCP_DMART_PBSSUMMARY_REFUND.java:23969)
at pdwserver.j_lcp_dmart_pbssummary_refund_0_1.j_LCP_DMART_PBSSUMMARY_REFUND.runJobInTOS(j_LCP_DMART_PBSSUMMARY_REFUND.java:27036)
at pdwserver.j_lcp_dmart_pbssummary_refund_0_1.j_LCP_DMART_PBSSUMMARY_REFUND.runJob(j_LCP_DMART_PBSSUMMARY_REFUND.java:26878)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.tRunJob_2Process(s_LCP_DMART_PBSSUMMARY.java:2746)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.tRunJob_6Process(s_LCP_DMART_PBSSUMMARY.java:2564)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.tRunJob_1Process(s_LCP_DMART_PBSSUMMARY.java:2298)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.tRunJob_8Process(s_LCP_DMART_PBSSUMMARY.java:1730)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.runJobInTOS(s_LCP_DMART_PBSSUMMARY.java:3056)
at pdwserver.s_lcp_dmart_pbssummary_0_1.s_LCP_DMART_PBSSUMMARY.main(s_LCP_DMART_PBSSUMMARY.java:2892)

The "commit every" setting was 1000.  Last night was the first time that we were writing more than 1000 rows.
I read other comments in the forum that suggested people got this kind of error when they wrote their second chunk of data - and this seemed to fit our case (as we were just > 1000 records for the first time).  We tested this theory by bumping the commit level to 1200 and running the job again.  In this mode, the job was successful.
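To make the "second chunk" theory concrete: assuming tOracleOutput flushes its batch with `executeBatch()` at each commit interval (which matches the behaviour described above), the flush sizes can be sketched without any real JDBC. The row counts below are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class CommitEverySketch {
    // Simulates the flush pattern: rows accumulate in a batch and are
    // executed/committed every `commitEvery` rows, with one final flush
    // for the remainder. Returns the size of each flush.
    static List<Integer> flushSizes(int totalRows, int commitEvery) {
        List<Integer> flushes = new ArrayList<>();
        int pending = 0;
        for (int row = 0; row < totalRows; row++) {
            pending++;                    // addBatch() in real JDBC
            if (pending == commitEvery) { // executeBatch() + commit
                flushes.add(pending);
                pending = 0;
            }
        }
        if (pending > 0) flushes.add(pending); // final partial batch
        return flushes;
    }

    public static void main(String[] args) {
        // e.g. 1,050 rows with "commit every" = 1000: a second, smaller
        // executeBatch() runs for the first time once the job exceeds 1000 rows.
        System.out.println(flushSizes(1050, 1000)); // [1000, 50]
        // Raising "commit every" to 1200 means a single batch, which is
        // consistent with the workaround succeeding.
        System.out.println(flushSizes(1050, 1200)); // [1050]
    }
}
```

This also shows why raising the commit interval is only a stopgap: any job whose row count exceeds the interval will again trigger a second `executeBatch()` call.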
What is causing this error to occur?  Obviously we are lucky in this job because we are writing such small amounts of data - but we don't want to just keep changing the commit level to stay greater than our data volume.  And we have other jobs that also write to Oracle that have data volumes that exceed the commit level.  So what is causing this one to have trouble - any ideas?
TIA

Accepted Solutions
Anonymous
Not applicable

Hi,
Are you using the Oracle driver ojdbc6? I suspect this comes from a bug in that driver:
https://jira.talendforge.org/browse/TDI-30221
Try updating the driver used by the Oracle output component. There are several ways to do that; one is to change the driver referenced in the component definition:
<IMPORT NAME="ORACLE_11" REQUIRED_IF="(DB_VERSION == 'ORACLE_11') AND (USE_EXISTING_CONNECTION == 'false') AND (SPECIFY_DATASOURCE_ALIAS == 'false')" UrlPath="platform:/plugin/org.talend.libraries.jdbc.oracle/lib/ojdbc6.jar" MODULE="ojdbc6.jar" BundleID=""/>

Replace "ojdbc6.jar" with another version (of course, it has to be in your library).
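After swapping the jar, it can help to confirm which driver build the job actually picks up by reading the jar's manifest. A small sketch (the jar path is an assumption; point it at the ojdbc jar in your Talend library directory):

```java
import java.util.jar.JarFile;

public class DriverVersion {
    // Reads the Implementation-Version attribute from a jar's manifest,
    // which for ojdbc jars identifies the driver build.
    static String implementationVersion(String jarPath) throws Exception {
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getManifest().getMainAttributes()
                      .getValue("Implementation-Version");
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical path; adjust to your installation.
        System.out.println(implementationVersion("lib/java/ojdbc6.jar"));
    }
}
```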

Hope it helps.
Regards,
Laurent

5 Replies
Anonymous
Not applicable

Hi,
This issue is probably specific to your particular job. Would you mind uploading screenshots of your job design to the forum?
Best regards
Sabrina
soowork
Contributor III
Author

Thanks, Sabrina. Please see the attached job flow. The table being loaded (PBS_TRANSACTION_TMP) is truncated in a previous step.
[Attachment: 0683p000009MBTI.png — job design screenshot]
soowork
Contributor III
Author

Hi Sabrina, do you have any ideas for me?  TIA 
Anonymous
Not applicable

Hi,
We don't see any error in your job design.
For your issue, could you please take a look at: http://stackoverflow.com/questions/24400006/talend-etl-job-error-in-toracleoutput-component?
Best regards
Sabrina