I created a simple test job to exercise a basic transformation.
I have defined two DB connections as Generic JDBC, pointing at two different Oracle 11 databases.
I can connect and pull up two identical tables, one on each DB.
The job just moves data from table A on DB1 to table B on DB2.
I define the job fine and use automap to map the columns of table A to table B; the columns are defined and named identically.
I run the job with no data in table A or table B and it runs in a few milliseconds.
I populate table A with 10000 rows and I run the job.
I get an ORA-00928: missing SELECT keyword in the execution log window.
No rows move.
Any ideas, or is this another bug in 3.1?
It's a simple mapping - I used automap within tMap to map each column in table A to table B.
I don't have anything in the expression editor, and I didn't write any SQL.
The job is defined as:
tJDBCInput - Oracle 11 table A input in DB1
tMap - automap for all columns
tJDBCOutput - Oracle 11 table B output in DB2
If I turn on tracing, I see the data coming from the source table A and displaying under the target table B.
Then I get the ORA-00928 error for every row processed...
Well, the error message isn't ambiguous: you have to have some SQL query.
Like here (in Built-In mode).
And all the "selected" fields should map to the schema defined.
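For reference, here is a minimal sketch of the kind of Built-In query tJDBCInput expects (the table and column names below are placeholders, not taken from the actual job). One thing worth double-checking: with Oracle's JDBC driver, a trailing semicolon in the statement text typically raises an error, so the query should end without one.

```sql
-- Placeholder names; substitute table A's real columns.
-- Note: no trailing semicolon when the query is sent through JDBC.
SELECT col1, col2, col3
FROM table_a
```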
If you pick automap, it generates a select of all columns from the source table. I checked the SELECT in the Component tab and it's fine, unless TOS has a query size limit:
select col1,col2,etc... from table 1;
If you tell me how to post an image in my reply, I'll post the query component. Thanks for your help.
Can someone from the Talend team respond to this thread?
I installed Talend 3.1.1 on Windows; the problem still occurs.
I created the same job using Oracle 10 as my source and target database with the Oracle 10 JDBC driver; the job still fails.
However, I get the following error in my component log when I run the job against Oracle 10.
Is there some inherent size limit in the tJDBCInput code? If so, what is the workaround for large objects?
Starting job LoadOseriesstagingora10 at 15:54 11/06/2009.
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
The code of method tJDBCInput_1Process(Map<String,Object>) is exceeding the 65535 bytes limit
at etms.loadoseriesstagingora10_0_1.LoadOseriesstagingora10.tJDBCInput_1Process(LoadOseriesstagingora10.java:26954)
at etms.loadoseriesstagingora10_0_1.LoadOseriesstagingora10.runJobInTOS(LoadOseriesstagingora10.java:40360)
at etms.loadoseriesstagingora10_0_1.LoadOseriesstagingora10.main(LoadOseriesstagingora10.java:40262)
connecting to socket on port 4004
connected
connecting to socket on port 5135
connected
Job LoadOseriesstagingora10 ended at 15:54 11/06/2009.
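For what it's worth, the 65535-byte limit in the log above is not Talend-specific: the JVM class-file format caps a single method's bytecode at 64 KB, and the generated `tJDBCInput_1Process` method is apparently exceeding it. The standalone sketch below (class and method names are my own, not Talend's) reproduces the same class of failure by asking the compiler API to compile one method with tens of thousands of statements. It needs a JDK, since `ToolProvider.getSystemJavaCompiler()` returns null on a bare JRE.

```java
import java.net.URI;
import java.util.List;
import javax.tools.DiagnosticCollector;
import javax.tools.JavaCompiler;
import javax.tools.JavaFileObject;
import javax.tools.SimpleJavaFileObject;
import javax.tools.ToolProvider;

public class MethodSizeDemo {

    // In-memory source file, so the demo needs no files on disk.
    static class Src extends SimpleJavaFileObject {
        final String code;
        Src(String name, String code) {
            super(URI.create("string:///" + name + ".java"), Kind.SOURCE);
            this.code = code;
        }
        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    // Generates a class whose single method has `statements` assignments,
    // then tries to compile it. Returns true if compilation succeeds.
    static boolean compileBig(int statements) {
        StringBuilder sb = new StringBuilder("class Big { static int f() { int x = 0;\n");
        for (int i = 0; i < statements; i++) {
            sb.append("x = x + ").append(i % 100).append(";\n");
        }
        sb.append("return x; } }\n");

        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler(); // needs a JDK
        DiagnosticCollector<JavaFileObject> diags = new DiagnosticCollector<>();
        List<String> options = List.of("-d", System.getProperty("java.io.tmpdir"));
        boolean ok = compiler.getTask(null, null, diags, options, null,
                List.of(new Src("Big", sb.toString()))).call();
        // On failure the compiler reports something like "code too large".
        diags.getDiagnostics().forEach(d -> System.out.println(d.getMessage(null)));
        return ok;
    }

    public static void main(String[] args) {
        System.out.println("small method compiles: " + compileBig(10));
        System.out.println("huge method compiles:  " + compileBig(30000));
    }
}
```

The practical implication for a job like this is that the fix has to reduce the amount of code generated into one method (for example, a narrower schema or a job split into smaller subjobs), rather than any JDBC setting.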
Hello
From the last two images, I see the error occurs on the sixth row insert; can you check whether there are special characters like '@' in the sixth row?
Best regards
shong