Anonymous
Not applicable

[resolved] Talend: The code of method exceeding the 65535 bytes limit...

I am loading data from Salesforce to SQL. I've done this successfully for a few objects, but I am hitting a problem with one object that has over 500 columns. I am simply dumping the data from source to target after some data type conversions. Initially I auto-mapped all the columns and the job failed. I then tried mapping only a few columns, but it still gives the same error; however, I have not modified the schema to include just the 10 columns I was using.
I am getting a compilation error: "The code of method is exceeding the 65535 bytes limit."
I've checked other similar posts, but I don't know what exactly I should do. I understand it is a Java limitation, but how can I work around it in Talend? Please let me know what I should do with this Talend job.
Thanks,
Jayesh.
1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

The only way to shorten the names is to add a tMap and shorten them in the source schema. Keep in mind that reading from a database does not depend on the column names being correct; only the position and the type matter.
But I guess that if you cannot solve the situation by removing the tMap, my suggestion will probably not solve the problem either.
What about reading the columns in batches? From the perspective of a business model, it can hardly be the case that one object has that huge a number of attributes. I am pretty sure you can model it as more than one business object.
You would need to read the dataset multiple times, because you will not get everything at once, and you should also think about more than one output table, because of the mentioned split of the object into multiple objects.
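One way to sketch the batched reading suggested above is to split the wide column list into several narrower SELECT statements, each repeating the key column so the partial result sets can be joined back together afterwards. This is an illustrative sketch, not Talend-generated code; the table, key, and column names are made up:

```java
import java.util.ArrayList;
import java.util.List;

public class ColumnBatches {
    // Split a wide column list into several narrower SELECT statements.
    // Each statement repeats the key column so the partial result sets
    // can be joined back into one row per record later.
    public static List<String> buildSelects(String table, String keyColumn,
                                            List<String> columns, int batchSize) {
        List<String> selects = new ArrayList<>();
        for (int i = 0; i < columns.size(); i += batchSize) {
            List<String> batch =
                columns.subList(i, Math.min(i + batchSize, columns.size()));
            selects.add("SELECT " + keyColumn + ", " + String.join(", ", batch)
                        + " FROM " + table);
        }
        return selects;
    }
}
```

Five passes of 100 columns each would then feed five staging tables (or one table per logical sub-object), keeping each generated method well below the bytecode ceiling.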


16 Replies
Anonymous
Not applicable
Author

I am not sure, but is it true that you only need a few columns from your input source?
In that case I would put a tMap first to cut off all the columns you do not need.
Otherwise I would suggest avoiding the type conversion in this first step. You could do it in a second step: read the first table (I would call it the staging table) and transform the data into a core table.
Anonymous
Not applicable
Author

Hi,
It is a known issue in Talend.
Please refer to the related JIRA bugs https://jira.talendforge.org/browse/TDI-22059, https://jira.talendforge.org/browse/TDI-4569 and https://jira.talendforge.org/browse/TDI-22933. We are working on the error "The code of method is exceeding the 65535 bytes limit." for jobs with a huge number of columns.
I suggest you open a JIRA issue in the DI project on the issue portal. Our component developers will check your issue and provide a workaround for your case.

Best regards
Sabrina
Anonymous
Not applicable
Author

Hi,
I am using TOS DI 5.5 and I am experiencing the same error for the tMSSqlInput component. The exact error message is: The code of method tMSSqlInput_1Process(Map&lt;String,Object&gt;) is exceeding the 65535 bytes limit.
This process was working fine prior to my last changes. The MS SQL component reads a SQL Server table which is a 1:1 copy of my target table on an AS400. This worked fine in general, but I had some truncations (causing me to lose some records) and therefore decided to connect a tLogRow component to the tAS400Output component to catch the rejected records. After that the job no longer compiled. I removed the tLogRow again, but it didn't change anything.
Prior to this, I had changed the schema on my AS400 component; I had found out that my character fields were not assigned to a DB type, and "Retrieve Schema" in the schema dialog solved that problem.
The two tables have 228 columns.
Any ideas how to overcome this problem?
Regards
Heiko
Anonymous
Not applicable
Author

It is a known issue in Talend.

It's more of a Java (JVM) constraint, I think.
regards
laurent
Anonymous
Not applicable
Author

To narrow down the problem, I created a new job using the MS SQL Server input for the table which causes the error and an output to the AS400.
It works fine.
So it must be something in the environment; any ideas?
The original job reads an Excel file prior to these steps, sets context variables and global map entries from that input, executes "contextLoad" and then should run the upload to the AS400.
/Heiko
Anonymous
Not applicable
Author

It's simply that the generated code for a single method cannot exceed 65535 bytes.
From the Java Virtual Machine Specification, section 4.10, "Limitations of the Java Virtual Machine":
* The amount of code per non-native, non-abstract method is limited to 65536 bytes ...

cf: http://docs.oracle.com/javase/specs/#88659
Have a look at the Talend-generated code for the job to find the method that exceeds the JVM limit.
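The limit is easy to reproduce outside Talend. The sketch below (illustrative, not Talend's code) generates a class whose single method holds tens of thousands of assignments, mimicking what Talend emits for a very wide schema, and compiles it with the JDK's compiler API; javac refuses it with "code too large". It assumes a JDK is installed (the system compiler is null on a bare JRE).

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.io.ByteArrayOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class MethodSizeDemo {
    // Returns true if a method containing `statements` simple assignments
    // fails to compile with javac's "code too large" error.
    public static boolean exceedsLimit(int statements) throws Exception {
        StringBuilder src = new StringBuilder("class Huge { void big() {\n");
        for (int i = 0; i < statements; i++) {
            src.append("int v").append(i).append(" = ").append(i).append(";\n");
        }
        src.append("} }\n");

        Path dir = Files.createTempDirectory("huge");
        Path file = dir.resolve("Huge.java");
        Files.write(file, src.toString().getBytes());

        // Requires a JDK: on a plain JRE the system compiler is null.
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        if (javac == null) {
            throw new IllegalStateException("A JDK is required to run this demo");
        }
        ByteArrayOutputStream err = new ByteArrayOutputStream();
        int rc = javac.run(null, null, err, file.toString());
        return rc != 0 && err.toString().contains("code too large");
    }
}
```

Around 20,000 such statements is already far past the 65535-byte ceiling, while a handful compiles fine; the fix is always to spread the same work over more, smaller methods, which in Talend terms means more, smaller subjobs.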
Hope it helps.
regards
laurent
Anonymous
Not applicable
Author



Hi,
I'm now having the same problem trying to export a 500-column table from SQL Server using tMSSqlInput. Is there a workaround? I do need the table exported with all its columns. The tickets you listed seem to be resolved, although they mention a different component. Is this the same issue that was discussed here, or something else? If so, was it supposed to be fixed for all components, or only the ones listed in the JIRA tickets?
The full error is as follows:
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
     The code of method tMSSqlInput_1Process(Map<String,Object>) is exceeding the 65535 bytes limit
     at .tMSSqlInput_1Process(.java:14697)
     at .runJobInTOS(.java:30898)
     at .main(.java:30755)
Appreciate your help
Alex
Anonymous
Not applicable
Author

Please note the problem is always a too-complex subjob. A subjob is the area with the same background rectangle; a job itself can contain many subjobs.
I have built a job which processes 1500 columns and it works well, so the number of columns alone is not the problem. You should take care not to do too much in a single subjob.
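The subjob split mirrors the usual plain-Java remedy: each subjob compiles to its own method, so breaking one overloaded subjob into several smaller ones spreads the generated code across several methods, each well under the 65535-byte ceiling. As a plain-Java sketch (the method names and the toy arithmetic are illustrative, not Talend's generated code):

```java
public class SplitSubjobs {
    // One oversized method per subjob is what hits the 65535-byte
    // ceiling; splitting the work into smaller methods (one per
    // subjob) keeps each compiled method small.
    public static int process(int[] row) {
        int converted = convertTypes(row);   // subjob 1: staging/conversion
        return writeOutput(converted);       // subjob 2: load
    }

    static int convertTypes(int[] row) {
        int sum = 0;
        for (int value : row) sum += value;  // stand-in for type conversion
        return sum;
    }

    static int writeOutput(int converted) {
        return converted * 2;                // stand-in for the output step
    }
}
```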
Could you provide a screenshot of your job?
Anonymous
Not applicable
Author

Unfortunately, I cannot; we are in a completely disconnected environment. But the job consists of only three things:
1. tMSSqlInput - with a SELECT column1, column2, ..., column500 FROM table
2. tMap - which simply maps the columns in the SELECT 1:1
3. tVerticaOutput - which is supposed to insert the data
All three share the same background and are connected by a Row > Main link.
Hope this makes sense.