The argument you want to adjust for the heap is -Xmx; try increasing it to 1024m. A MaxPermSize of 128m should be fine.
Use trial and error to find a value that lets your map run.
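Combined, the relevant flags would look something like this (the values are illustrative starting points, not guaranteed to fit your machine; note that -XX:MaxPermSize only applies to Java 7 and earlier, since Java 8 replaced the permanent generation with Metaspace and ignores the flag):

```
-Xmx1024m -XX:MaxPermSize=128m
```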
It looks like there's a custom CSV reader class that may be reading the entire input file into memory. That's fine for a moderately sized file (100-200k), but not if the file is large. Can the custom CSV class be replaced with a tFileInputDelimited? That way the input is processed line by line, and overall memory usage never needs to exceed what a single row requires.
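If replacing the component isn't an option, the same principle can be applied inside the custom reader. Here is a minimal sketch of the difference, assuming the custom class currently slurps the whole file (the file name and the naive comma split are hypothetical; a real CSV needs quoted-field handling):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class StreamingCsvExample {
    public static void main(String[] args) throws IOException {
        // Problematic pattern: the whole file ends up on the heap at once.
        // List<String> allLines = Files.readAllLines(Paths.get("input.csv"));

        // Streaming pattern: only one line is held in memory at a time,
        // so heap usage stays roughly constant regardless of file size.
        try (BufferedReader reader = Files.newBufferedReader(Paths.get("input.csv"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",", -1); // naive split, no quote handling
                // process one row, then let it become garbage-collectable
            }
        }
    }
}
```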
The -Xmx argument in the .ini file controls the memory usage of the Studio itself, which matters no further than building a job; it makes no difference to the actual running of the job. The memory allocated to a running job is controlled by default through Window > Preferences > Talend > Run/Debug, or for specific jobs under the JVM arguments on the left side of the Run tab.
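For reference, those Studio settings live at the end of the .ini next to the executable (the exact filename depends on your Talend version and platform, e.g. something like TOS_DI-win-x86_64.ini), in standard Eclipse launcher format; the values below are illustrative only:

```
-vmargs
-Xms512m
-Xmx1536m
-XX:MaxPermSize=256m
```

Again, raising these only helps the Studio UI; set the job's heap in the Run/Debug preferences or the Run tab instead.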
The job fails because a StringBuilder object is trying to expand its backing array to hold a very big record...
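To illustrate why one oversized record can blow the heap: when a StringBuilder runs out of capacity it allocates a new backing array (roughly double the old size) and copies into it, so at the moment of expansion both arrays are live. A minimal sketch of that failure mode (the append payload and heap size are made up for the demo):

```java
public class StringBuilderGrowth {
    public static void main(String[] args) {
        // Run with a small heap to see it quickly, e.g.:
        //   java -Xmx64m StringBuilderGrowth
        StringBuilder sb = new StringBuilder();
        try {
            while (true) {
                // Appending keeps forcing capacity doubling; eventually the
                // new, larger backing array cannot be allocated and the JVM
                // throws java.lang.OutOfMemoryError: Java heap space.
                sb.append("xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
            }
        } catch (OutOfMemoryError e) {
            int len = sb.length();
            sb = null; // release the big array before allocating anything else
            System.out.println("OutOfMemoryError after reaching length " + len);
        }
    }
}
```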
tFileInputDelimited with the CSV options enabled uses the third-party com.csvreader.CsvReader under the hood, so it's possible you are already using it: it appears in the stack trace...
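For what it's worth, that library (javacsv) reads record by record rather than loading the whole file, roughly like this (a sketch assuming a hypothetical input.csv with a header row), which is why a single huge row, not overall file size, is what can still exhaust the heap:

```java
import com.csvreader.CsvReader;
import java.io.IOException;

public class CsvReaderExample {
    public static void main(String[] args) throws IOException {
        CsvReader reader = new CsvReader("input.csv"); // hypothetical path
        try {
            reader.readHeaders();
            // readRecord() pulls one record at a time from the stream.
            while (reader.readRecord()) {
                String firstColumn = reader.get(0);
                // process one record, then move on
            }
        } finally {
            reader.close();
        }
    }
}
```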
You should post the job and some example data if you want more specific optimization advice...