Hi all,
I am developing an ETL pipeline with TOS 8. The pipeline reads from SQL Server and writes to Google BigQuery with tBigQueryOutput, but I receive this error:
Exception in thread "Thread-1" java.lang.OutOfMemoryError: Required array size too large
at java.base/java.nio.file.Files.readAllBytes(Files.java:3212)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBInput_1Process(Job_SkuItemTag.java:11010)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBConnection_1Process(Job_SkuItemTag.java:1005)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag$2.run(Job_SkuItemTag.java:21841)
Here are the flow details:
The CSV (5 GB) is correctly written to the local file system, but it is not copied to Google Cloud Storage.
The same process executes correctly with TOS 7.3.
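For context on the stack trace: java.nio.file.Files.readAllBytes loads the entire file into a single byte[], and a Java array cannot hold more than about 2 GB, so a 5 GB CSV can never be read that way regardless of heap size. Copying the file to Cloud Storage as a stream avoids that limit; here is a minimal, hypothetical sketch using the google-cloud-storage Java client outside of Talend (the bucket name, object name, and local path are placeholders):

import com.google.cloud.WriteChannel;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class StreamCsvToGcs {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blob = BlobInfo.newBuilder("my-bucket", "export/skuitemtag.csv").build();
        // Copy the local CSV in 1 MB chunks instead of loading the whole file into memory
        try (FileChannel in = FileChannel.open(Paths.get("C:/temp/skuitemtag.csv"), StandardOpenOption.READ);
             WriteChannel out = storage.writer(blob)) {
            ByteBuffer buffer = ByteBuffer.allocate(1024 * 1024);
            while (in.read(buffer) != -1) {
                buffer.flip();
                while (buffer.hasRemaining()) {
                    out.write(buffer);
                }
                buffer.clear();
            }
        }
    }
}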
My JVM:
Has anyone had the same problem?
Thank you,
Federico
In the project properties, under Build --> Java version, which one is selected?
TOS 7.3:
TOS 8:
I cannot change it in either version. In the drop-down menu there is only 1.8.
And in the .ini file of your Talend, have you specified a specific Java version?
Because from what I can see, it seems you are not using JDK 11 in TOS 8.
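One quick way to confirm which JVM the generated job actually runs on is to drop a tJava component into the job and print the runtime properties (a minimal sketch):

// In a tJava component: show which JVM the generated job is really using
System.out.println("java.version = " + System.getProperty("java.version"));
System.out.println("java.home    = " + System.getProperty("java.home"));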
Are you talking about these files?
At the top of your .ini file you can put this:
-vm
C:\Program Files\Java\jdk11\bin\ (replace with your path to the JDK 11 bin)
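In Eclipse-based studios the -vm entry must come before -vmargs, with the path on its own line; pointing it at javaw.exe instead of the bin directory is also common. A sketch of how the top of the .ini might look (the path and heap size are illustrative):

-vm
C:\Program Files\Zulu\zulu-11\bin\javaw.exe
-vmargs
-Xmx4096m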
I have tried this, and it does not work either.
I have also tried this path: C:\Program Files\Zulu\zulu-11\bin
Hello,
Could you please open an issue on the Talend Bugtracker, and our developers from R&D will check your problem to see if it is an issue with the libraries.
Best regards
Sabrina
done: https://jira.talendforge.org/browse/TFD-14038
Thanks again
Federico