Hi all,
I'm developing an ETL pipeline with TOS 8. The pipeline reads from SQL Server and writes to Google BigQuery with tBigQueryOutput, but I receive this error:
Exception in thread "Thread-1" java.lang.OutOfMemoryError: Required array size too large
at java.base/java.nio.file.Files.readAllBytes(Files.java:3212)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBInput_1Process(Job_SkuItemTag.java:11010)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBConnection_1Process(Job_SkuItemTag.java:1005)
at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag$2.run(Job_SkuItemTag.java:21841)
Here are the flow details:
The CSV is correctly written to the local file system (5 GB), but it is not copied to Google Cloud Storage.
The same process executes correctly with TOS 7.3.
My JVM:
Has anyone had the same problem?
Thank you,
Federico
Hi, are you using a 32-bit JRE or a 64-bit one?
Hi @guenneguez jeremy
32-bit.
I tried with a 64-bit JRE:
java -version
openjdk version "11.0.14.1" 2022-02-08 LTS
OpenJDK Runtime Environment Zulu11.54+25-CA (build 11.0.14.1+1-LTS)
OpenJDK 64-Bit Server VM Zulu11.54+25-CA (build 11.0.14.1+1-LTS, mixed mode)
But I received the same error.
Thanks,
Federico
I asked because a 32-bit JRE can throw this type of error.
ok. 😞
Hello,
No target table was created in BigQuery and no load was started from the GCS staging bucket?
Have you tried adding a few JVM parameters? Based on what has been mentioned so far, we would suggest increasing the amount of memory allocated to the job itself.
Please try increasing the value further, and monitor the job while it is running to see how much memory it actually uses.
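For reference, heap settings can be added in the Run tab's Advanced settings as JVM arguments, or passed directly in the generated launch script. The values and the classpath placeholder below are illustrative — size them to your machine:

```shell
# Illustrative heap settings for the generated job launcher
# (-Xms = initial heap, -Xmx = maximum heap)
java -Xms1024M -Xmx8192M -cp <classpath> lov_poc.job_skuitemtag_1_1.Job_SkuItemTag
```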
Hope this KB article helps
https://community.talend.com/s/article/OutOfMemory-Exception-WmtmQ
Best regards
Sabrina
Hello,
"No target table was created in BigQuery and no load was started from the GCS staging bucket?"
Exactly. Only the CSV on the local filesystem is created.
"Have you tried to add a few JVM parameters? Based on what has been mentioned so far, we would suggest to increase the amount of memory that is set for the job itself."
YES:
I also checked memory usage in the Run tab with these parameters, and no problem was detected.
I think the problem is due to the versions of the libraries it uses.
In fact, if I build the project with both TOS 7.3 and TOS 8 and compare them, the TOS 8 .bat file references newer libraries.
But I don't know how to solve this problem.
Thanks again,
Federico
Hello,
Increasing the heap doesn't help?
It looks like it happens due to the maximum array size limitation in Java.
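That limit is why more heap makes no difference here: the `Files.readAllBytes` frame in your stack trace tries to read the whole file into a single `byte[]`, and Java arrays are capped near `Integer.MAX_VALUE` elements (~2 GB), so a 5 GB staging file can never fit regardless of `-Xmx`. A component that copies through a bounded buffer never needs the whole file in memory. A minimal sketch of that idea (class and method names are mine, not Talend's internals):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamCopy {
    // Fixed-size chunk: memory use stays at BUFFER_SIZE no matter how
    // large the file is, unlike Files.readAllBytes, which needs one
    // byte[] as big as the entire file (capped near 2 GB in Java).
    static final int BUFFER_SIZE = 8 * 1024 * 1024; // 8 MB

    // Copies source to target chunk by chunk; returns bytes copied.
    public static long copy(Path source, Path target) throws IOException {
        long total = 0;
        try (InputStream in = Files.newInputStream(source);
             OutputStream out = Files.newOutputStream(target)) {
            byte[] buffer = new byte[BUFFER_SIZE];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
                total += read;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("staging", ".csv");
        Path dst = Files.createTempFile("copy", ".csv");
        Files.writeString(src, "sku,tag\n1,a\n2,b\n");
        System.out.println("copied " + copy(src, dst) + " bytes");
    }
}
```

This is only to illustrate why the error is about array size rather than heap size; the actual fix has to happen inside the TOS 8 component (or its libraries).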
You mentioned the same process executes correctly with TOS 7.3 — which JDK are you using with Talend Studio 7.3.1? JDK 1.8, 64-bit? Are you running on a 64-bit OS as well? Did you get the same issue with Oracle JDK 11?
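In the meantime, one possible workaround is to keep each staged file below the array cap by splitting the CSV on line boundaries before the upload, so nothing downstream ever has to read more than ~2 GB at once. A rough sketch (the helper is illustrative, not a Talend API; the byte estimate uses `line.length()`, which is exact only for single-byte encodings such as ASCII):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CsvSplitter {

    // Splits a CSV into parts of roughly maxBytes each, keeping whole
    // lines together so every part remains a valid CSV fragment.
    public static List<Path> splitByLines(Path source, long maxBytes) throws IOException {
        List<Path> parts = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(source)) {
            String line = reader.readLine();
            int partNo = 0;
            while (line != null) {
                Path part = source.resolveSibling(source.getFileName() + ".part" + partNo++);
                parts.add(part);
                long written = 0;
                try (BufferedWriter writer = Files.newBufferedWriter(part)) {
                    // "written == 0" guarantees progress even when a
                    // single line alone exceeds maxBytes.
                    while (line != null && (written == 0 || written + line.length() + 1 <= maxBytes)) {
                        writer.write(line);
                        writer.write('\n');
                        written += line.length() + 1;
                        line = reader.readLine();
                    }
                }
            }
        }
        return parts;
    }
}
```

BigQuery load jobs can take multiple source files from a GCS bucket, so the parts could be loaded together after upload.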
Best regards
Sabrina
I use the same configuration in TOS 7.3 and TOS 8 and run them under the same OS.
The OS is 64-bit.
JDK installed is:
Thanks again,
Federico