ffanali0804
Contributor

tBigQueryOutput Exception in thread "Thread-1" java.lang.OutOfMemoryError: Required array size too large

Hi all,

I'm developing an ETL pipeline with TOS 8. The pipeline reads from SQL Server and writes to Google BigQuery with tBigQueryOutput, but I receive this error:

Exception in thread "Thread-1" java.lang.OutOfMemoryError: Required array size too large

at java.base/java.nio.file.Files.readAllBytes(Files.java:3212)

at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBInput_1Process(Job_SkuItemTag.java:11010)

at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag.tDBConnection_1Process(Job_SkuItemTag.java:1005)

at lov_poc.job_skuitemtag_1_1.Job_SkuItemTag$2.run(Job_SkuItemTag.java:21841)
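For reference, the trace points at Files.readAllBytes, which returns the entire file as a single byte[]; Java arrays top out around 2^31-1 elements, so a ~5 GB file can never fit, regardless of heap size. A chunked copy avoids the single-array limit entirely; a minimal sketch (placeholder paths, not the generated job code):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedCopy {
    public static void main(String[] args) throws Exception {
        Path source = Path.of("out.csv");      // placeholder: the 5 GB CSV
        Path target = Path.of("out-copy.csv"); // placeholder target

        // Files.readAllBytes(source) must fit the whole file in one byte[],
        // which fails for anything over ~2 GB. Fixed-size chunks do not:
        byte[] buffer = new byte[8 * 1024 * 1024]; // 8 MB
        try (InputStream in = Files.newInputStream(source);
             OutputStream out = Files.newOutputStream(target)) {
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        }
    }
}
```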

Here are the flow details:

[screenshot: job flow]

The CSV (5 GB) is correctly written to the local file system, but it is not copied to Google Cloud Storage.
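For comparison, an upload that streams the local file to GCS in chunks, rather than materializing it in memory first, looks roughly like this with the google-cloud-storage client; bucket and object names are placeholders, and this is a sketch of the approach, not the component's internals:

```java
import com.google.cloud.WriteChannel;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class GcsChunkedUpload {
    public static void main(String[] args) throws IOException {
        // Placeholders: real bucket/object names and the local CSV path.
        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blob = BlobInfo.newBuilder("my-staging-bucket", "out.csv").build();

        ByteBuffer buffer = ByteBuffer.allocate(8 * 1024 * 1024); // 8 MB chunks
        try (FileChannel src = FileChannel.open(Path.of("out.csv"), StandardOpenOption.READ);
             WriteChannel dst = storage.writer(blob)) {
            while (src.read(buffer) != -1) {
                buffer.flip();
                while (buffer.hasRemaining()) {
                    dst.write(buffer); // chunked upload, no whole-file buffer
                }
                buffer.clear();
            }
        }
    }
}
```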

The same process executes correctly in TOS 7.3.

My JVM:

[screenshot: JVM settings]

Has anyone had the same problem?

Thank you,

Federico

17 Replies
gjeremy1617088143

Hi, do you use a 32-bit JRE or 64-bit?

 

ffanali0804
Contributor
Author

Hi @guenneguez jeremy,

32-bit

ffanali0804
Contributor
Author

I tried with a 64-bit JRE:

java -version

openjdk version "11.0.14.1" 2022-02-08 LTS

OpenJDK Runtime Environment Zulu11.54+25-CA (build 11.0.14.1+1-LTS)

OpenJDK 64-Bit Server VM Zulu11.54+25-CA (build 11.0.14.1+1-LTS, mixed mode)

 

But I received the same error.

 

Thanks,

Federico

gjeremy1617088143

I asked because a 32-bit JRE can throw this type of error.

ffanali0804
Contributor
Author

ok. 😞

Anonymous
Not applicable

Hello,

No target table was created in BigQuery and no load was started from the GCS staging bucket?

Have you tried adding a few JVM parameters? Based on what has been mentioned so far, we would suggest increasing the amount of memory that is set for the job itself.

Please try increasing the values further, and monitor the job while it is running to check how much memory it actually uses.
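For example, heap limits for the job are plain JVM flags; the values here are only illustrative (tune them to the machine and to what the monitoring shows):

```
-Xms2048M -Xmx8192M
```

In Studio these typically go under the Run tab > Advanced settings > "Use specific JVM arguments"; for an exported job they can be edited in the generated .bat/.sh launcher.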

Hope this KB article helps

https://community.talend.com/s/article/OutOfMemory-Exception-WmtmQ

Best regards

Sabrina

 

ffanali0804
Contributor
Author

Hello,

 

"No target table was created in BigQuery and no load was started from the GCS staging bucket?"

Exactly. Only the CSV on the local file system is created.

 

"Have you tried to add a few JVM parameters? Based on what has been mentioned so far, we would suggest to increase the amount of memory that is set for the job itself."

YES:

[screenshot: JVM parameters]

Also, I checked the Memory Run tab with these parameters, and no problem was detected.

 

I think the problem is due to the version of the libraries it uses. 

In fact, if I build the project with both TOS 7.3 and TOS 8 and compare the two, the latter uses newer libraries in the .bat file.

But I don't know how to solve this problem.

 

Thanks again,

Federico

Anonymous
Not applicable

Hello,

Increasing the heap doesn't help?

It looks like this happens due to the maximum array size limitation in Java.
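That limit is independent of the heap: Java arrays are indexed by a signed 32-bit int, so a single byte[] can never hold a ~5 GB file, no matter how large -Xmx is. A minimal illustration (a standalone demo class, not job code):

```java
public class ArrayLimitDemo {
    public static void main(String[] args) {
        // Throws OutOfMemoryError even with a huge -Xmx: array length is
        // a signed 32-bit int, and HotSpot's cap is slightly below
        // Integer.MAX_VALUE ("Requested array size exceeds VM limit").
        byte[] tooBig = new byte[Integer.MAX_VALUE];
        System.out.println(tooBig.length);
    }
}
```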

You said the same process executes correctly in TOS 7.3: which JDK are you using in Talend Studio v7.3.1? JDK 1.8, 64-bit? Are you running on a 64-bit OS as well? Do you get the same issue when using Oracle JDK 11?

Best regards

Sabrina

ffanali0804
Contributor
Author

I use the same configuration in TOS 7.3 and TOS 8, and I run them under the same OS.

The OS is 64-bit.

The installed JDK is:

[screenshot: JDK version]

Thanks again,

Federico