Hi,
I am creating a Job that just reads from an Oracle source (500,000 rows) and writes directly to a CSV file. It gives me a timeout error after writing 50,000 lines. When I increased Xms to 512m and Xmx to 512g, it gives the error message below after reading all the lines, but it took around 2 hours to read them all. We have 48 GB of RAM on the machine and no other users on it. When I execute the query directly against the database, it works fine and returns the output in a few seconds.
Any help will be appreciated.
Exception in thread "main" java.lang.OutOfMemoryError
at java.lang.AbstractStringBuilder.hugeCapacity(Unknown Source)
at java.lang.AbstractStringBuilder.newCapacity(Unknown Source)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(Unknown Source)
at java.lang.AbstractStringBuilder.append(Unknown Source)
at java.lang.StringBuilder.append(Unknown Source)
at local_project.a_basic_job_0_1.A_Basic_Job$1Util_tLogRow_2.format(A_Basic_Job.java:14805)
at local_project.a_basic_job_0_1.A_Basic_Job.tOracleInput_2Process(A_Basic_Job.java:22210)
at local_project.a_basic_job_0_1.A_Basic_Job.runJobInTOS(A_Basic_Job.java:22525)
at local_project.a_basic_job_0_1.A_Basic_Job.main(A_Basic_Job.java:22374)
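For context, the trace fails inside the generated Util_tLogRow_2.format method: every row is being appended to a single StringBuilder before anything is printed, which is consistent with a tLogRow component sitting between the Oracle input and the CSV output and buffering the whole result. With 500,000 rows, that one buffer can exhaust the heap no matter how large Xmx is. Below is a minimal plain-Java sketch (not the Talend-generated code) of the streaming alternative, where each row goes straight to the file and heap use stays flat; the file name and row contents are placeholders.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class StreamingCsvSketch {
    public static void main(String[] args) throws IOException {
        // The stack trace above corresponds to the buffering pattern: every
        // formatted row is appended to one StringBuilder before anything is
        // written, so memory grows with the total row count.
        // The streaming pattern below writes each row as soon as it is
        // produced, so heap use stays flat regardless of how many rows there are.
        try (BufferedWriter out = Files.newBufferedWriter(Paths.get("out.csv"))) {
            for (int i = 0; i < 500_000; i++) {   // stand-in for the Oracle rows
                out.write(i + ";value_" + i);     // one row at a time
                out.newLine();
            }
        }
    }
}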
You can increase the heap size from the Run -> Advanced settings tab.
Refer: https://community.talend.com/t5/Migration-Configuration-and/OutOfMemory-Exception/ta-p/21669
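One thing worth checking is whether the setting actually reached the JVM; also note that Xmx should normally stay within physical RAM, so on a 48 GB machine a value such as 8192M is a more realistic starting point than 512g. A quick way to verify is to print the heap the process was started with, using the standard Runtime.maxMemory() call; the class below is just an illustration and could equally be dropped into a tJava component.

public class HeapCheck {
    public static void main(String[] args) {
        // Report the maximum heap the JVM was actually started with, to confirm
        // that the -Xmx value set on the Run > Advanced settings tab was applied.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}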
If you look at my earlier message, I already made those changes, but it still throws the error after reading all the data.
More importantly, it is taking a very long time to read the data.
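On the slow read: with the Oracle JDBC driver the default fetch size is only 10 rows per network round trip, which makes large extracts crawl; increasing it is usually what turns hours into minutes. It is worth looking for the cursor/fetch-size option in the advanced settings of tOracleInput in your version. In plain JDBC terms the equivalent looks like the sketch below, where the connection details, fetch size, and query are placeholder values and the Oracle driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and query.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/SERVICE", "user", "password");
             Statement st = con.createStatement()) {
            // Pull 1000 rows per network round trip instead of the Oracle
            // driver's default of 10, which is what makes large extracts slow.
            st.setFetchSize(1000);
            try (ResultSet rs = st.executeQuery("SELECT col1, col2 FROM big_table")) {
                while (rs.next()) {
                    // Write each row straight to the CSV output here instead of
                    // collecting rows in memory.
                }
            }
        }
    }
}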