Anonymous
Not applicable

RAM allocation in batch while using tRunJob in Dynamic mode

Hi all,
I've designed a job that uses tRunJob with the 'Dynamic Job' option enabled. I loop many jobs one by one through this component, which in turn calls each job.
I export a batch script of the wrapper job (the one that calls the other jobs in a loop), give it 12 to 14 GB of RAM (-Xms12288M -Xmx14336M), and run it on the server.
The issue is that the 14 GB of RAM given is not applied to all the jobs called through a single tRunJob. So when I trigger the batch with a huge amount of data, it fails with an "Out of Memory" error even though memory is available. But when the failed job is exported as a separate batch, it runs with no error.
Can anyone tell me how to allocate the RAM given in the batch file to all the jobs in the tRunJob component?
4 Replies
Anonymous
Not applicable
Author

Hi,
Do you have the same problem when you execute the job in Talend Studio after allocating more memory to it by modifying the Java parameter -Xmx? If you have exported the job script, you can allocate more memory to the job by modifying the script, for example:
%~d0
cd %~dp0
java -Xms256M -Xmx2048M -cp classpath.jar; shong.test_0_1.test --context=Default %*

In addition, check whether the job can be optimized to avoid storing a huge amount of data in memory; refer to this KB article:
https://community.talend.com/t5/Migration-Configuration-and/OutOfMemory-Exception/ta-p/21669?content...
Shong
Anonymous
Not applicable
Author


Actually the issue is, I'm running a batch where jobs are called in a loop, one by one, using 'Dynamic Job' in tRunJob. If I give 12 to 14 GB of RAM in the batch script, only the first job gets that RAM; from the next job onwards, each job takes the RAM that was set in Talend at design time.
For example:
I tested the job with 1 to 1.5 GB on my local system. After exporting the batch, I give it 12 to 14 GB on the server. The first job takes the 12 to 14 GB of RAM, but each subsequent job in the loop takes only 1 to 1.5 GB.
So when I run the job with a huge amount of data (10 lakh, i.e. one million, records), it fails. Is there any parameter or setting by which I could give 12 to 14 GB of RAM to all the jobs that come in the loop?
Anonymous
Not applicable
Author

Hi
I understand your problem now. I will test it, discuss it with our developers, and get back to you ASAP!
Shong
Anonymous
Not applicable
Author

Hi,
When I looked into the generated code of the job, I found that it launches another process to run the child job when the 'Use dynamic job' option is selected. The main job and the child job run in different JVMs, which is why JVM parameters such as -Xmx set on the main job are not applied to the child job.
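A simplified sketch of this behaviour (illustrative Java only, not Talend's actual generated code; the jar and class names are hypothetical, following the earlier example in this thread): a child JVM started as a separate process only sees the heap settings placed on its own command line, so the parent's -Xmx never carries over.

```java
import java.util.ArrayList;
import java.util.List;

public class ChildJobLauncher {

    // Build the command line for the child JVM. The -Xmx added here is the
    // ONLY heap limit the child process will see; the parent JVM's own
    // -Xmx setting is irrelevant to it.
    public static List<String> buildCommand(String classpath, String mainClass, String xmx) {
        List<String> cmd = new ArrayList<>();
        cmd.add("java");
        cmd.add("-Xmx" + xmx);   // must be passed explicitly to the child
        cmd.add("-cp");
        cmd.add(classpath);
        cmd.add(mainClass);
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical child job, mirroring the exported-script example above.
        List<String> cmd = buildCommand("classpath.jar", "shong.test_0_1.test", "14336M");
        System.out.println(String.join(" ", cmd));
        // The generated code then runs something equivalent to:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

This is why raising -Xmx in the wrapper's batch file only affects the first (parent) JVM: each dynamically called child is launched with whatever heap arguments end up on its own command line.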
The workaround is to configure the JVM parameters in the preferences: go to Window --> Preferences --> Talend --> Run/Debug and set the job run VM arguments there. Then export the job script of the main job again.
Note that the settings on this preference page apply to all jobs.
Shong