Anonymous
Not applicable

error when executing a job

Hi,
I'm using Talend Open Studio for Data Integration 6.0.0.20150702_1326 on Mac. Any idea why I'm getting this error when executing a job?
Exception in thread "main" java.lang.NoSuchFieldError: INSTANCE
at com.amazonaws.http.conn.SdkConnectionKeepAliveStrategy.getKeepAliveDuration(SdkConnectionKeepAliveStrategy.java:48)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:532)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:728)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:489)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:310)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3604)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3557)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:647)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:632)
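For context on this kind of failure: a `NoSuchFieldError: INSTANCE` raised from AWS SDK code usually means an older Apache HttpClient jar (pre-4.3, which lacks the `INSTANCE` singleton fields) is being loaded ahead of the version the SDK was built against. As a rough, generic sketch (not a Talend-specific fix), a small Java check can show which jar a class is actually loaded from; the HttpClient class name below is taken from the stack trace above, and the helper name `ClasspathCheck` is just illustrative:

```java
import java.security.CodeSource;

// Sketch: report where the JVM loads a given class from. If the HttpClient
// classes resolve to an unexpected or duplicate jar, that points to the
// version conflict behind "NoSuchFieldError: INSTANCE".
public class ClasspathCheck {

    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK bootstrap classes (e.g. java.lang.String) have no CodeSource.
            return src != null ? src.getLocation().toString() : "bootstrap/JDK";
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        String[] classes = {
            "org.apache.http.impl.client.DefaultRequestDirector", // from the stack trace
            "com.amazonaws.http.AmazonHttpClient",                // from the stack trace
            "java.lang.String"                                    // control: always present
        };
        for (String name : classes) {
            System.out.println(name + " -> " + locate(name));
        }
    }
}
```

Running this inside the failing job's classpath (for example from a scratch class launched with the same jars) shows whether two different httpclient jars are competing.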
20 Replies
Anonymous
Not applicable
Author

I'm having the same error. In the log only the line numbers change (compared to previous posts):
 Exception in thread "main" java.lang.NoSuchFieldError: INSTANCE
at com.amazonaws.http.conn.SdkConnectionKeepAliveStrategy.getKeepAliveDuration(SdkConnectionKeepAliveStrategy.java:48)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:532)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:860)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:631)
at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:400)
at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:362)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:311)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3673)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1436)

novastorm's solution didn't work for me, as I don't have User Routine dependencies.
I'm using TOS 6.2.1.
I also have JDK 1.7.0_75 and JRE 1.8.0_101 installed (not sure which one Talend uses).
Any ideas?
Thanks,
Anonymous
Not applicable
Author

Hi MARIKARITALEND,
It seems there is a version conflict on a jar file. Do you get this issue on all your jobs or only on a specific job?
The recommended Java environment for v6.2.1 is Oracle Java 8.
https://help.talend.com/search/all?query=Java&content-lang=en

Best regards
Sabrina
Anonymous
Not applicable
Author

Thanks, Sabrina.
I only tried this in a job that copies files to S3.
I just uninstalled JDK 1.7 and it's working now!
0683p000009MACJ.png
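For anyone else in the same mixed-JDK situation, a quick way to confirm which JVM actually executes a job is to print the standard Java system properties (for example from a scratch class run with the job's launcher). This is a generic Java sketch, not tied to any Talend API; the class name `JvmInfo` is made up for illustration:

```java
// Prints which JVM is executing the code. Handy when several Java
// versions (e.g. JDK 1.7 and JRE 1.8) are installed side by side and
// it is unclear which one Talend picks up.
public class JvmInfo {

    public static String describe() {
        return System.getProperty("java.version")
                + " (" + System.getProperty("java.vendor") + ") at "
                + System.getProperty("java.home");
    }

    public static void main(String[] args) {
        System.out.println("Running on: " + describe());
    }
}
```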
Anonymous
Not applicable
Author

Hmm... Now the job works and doesn't throw the error when I run it alone. But when I call it from a main job, I get the same exception again 😕
Anonymous
Not applicable
Author

Hi MARIKARITALEND,
Would you mind posting a screenshot of your whole job design to the forum?
Best regards
Sabrina
Anonymous
Not applicable
Author

Hello Sabrina.
Here are the screenshots, and the details:
1 - root job: this job calls the main job 4 times.
2 - main job: this job does several tasks and finally calls the job "localToS3", which does the copy task.
3 - subjob localToS3: this is the only job that deals with S3.
4 - error: the error I get when I run the root job.
Only the first of the 4 iterations runs, and the first group of files is indeed copied to S3.
The subjob (3) was originally part of the main job, and the result was the same error.
I moved it to a separate subjob just to be able to skip its execution and avoid the error (I have a parameter for this).
Today I tested calling the subjob (3) directly from the root job, and it worked. It ran 4 times and copied all the files.
The error only appears when I run the full job.
I hope this is clear.
Thanks for your help,
Anonymous
Not applicable
Author

For some reason the screenshots were not attached to the previous post.
0683p000009MBal.png 0683p000009MBaq.png 0683p000009MBPy.png 0683p000009MBav.png
Anonymous
Not applicable
Author

Any clue? Thank you!
Anonymous
Not applicable
Author

Hi,
Would you mind sending your job .zip file to us by email? It's a little hard for us to address your issue from the attached screenshots alone.
Best regards
Sabrina
Anonymous
Not applicable
Author

Hello MARIKARITALEND. I have the same problem with the S3 component. Do you have any idea? Thanks.