
Anonymous
2014-11-04
04:32 PM
Pig UDF: java.lang.NoSuchMethodError when writing custom UDFs
Hi TalendForge,
I'm using Talend 5.4.
I can successfully use the built-in UPPER Pig UDF in a tMap expression (UPPER(row1.newColumn)).
However, if I create my own Pig UDF from Talend's Eval template, one that simply converts a string to uppercase, like:
package pigudf;

import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class TestUDF extends EvalFunc<String> {

    // Returns the first field of the tuple upper-cased; null/empty input yields null.
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0) {
            return null;
        }
        try {
            String str = (String) input.get(0);
            return str.toUpperCase();
        } catch (Exception e) {
            throw new IOException("Caught exception processing input row", e);
        }
    }
}
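For reference, here is a minimal standalone sketch (the TestUDFCheck class name is just illustrative) that calls the UDF's exec() directly, assuming the Pig client jar is on the classpath; it only confirms that the UDF logic itself works outside of any Talend job:

import org.apache.pig.data.Tuple;
import org.apache.pig.data.TupleFactory;

public class TestUDFCheck {
    public static void main(String[] args) throws Exception {
        // Build a one-field tuple and run it through the UDF outside of any job.
        Tuple input = TupleFactory.getInstance().newTuple(1);
        input.set(0, "hello");
        System.out.println(new pigudf.TestUDF().exec(input)); // prints HELLO
    }
}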
If I then change the tMap expression to pigudf.TestUDF(row1.newColumn), I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.jobcontrol.JobControl.addJob(Lorg/apache/hadoop/mapred/jobcontrol/Job;)Ljava/lang/String;
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:261)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:180)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1270)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1255)
at org.apache.pig.PigServer.execute(PigServer.java:1245)
at org.apache.pig.PigServer.executeBatch(PigServer.java:362)
at primehome.pigudf2_0_1.PigUDF2.tPigLoad_1Process(PigUDF2.java:757)
at primehome.pigudf2_0_1.PigUDF2.runJobInTOS(PigUDF2.java:1071)
at primehome.pigudf2_0_1.PigUDF2.main(PigUDF2.java:936)
disconnected
Job PigUDF2 ended at 13:29 04/11/2014.
In fact, if I change the UDF back to UPPER, which originally worked, the job still fails with the same error. I have to create a new job to get rid of it.
Regards,
Matthew

Anonymous
2014-11-09
07:48 AM
Hello,
Can you please create a JIRA issue in our issue tracker? https://jira.talendforge.org
You can create it in the TBD project.
Thanks in advance,
Regards,
Rémy.

Anonymous
2014-11-18
04:32 AM
Hi,
It seems this problem comes from a classloading issue. If you right-click on the UDF you created in the repository, you can see the UDF's dependencies. By default, Talend uses pig-0.10.jar to compile the UDF, but the cluster you are working on uses a more recent version of Pig. You should update the dependency to match the Pig version on your cluster.
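As a quick sanity check (an illustrative sketch only; the ClasspathCheck class name is made up, and the class and method names come from the stack trace above), you can print which jar supplies JobControl at run time and which addJob signature it actually exposes:

import java.lang.reflect.Method;

public class ClasspathCheck {
    public static void main(String[] args) throws Exception {
        Class<?> jc = Class.forName("org.apache.hadoop.mapred.jobcontrol.JobControl");
        // The jar this class was actually loaded from.
        System.out.println(jc.getProtectionDomain().getCodeSource().getLocation());
        for (Method m : jc.getMethods()) {
            if (m.getName().equals("addJob")) {
                // Compare the return type here with the one in the NoSuchMethodError.
                System.out.println(m);
            }
        }
    }
}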
HTH,
Rémy.
