<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>[resolved] Exception in component tHDFSOutput_ in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/resolved-Exception-in-component-tHDFSOutput/m-p/2284489#M58167</link>
    <description>Hi 
&lt;BR /&gt;I am new to Hadoop and Talend. I am putting together a POC to see how Hadoop and Talend can serve our business needs. 
&lt;BR /&gt;We have set up a VM with "Hortonworks Sandbox 2.1" and downloaded Talend Open Studio for Big Data 5.6. My job has a "tHDFSConnection" and a "tFileInputJSON" (to read a JSON file from my local machine), a tLogRow to see the output, and a tHDFSOutput component to write the file. Without the tHDFSOutput component the job runs fine and I can see the JSON data, but I am not able to write this data into HDFS. It fails with the following error. 
&lt;BR /&gt;Here is the log. 
&lt;BR /&gt;Starting job jsonFileReader at 11:47 12/12/2014. 
&lt;BR /&gt; connecting to socket on port 3830 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt; 
&lt;FONT color="#ff3366"&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.&lt;/FONT&gt; 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:326) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.StringUtils.&amp;lt;clinit&amp;gt;(StringUtils.java:76) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:77) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:256) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:233) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:310) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:304) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:534) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:348) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:244) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:144) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:579) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:524) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:154) 
&lt;BR /&gt;&amp;nbsp;at java.security.AccessController.doPrivileged(Native Method) 
&lt;BR /&gt;&amp;nbsp;at javax.security.auth.Subject.doAs(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:154) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tFileInputJSON_1Process(jsonFileReader.java:760) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tHDFSConnection_1Process(jsonFileReader.java:396) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.runJobInTOS(jsonFileReader.java:1526) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.main(jsonFileReader.java:1383) 
&lt;BR /&gt; 
&lt;FONT color="#ff3366"&gt;Exception in component tHDFSOutput_1&lt;BR /&gt;java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type.; Host Details : local host is: "U90-CNADELLA/10.90.23.50"; destination host is: "10.90.22.112":8000;&lt;/FONT&gt; 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client.call(Client.java:1414) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client.call(Client.java:1363) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) 
&lt;BR /&gt;&amp;nbsp;at com.sun.proxy.$Proxy7.create(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at java.lang.reflect.Method.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103) 
&lt;BR /&gt;&amp;nbsp;at com.sun.proxy.$Proxy7.create(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:258) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1600) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1465) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:394) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:334) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tFileInputJSON_1Process(jsonFileReader.java:771) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tHDFSConnection_1Process(jsonFileReader.java:396) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.runJobInTOS(jsonFileReader.java:1526) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.main(jsonFileReader.java:1383) 
&lt;BR /&gt;Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type. 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.InvalidProtocolBufferException.invalidWireType(InvalidProtocolBufferException.java:99) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:498) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.GeneratedMessage.parseUnknownField(GeneratedMessage.java:193) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.&amp;lt;init&amp;gt;(RpcHeaderProtos.java:1404) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.&amp;lt;init&amp;gt;(RpcHeaderProtos.java:1362) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1492) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1487) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:2364) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1055) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client$Connection.run(Client.java:949) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt;Job jsonFileReader ended at 11:47 12/12/2014.</description>
    <pubDate>Sat, 16 Nov 2024 11:23:21 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2024-11-16T11:23:21Z</dc:date>
    <item>
      <title>[resolved] Exception in component tHDFSOutput_</title>
      <link>https://community.qlik.com/t5/Talend-Studio/resolved-Exception-in-component-tHDFSOutput/m-p/2284489#M58167</link>
      <description>Hi 
&lt;BR /&gt;I am new to Hadoop and Talend. I am putting together a POC to see how Hadoop and Talend can serve our business needs. 
&lt;BR /&gt;We have set up a VM with "Hortonworks Sandbox 2.1" and downloaded Talend Open Studio for Big Data 5.6. My job has a "tHDFSConnection" and a "tFileInputJSON" (to read a JSON file from my local machine), a tLogRow to see the output, and a tHDFSOutput component to write the file. Without the tHDFSOutput component the job runs fine and I can see the JSON data, but I am not able to write this data into HDFS. It fails with the following error. 
&lt;BR /&gt;Here is the log. 
&lt;BR /&gt;Starting job jsonFileReader at 11:47 12/12/2014. 
&lt;BR /&gt; connecting to socket on port 3830 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt; 
&lt;FONT color="#ff3366"&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.&lt;/FONT&gt; 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:326) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.StringUtils.&amp;lt;clinit&amp;gt;(StringUtils.java:76) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:77) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:256) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:233) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:310) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:304) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:534) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:348) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:244) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:144) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:579) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:524) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:154) 
&lt;BR /&gt;&amp;nbsp;at java.security.AccessController.doPrivileged(Native Method) 
&lt;BR /&gt;&amp;nbsp;at javax.security.auth.Subject.doAs(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:154) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tFileInputJSON_1Process(jsonFileReader.java:760) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tHDFSConnection_1Process(jsonFileReader.java:396) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.runJobInTOS(jsonFileReader.java:1526) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.main(jsonFileReader.java:1383) 
&lt;BR /&gt; 
&lt;FONT color="#ff3366"&gt;Exception in component tHDFSOutput_1&lt;BR /&gt;java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type.; Host Details : local host is: "U90-CNADELLA/10.90.23.50"; destination host is: "10.90.22.112":8000;&lt;/FONT&gt; 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client.call(Client.java:1414) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client.call(Client.java:1363) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) 
&lt;BR /&gt;&amp;nbsp;at com.sun.proxy.$Proxy7.create(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at java.lang.reflect.Method.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103) 
&lt;BR /&gt;&amp;nbsp;at com.sun.proxy.$Proxy7.create(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:258) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1600) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1465) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:394) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:390) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:334) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tFileInputJSON_1Process(jsonFileReader.java:771) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.tHDFSConnection_1Process(jsonFileReader.java:396) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.runJobInTOS(jsonFileReader.java:1526) 
&lt;BR /&gt;&amp;nbsp;at bigdatademo.jsonfilereader_5_4.jsonFileReader.main(jsonFileReader.java:1383) 
&lt;BR /&gt;Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type. 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.InvalidProtocolBufferException.invalidWireType(InvalidProtocolBufferException.java:99) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:498) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.GeneratedMessage.parseUnknownField(GeneratedMessage.java:193) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.&amp;lt;init&amp;gt;(RpcHeaderProtos.java:1404) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.&amp;lt;init&amp;gt;(RpcHeaderProtos.java:1362) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1492) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1487) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259) 
&lt;BR /&gt;&amp;nbsp;at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:2364) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1055) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.ipc.Client$Connection.run(Client.java:949) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt;Job jsonFileReader ended at 11:47 12/12/2014.</description>
      <pubDate>Sat, 16 Nov 2024 11:23:21 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/resolved-Exception-in-component-tHDFSOutput/m-p/2284489#M58167</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T11:23:21Z</dc:date>
    </item>
    <item>
      <title>Re: [resolved] Exception in component tHDFSOutput_</title>
      <link>https://community.qlik.com/t5/Talend-Studio/resolved-Exception-in-component-tHDFSOutput/m-p/2284490#M58168</link>
      <description>Two issues (well, one really, since the first is probably not an actual problem): 
&lt;BR /&gt;": org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path 
&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries." 
&lt;BR /&gt;This is an upstream issue in the Hadoop code, but I don't believe it will actually hurt anything here (it happens any time org.apache.hadoop.util.Shell is invoked on Windows, even when Hadoop is only used as a library). 
&lt;BR /&gt; 
&lt;A href="https://issues.apache.org/jira/browse/HADOOP-11003" target="_blank" rel="nofollow noopener noreferrer"&gt;https://issues.apache.org/jira/browse/HADOOP-11003&lt;/A&gt; 
&lt;BR /&gt;The second: 
&lt;BR /&gt;"Exception in component tHDFSOutput_1 
&lt;BR /&gt;java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type.; Host Details : local host is: "U90-CNADELLA/10.90.23.50"; destination host is: "10.90.22.112":8000;" 
&lt;BR /&gt;This is a version mismatch: the Hadoop client JARs used by the job don't match the cluster. You either need to upgrade the JARs used by the Talend HDFS components to match your Hadoop distribution, or select the correct Hadoop distribution for the current Talend components.</description>
      <pubDate>Tue, 16 Dec 2014 20:58:41 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/resolved-Exception-in-component-tHDFSOutput/m-p/2284490#M58168</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-12-16T20:58:41Z</dc:date>
    </item>
  </channel>
</rss>

