<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>HDFS Copy fails in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227653#M19192</link>
    <description>Forum thread: copying a file from the local Linux file system to HDFS fails in a Talend Studio job; resolved by pointing the job at the NameNode RPC port 8020.</description>
    <pubDate>Fri, 10 Jan 2014 18:08:05 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2014-01-10T18:08:05Z</dc:date>
    <item>
      <title>HDFS Copy fails</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227653#M19192</link>
      <description>Hi,
&lt;BR /&gt;I am trying to copy a file from the local Linux file system to HDFS. I built the job and ran the generated shell script on Linux, and got the error below. I'd appreciate your help.
&lt;BR /&gt;---------------------------------------------------------------------------
&lt;BR /&gt;-bash-4.1$ ./HDFS_CopyFromLocal_run.sh
&lt;BR /&gt;14/01/10 13:00:38 ERROR security.UserGroupInformation: PriviledgedActionException as:user_name cause:java.io.IOException: Call to Hadoop_lab/11.19.58.154:50070 failed on local exception: java.io.EOFException
&lt;BR /&gt;Exception in component tHDFSCopy_1
&lt;BR /&gt;java.io.IOException: Call to Hadoop_lab/11.19.58.154:50070 failed on local exception: java.io.EOFException
&lt;BR /&gt; at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
&lt;BR /&gt; at org.apache.hadoop.ipc.Client.call(Client.java:1071)
&lt;BR /&gt; at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
&lt;BR /&gt; at sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
&lt;BR /&gt; at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
&lt;BR /&gt; at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
&lt;BR /&gt; at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
&lt;BR /&gt; at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:238)
&lt;BR /&gt; at org.apache.hadoop.hdfs.DFSClient.&amp;lt;init&amp;gt;(DFSClient.java:203)
&lt;BR /&gt; at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:117)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:115)
&lt;BR /&gt; at java.security.AccessController.doPrivileged(Native Method)
&lt;BR /&gt; at javax.security.auth.Subject.doAs(Subject.java:416)
&lt;BR /&gt; at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.tHDFSCopy_1Process(HDFS_CopyFromLocal.java:300)
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.runJobInTOS(HDFS_CopyFromLocal.java:518)
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.main(HDFS_CopyFromLocal.java:407)
&lt;BR /&gt;Caused by: java.io.EOFException
&lt;BR /&gt; at java.io.DataInputStream.readInt(DataInputStream.java:392)
&lt;BR /&gt; at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:807)
&lt;BR /&gt; at org.apache.hadoop.ipc.Client$Connection.run(Client.java:745)</description>
      <pubDate>Fri, 10 Jan 2014 18:08:05 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227653#M19192</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-01-10T18:08:05Z</dc:date>
    </item>
    <item>
      <title>Re: HDFS Copy fails</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227654#M19193</link>
      <description>Hi, I got past the above error, and now I get the one below.
&lt;BR /&gt;log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration.deprecation). 
&lt;BR /&gt;log4j:WARN Please initialize the log4j system properly. 
&lt;BR /&gt;log4j:WARN See 
&lt;A href="http://logging.apache.org/log4j/1.2/faq.html#noconfig" rel="nofollow noopener noreferrer"&gt;http://logging.apache.org/log4j/1.2/faq.html#noconfig&lt;/A&gt; for more info. 
&lt;BR /&gt;Exception in component tHDFSCopy_1 
&lt;BR /&gt;java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "Hadoop_lab/11.19.58.154"; destination host is: "Hadoop_lab":50070; 
&lt;BR /&gt; at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764) 
&lt;BR /&gt; at org.apache.hadoop.ipc.Client.call(Client.java:1351) 
&lt;BR /&gt; at org.apache.hadoop.ipc.Client.call(Client.java:1300) 
&lt;BR /&gt; at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) 
&lt;BR /&gt; at sun.proxy.$Proxy9.getFileInfo(Unknown Source) 
&lt;BR /&gt; at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
&lt;BR /&gt; at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
&lt;BR /&gt; at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
&lt;BR /&gt; at java.lang.reflect.Method.invoke(Method.java:616) 
&lt;BR /&gt; at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186) 
&lt;BR /&gt; at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
&lt;BR /&gt; at sun.proxy.$Proxy9.getFileInfo(Unknown Source) 
&lt;BR /&gt; at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651) 
&lt;BR /&gt; at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679) 
&lt;BR /&gt; at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106) 
&lt;BR /&gt; at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102) 
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
&lt;BR /&gt; at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102) 
&lt;BR /&gt; at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397) 
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.tHDFSCopy_1Process(HDFS_CopyFromLocal.java:315) 
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.runJobInTOS(HDFS_CopyFromLocal.java:518) 
&lt;BR /&gt; at big_data_demo.hdfs_copyfromlocal_0_1.HDFS_CopyFromLocal.main(HDFS_CopyFromLocal.java:407) 
&lt;BR /&gt;Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag. 
&lt;BR /&gt; at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:94) 
&lt;BR /&gt; at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124) 
&lt;BR /&gt; at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:202) 
&lt;BR /&gt; at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241) 
&lt;BR /&gt; at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253) 
&lt;BR /&gt; at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259) 
&lt;BR /&gt; at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49) 
&lt;BR /&gt; at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:2364) 
&lt;BR /&gt; at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:996) 
&lt;BR /&gt; at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)</description>
      <pubDate>Fri, 10 Jan 2014 19:20:23 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227654#M19193</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-01-10T19:20:23Z</dc:date>
    </item>
    <item>
      <title>Re: HDFS Copy fails</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227655#M19194</link>
      <description>I got this resolved once I put in the correct port number, which is 8020.</description>
      <pubDate>Fri, 10 Jan 2014 20:28:49 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFS-Copy-fails/m-p/2227655#M19194</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-01-10T20:28:49Z</dc:date>
    </item>
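    <!-- Editorial note for readers hitting the same errors: both failures above come from pointing the job at port 50070, which is the NameNode's web UI (HTTP) port. The RPC client then reads a non-RPC reply, which surfaces as the java.io.EOFException in the first post and the protobuf "end-group tag did not match" error in the second. The poster's fix, using the NameNode RPC port 8020, corresponds to a NameNode URI like the sketch below. The hostname is taken from the post; 8020 is only the common default, so check your cluster's core-site.xml for the actual value. -->

```xml
<!-- core-site.xml (sketch): the NameNode URI must use the RPC port, not the web UI port -->
<configuration>
  <property>
    <!-- Hadoop 2.x key; Hadoop 1.x used fs.default.name -->
    <name>fs.defaultFS</name>
    <value>hdfs://Hadoop_lab:8020</value>
  </property>
</configuration>
```

    <!-- In Talend Studio, the same value goes into the NameNode URI field of tHDFSConnection / tHDFSCopy. -->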
  </channel>
</rss>