sravanth49
Contributor III

[resolved] Unable to connect to Hadoop

Hi Team,
I am new to Big Data. I have Hadoop installed in VMware (Ubuntu), and Talend for Big Data is installed on Windows 7.
While trying to create a file in Hadoop, I am getting an error.
I have set the NameNode URI to "hdfs://100.100.100.100:50070/"
and I am getting the exception below:
Exception in component tHDFSOutput_1
java.io.IOException: Call to /100.100.100.100:50070 failed on local exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
: org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:ubuntu cause:java.io.IOException: Call to /100.100.100.101:50070 failed on local exception: java.io.EOFException
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:117)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:115)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.tFixedFlowInput_1Process(HDFS_WRITER.java:500)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.runJobInTOS(HDFS_WRITER.java:861)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.main(HDFS_WRITER.java:708)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(Unknown Source)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:807)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:745)
Please give me a solution for this issue.
1 Solution

Accepted Solutions
sravanth49
Contributor III
Author

Hi Team,
I found the issue. I was trying to connect to a Hadoop 2.x cluster, whereas this version of Talend only supports up to Hadoop 1.0.0.

View solution in original post

6 Replies
Anonymous
Not applicable

Hello -- you should configure the NameNode connection to use port 8020, which is the default port for Hadoop RPC calls.
Port 50070 is used for the web user interface. You should be able to browse the NameNode web UI at http://mynamenode:50070, but other components must access it using hdfs://mynamenode:8020.
I hope this helps, Ryan
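
If it helps, below is a minimal standalone sketch (the class name HdfsConnectTest and the host are just placeholders) that does roughly what the generated tHDFSOutput code does: open a FileSystem against the NameNode RPC port and list the root directory. Running it outside of Talend is a quick way to tell whether the problem is the connection itself or the job design. Note that on Hadoop 2.x the property is fs.defaultFS rather than fs.default.name.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectTest {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode RPC endpoint - replace with your own host and port
        String nameNodeUri = "hdfs://100.100.100.100:8020/";

        Configuration conf = new Configuration();
        conf.set("fs.default.name", nameNodeUri); // use fs.defaultFS on Hadoop 2.x

        // Same FileSystem.get call that appears in the stack trace above
        FileSystem fs = FileSystem.get(URI.create(nameNodeUri), conf);

        // Listing the root directory is a simple end-to-end sanity check
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}

If this small program fails with the same EOFException, the issue is in the connection (wrong port, or client libraries that do not match the cluster) rather than in the Talend job.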
sravanth49
Contributor III
Author

Hi Team,
I have changed the port to 9000 (hdfs://localhost:9000), but I am still getting the error below:
: org.apache.hadoop.ipc.Client - Retrying connect to server: 100.100.100.101/100.100.100.101:9000. Already tried 9 time(s).
Exception in component tHDFSOutput_1
java.net.ConnectException: Call to 100.100.100.101/100.100.100.101:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
: org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:ubuntu cause:java.net.ConnectException: Call to 100.100.100.101/100.100.100.101:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information
disconnected
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:117)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:115)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.tFixedFlowInput_1Process(HDFS_WRITER.java:500)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.runJobInTOS(HDFS_WRITER.java:861)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.main(HDFS_WRITER.java:708)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:656)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
at org.apache.hadoop.ipc.Client.call(Client.java:1046)
... 21 more
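
As a side note, "Connection refused" behaves differently from the earlier EOFException: refused means nothing answered on that host/port at all from the Windows machine (for example the NameNode is not listening on 9000, is bound only to localhost inside the VM, or a firewall blocks it), while EOFException means the port answered but spoke an unexpected protocol. A minimal sketch to check plain TCP reachability, with the host and port as placeholders:

import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodePortCheck {
    public static void main(String[] args) throws Exception {
        String host = "100.100.100.101"; // placeholder NameNode host
        int port = 9000;                 // placeholder NameNode RPC port

        try (Socket socket = new Socket()) {
            // Throws ConnectException ("Connection refused") if nothing is listening
            // on host:port as seen from this machine; succeeding here but still getting
            // an EOFException from the HDFS client points to a version/protocol mismatch.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("TCP connection to " + host + ":" + port + " succeeded");
        }
    }
}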
Anonymous
Not applicable

Hello,
Do you know where this IP address, 100.100.100.101, could come from?
sravanth49
Contributor III
Author

Hi,
My IP address is different; I replaced my actual IP address with 100.100.100.101 in the post above.
I don't want to share my IP address.
sravanth49
Contributor III
Author

Hi Team,
I have changed the port to 9000 (hdfs://localhost:9000), but I am still getting the error below:
: org.apache.hadoop.ipc.Client - Retrying connect to server: <IP address>/<IP address>:9000. Already tried 9 time(s).
Exception in component tHDFSOutput_1
java.net.ConnectException: Call to <IP address>/<IP address>:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
: org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:ubuntu cause:java.net.ConnectException: Call to <IP address>/<IP address>:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information
disconnected
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:117)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:115)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:115)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.tFixedFlowInput_1Process(HDFS_WRITER.java:500)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.runJobInTOS(HDFS_WRITER.java:861)
at demo_big_data.hdfs_writer_0_1.HDFS_WRITER.main(HDFS_WRITER.java:708)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:656)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
at org.apache.hadoop.ipc.Client.call(Client.java:1046)
... 21 more
sravanth49
Contributor III
Author

Hi Team,
I found the issue. I was trying to connect to a Hadoop 2.x cluster, whereas this version of Talend only supports up to Hadoop 1.0.0.
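
For anyone who hits the same error: the EOFException on the RPC call is a typical symptom of Hadoop client jars that are not wire-compatible with the cluster (here, Hadoop 1.x libraries against a Hadoop 2.x NameNode). A quick way to see which Hadoop version the generated job actually uses is to print it from the client libraries on the classpath (the class name below is just an example):

import org.apache.hadoop.util.VersionInfo;

public class ClientHadoopVersion {
    public static void main(String[] args) {
        // Prints the version of the Hadoop client jars on the classpath;
        // this must be wire-compatible with the version the NameNode runs.
        System.out.println("Hadoop client version: " + VersionInfo.getVersion());
    }
}

Compare this with the output of "hadoop version" on the cluster; if the major versions differ, either the cluster or the Hadoop version selected for the Talend job needs to change.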