YPMAL
Contributor III

HDFS connection error

I have installed the Cloudera VM and am using CDH 5.13.

I am able to connect to it using PuTTY.

In tHDFSInput I can view the HDFS file after clicking the Browse button. See screenshot view1.

But when I run the job I am getting an error:

[WARN ]: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in component tHDFSInput_1 (frstjob)
java.nio.channels.UnresolvedAddressException
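
For context, what tHDFSInput does here is roughly a plain HDFS client read, as in the minimal sketch below. The VM IP 192.168.56.101 and the file path are hypothetical placeholders, not values from this thread. An UnresolvedAddressException at this point usually means the client is trying to reach a datanode by a hostname the local machine cannot resolve, which is what the Use Datanode Hostname setting discussed below controls.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URI;

public class HdfsReadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode URI: on CDH the NameNode RPC port is 8020 by default.
        // 192.168.56.101 is a hypothetical VM IP used only for illustration.
        String nameNodeUri = "hdfs://192.168.56.101:8020";
        conf.set("fs.defaultFS", nameNodeUri);
        // Rough equivalent of the "Use Datanode Hostname" checkbox in tHDFSInput:
        // when true, the client connects to datanodes by hostname, which the host
        // machine must be able to resolve.
        conf.setBoolean("dfs.client.use.datanode.hostname", false);

        // Hypothetical file path, for illustration only.
        try (FileSystem fs = FileSystem.get(URI.create(nameNodeUri), conf);
             FSDataInputStream in = fs.open(new Path("/user/cloudera/input.txt"));
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}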

1 Solution

Accepted Solutions
manodwhb
Champion II

@yogeshmalekar, can you un-tick the Use Datanode Hostname option in the Basic settings of tHDFSInput and let me know?


6 Replies
manodwhb
Champion II

@yogeshmalekar, can you un-tick the Use Datanode Hostname option in the Basic settings of tHDFSInput and let me know?

YPMAL
Contributor III
Author

After un-ticking Use Datanode Hostname I am getting the following error:

 

[statistics] connecting to socket on port 3524
[statistics] connected
[WARN ]: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[WARN ]: org.apache.hadoop.hdfs.BlockReaderFactory - I/O error constructing remote block reader.
java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3553)
at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:840)
at org.apache.hadoop.hdfs.BlockReaderFa



To see the whole post, download it here
OriginalPost.pdf
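
As a side note on this trace: "Connection refused ... constructing remote block reader" generally means the client reached the NameNode and received block locations, but could not open the DataNode transfer port (50010 by default on CDH 5). A quick reachability check from the client machine is a plain socket test like the sketch below; 192.168.56.101 is a hypothetical placeholder for the VM IP.

import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        // Hypothetical VM IP; 8020 is the default NameNode RPC port and
        // 50010 the default DataNode transfer port on CDH 5.
        String host = "192.168.56.101";
        int[] ports = {8020, 50010};
        for (int port : ports) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 3000);
                System.out.println(host + ":" + port + " is reachable");
            } catch (Exception e) {
                System.out.println(host + ":" + port + " is NOT reachable: " + e.getMessage());
            }
        }
    }
}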
manodwhb
Champion II

@yogeshmalekar, revert that change and check the NameNode URI.

 

I believe you might have used hdfs://localhost:8020. Can you change it to the IP address and let me know whether that works?
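
For illustration, a NameNode URI with the VM's IP and RPC port can be verified outside Talend with a small listing program like the sketch below (192.168.56.101 is a hypothetical placeholder for the VM IP). Listing the root directory only talks to the NameNode, so it checks the URI and port without involving any datanode.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class NameNodeUriCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical VM IP; "localhost" only resolves to the cluster inside the VM itself.
        // 8020 is the default NameNode RPC port on CDH.
        String nameNodeUri = "hdfs://192.168.56.101:8020";
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", nameNodeUri);
        try (FileSystem fs = FileSystem.get(URI.create(nameNodeUri), conf)) {
            // A successful listing confirms the URI and port are correct.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}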

YPMAL
Contributor III
Author

I am using the IP address. I can connect with PuTTY using that IP address.

YPMAL
Contributor III
Author

Thanks. I was making two errors:

1) I was not specifying the port, which is 8020 by default.

2) Use Datanode Hostname was checked.

 

Thanks for your help.

manodwhb
Champion II

@yogeshmalekar, great, your issue is resolved. Kudos also accepted.