Anonymous
Not applicable

Error while putting a file into HDFS using Talend Big Data Open Source

Hi, I am a newbie to the Talend development environment. I have installed Talend Open Studio for Big Data on Windows 8 and am trying to connect to Cloudera CDH v5.4.3 (non-commercial) running in VMware Player. I followed the Talend getting started manual for Big Data HDFS connectivity and I am able to connect to the CDH environment (I tried both the NAT and host-only network adapter types). But when I try to put a file from my local machine into HDFS, I get the following error:

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /home/training/states_demo5mn.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

I know this is a very common error in the Hadoop ecosystem and there may be many reasons behind it, but all my efforts so far have been in vain. I have checked the following: 1) I am able to ping and telnet the NameNode a

5 Replies
Anonymous
Not applicable
Author

Additional log entries from the NameNode which may be meaningful:
2016-08-25 05:14:40,694 WARN org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy: Failed to place enough replicas, still in need of 1 to reach 1 (unavailableStorages=[], storagePolicy=BlockStoragePolicy{HOT:7, storageTypes=, creationFallbacks=[], replicationFallbacks=}, newBlock=true) For more information, please enable DEBUG log level on org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy
2016-08-25 05:14:40,695 WARN org.apache.hadoop.hdfs.protocol.BlockStoragePolicy: Failed to place enough replicas: expected size is 1 but only 0 storage types can be selected (replication=1, selected=[], unavailable=, removed=, policy=BlockStoragePolicy{HOT:7, storageTypes=, creationFallbacks=[], replicationFallbacks=})
2016-08-25 05:14:40,695 WARN org.apache.hadoop.hdfs.server.blockmanagement.BlockPlacementPolicy: Failed to place enough replicas, still in need of 1 to reach 1 (unavailableStorages=, storagePolicy=BlockStoragePolicy{HOT:7, storageTypes=, creationFallbacks=[], replicationFallbacks=}, newBlock=true) All required storage types are unavailable:  unavailableStorages=, storagePolicy=BlockStoragePolicy{HOT:7, storageTypes=, creationFallbacks=[], replicationFallbacks=}
2016-08-25 05:14:40,696 WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:training (auth:SIMPLE) cause:java.io.IOException: File /talend/states_demo5mn.txt could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
2016-08-25 05:14:40,697 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.addBlock from 192.168.206.1:64070 Call#8 Retry#0
java.io.IOException: File /talend/states_demo5mn.txt could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1541)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3289)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:668)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.addBlock(AuthorizationProviderProxyClientProtocol.java:212)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:483)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)
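
For context on the log above: the NameNode accepts the client call on port 8020, yet the single datanode ends up in the excluded list. In a VM setup like this, that frequently means the Windows client cannot reach the datanode's data-transfer port, or only sees an address that resolves inside the VM. The probe below is a hedged diagnostic sketch only; the host address and port numbers are assumptions based on CDH 5 defaults, not values taken from this thread.

import java.net.InetSocketAddress;
import java.net.Socket;

public class DataNodeProbe {
    public static void main(String[] args) {
        // Assumed VM address; replace with the address of your CDH VM.
        String host = "192.168.206.130";
        // NameNode IPC, datanode data transfer, datanode IPC (CDH 5 / Hadoop 2.x defaults).
        int[] ports = {8020, 50010, 50020};
        for (int port : ports) {
            try (Socket s = new Socket()) {
                s.connect(new InetSocketAddress(host, port), 3000);
                System.out.println(host + ":" + port + " reachable");
            } catch (Exception e) {
                System.out.println(host + ":" + port + " NOT reachable: " + e.getMessage());
            }
        }
    }
}

If the datanode ports are not reachable from Windows, or the datanode is advertised only by an internal address, passing dfs.client.use.datanode.hostname=true as an extra Hadoop property on the Talend HDFS connection is a commonly tried workaround; treat it as something to verify, not a confirmed fix.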
Anonymous
Not applicable
Author

I tried to write data into my Hadoop environment from Windows using a Java program and I am able to write to it. So I believe this issue is specific to Talend. I would appreciate any suggestions.
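
For comparison with the standalone test mentioned above, a minimal HDFS write from Windows using the plain Java client API might look like the sketch below. The NameNode URI, the target path, and the extra client property are assumptions for illustration, not details taken from this post.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode URI for a single-node CDH VM; adjust to your environment.
        conf.set("fs.defaultFS", "hdfs://192.168.206.130:8020");
        // Address datanodes by hostname rather than the internal IP they report;
        // often relevant when the client runs outside the VM's network.
        conf.set("dfs.client.use.datanode.hostname", "true");

        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/talend/write_test.txt"), true)) {
            out.writeBytes("hello from windows\n");
        }
        System.out.println("Write succeeded");
    }
}

If a test like this succeeds while the Talend job fails, comparing the Hadoop properties set on the job's HDFS connection with the ones used in the test is a reasonable next step.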
Anonymous
Not applicable
Author

I am facing a similar error message. Can someone share the solution? Thank you very much.

Anonymous
Not applicable
Author

I am facing a similar issue. Does anyone have any idea about it?

Anonymous
Not applicable
Author

Hello,

Could you please let us know if this online documentation helps?

TalendHelpCenter: The missing winutils.exe program in the Big Data Jobs

Best regards

Sabrina
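
Following up on the winutils.exe documentation linked above: in standalone Java tests on Windows, a common pattern is to point hadoop.home.dir at a folder whose bin directory contains winutils.exe before creating the HDFS client. The sketch below only illustrates that check; the folder path is an assumption, and the Talend-specific setting is the one described in the linked article.

import java.io.File;

public class WinutilsCheck {
    public static void main(String[] args) {
        // Assumed local folder; it must contain bin\winutils.exe matching your Hadoop version.
        String hadoopHome = "C:\\hadoop";
        System.setProperty("hadoop.home.dir", hadoopHome);

        File winutils = new File(hadoopHome, "bin\\winutils.exe");
        System.out.println(winutils.getAbsolutePath()
                + (winutils.exists() ? " found" : " MISSING"));
    }
}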