Anonymous
Not applicable

Problem with tHdfsConnection

Hello everyone,
I'm trying to use Talend Big Data but can't get it to work properly.
I get the following error when launching the job shown in the attachment (job_launched.JPG):
 connecting to socket on port 3673
connected
DAL_Extrateur_Segment_Prime_Individuelle_v2 - Test du début d'un job
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because UNIX Domain sockets are not available on Windows.
: gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2 - tHDFSPut_1 Call From GM64XXX/XXX.XXX.XX.X to myhbaseserv:50070 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Exception in component tHDFSPut_1
java.net.ConnectException: Call From GM64XXX/XXX.XXX.XX.X to myhbaseserv99:50070 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
 at java.lang.reflect.Constructor.newInstance(Unknown Source)
 at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
 at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
 at org.apache.hadoop.ipc.Client.call(Client.java:1431)
 at org.apache.hadoop.ipc.Client.call(Client.java:1358)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
 at com.sun.proxy.$Proxy8.mkdirs(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
 at java.lang.reflect.Method.invoke(Unknown Source)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
 at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008)
 at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978)
 at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)
 at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)
 at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043)
 at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
 at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tFileList_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:671)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tOracleInput_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3180)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tRunJob_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3456)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.runJobInTOS(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3705)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.main(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3539)
Caused by: java.net.ConnectException: Connection refused: no further information
 at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
 at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
 at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:612)
 at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:710)
 at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
 at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
 at org.apache.hadoop.ipc.Client.call(Client.java:1397)
 ... 24 more
disconnected
Job DAL_Extrateur_Segment_Prime_Individuelle_v2 ended at 08:48 01/06/2016.

In my tHDFSConnection, I have:
Distribution: Hortonworks 2.3.0
NameNode URI: hdfs://myhbaseserv:50070 (which is the port from the dfs.namenode.http-address property in my hdfs-site.xml)
User name: hdfs (which is the value of the dfs.cluster.administrators property in my hdfs-site.xml)
Is that OK?
I see in a lot of examples that port 8020 is used for the NameNode URI instead.
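To separate a "wrong port" problem from a firewall or DNS problem, the two candidate ports can be probed directly from the client machine. This is a minimal sketch, assuming the host name from the job above ("myhbaseserv") and the usual Hortonworks defaults: 8020 for the NameNode RPC service (the port an hdfs:// URI normally needs) and 50070 for the NameNode web UI:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Small TCP probe: tells you whether anything is listening on a given
// host:port from this machine, without involving Hadoop at all.
public class PortProbe {
    public static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;   // something accepted the TCP connection
        } catch (Exception e) {
            return false;  // refused, timed out, or host not resolvable
        }
    }

    public static void main(String[] args) {
        // Host and ports are assumptions taken from the post; adjust as needed.
        System.out.println("rpc  8020:  " + canConnect("myhbaseserv", 8020, 3000));
        System.out.println("http 50070: " + canConnect("myhbaseserv", 50070, 3000));
    }
}
```

If 8020 connects but 50070 is what the job is configured with, that would point at the URI using the HTTP port instead of the RPC port; if neither connects, the problem is network/firewall/DNS rather than the Talend configuration.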
Thanks a lot for your help !

0683p000009MEFu.jpg
0683p000009MEFg.jpg
0683p000009MEAv.jpg
1 Reply
Anonymous
Not applicable
Author

Neither 8020 nor any other port can be reached by my component.
If you have any advice... Thanks!