Anonymous

Cannot connect to Hortonworks services

Hello, I am experiencing issues connecting Talend on Windows to our Hortonworks Hadoop cluster. The problem occurs when I click the "Check Services" button while setting up a new Hadoop cluster connection. For the NameNode URI, we get the following exception:

org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
	at org.talend.designer.hdfsbrowse.hadoop.service.check.AbstractCheckedServiceProvider.checkService(AbstractCheckedServiceProvider.java:57)
	at org.talend.designer.hdfsbrowse.hadoop.service.HadoopServiceBean.check(HadoopServiceBean.java:102)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckHadoopServicesDialog$5.run(CheckHadoopServicesDialog.java:373)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.lang.Thread.run(Unknown Source)
Caused by: org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckedWorkUnit.execute(CheckedWorkUnit.java:47)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.AbstractCheckedServiceProvider.checkService(AbstractCheckedServiceProvider.java:54)
	... 5 more
Caused by: java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
	at java.util.concurrent.FutureTask.report(Unknown Source)
	at java.util.concurrent.FutureTask.get(Unknown Source)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckedWorkUnit.execute(CheckedWorkUnit.java:44)
	... 6 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at org.talend.core.utils.ReflectionUtils.invokeMethod(ReflectionUtils.java:166)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.provider.CheckedNamenodeProvider.check(CheckedNamenodeProvider.java:75)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.AbstractCheckedServiceProvider$1.run(AbstractCheckedServiceProvider.java:49)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckedWorkUnit$1.call(CheckedWorkUnit.java:65)
	at java.util.concurrent.FutureTask.run(Unknown Source)
	... 3 more
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: An existing connection was forcibly closed by the remote host; Host Details : local host is: "AndyH-D/10.13.3.7"; destination host is: "hadoopctrl.dev.local":8020; 
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:782)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1556)
	at org.apache.hadoop.ipc.Client.call(Client.java:1496)
	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy181.getListing(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:618)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.lang.reflect.Method.invoke(Unknown Source)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
	at com.sun.proxy.$Proxy185.getListing(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2136)
	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2119)
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:900)
	at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:113)
	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:966)
	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:962)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:962)
	... 12 more
Caused by: java.io.IOException: An existing connection was forcibly closed by the remote host
	at sun.nio.ch.SocketDispatcher.read0(Native Method)
	at sun.nio.ch.SocketDispatcher.read(Unknown Source)
	at sun.nio.ch.IOUtil.readIntoNativeBuffer(Unknown Source)
	at sun.nio.ch.IOUtil.read(Unknown Source)
	at sun.nio.ch.SocketChannelImpl.read(Unknown Source)
	at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at java.io.FilterInputStream.read(Unknown Source)
	at java.io.FilterInputStream.read(Unknown Source)
	at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:554)
	at java.io.BufferedInputStream.fill(Unknown Source)
	at java.io.BufferedInputStream.read(Unknown Source)
	at java.io.DataInputStream.readInt(Unknown Source)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1117)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1012)

The Resource Manager check, meanwhile, simply times out:

org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: java.util.concurrent.TimeoutException
	at org.talend.designer.hdfsbrowse.hadoop.service.check.AbstractCheckedServiceProvider.checkService(AbstractCheckedServiceProvider.java:57)
	at org.talend.designer.hdfsbrowse.hadoop.service.HadoopServiceBean.check(HadoopServiceBean.java:102)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckHadoopServicesDialog$5.run(CheckHadoopServicesDialog.java:373)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.lang.Thread.run(Unknown Source)
Caused by: org.talend.designer.hdfsbrowse.exceptions.HadoopServerException: java.util.concurrent.TimeoutException
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckedWorkUnit.execute(CheckedWorkUnit.java:47)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.AbstractCheckedServiceProvider.checkService(AbstractCheckedServiceProvider.java:54)
	... 5 more
Caused by: java.util.concurrent.TimeoutException
	at java.util.concurrent.FutureTask.get(Unknown Source)
	at org.talend.designer.hdfsbrowse.hadoop.service.check.CheckedWorkUnit.execute(CheckedWorkUnit.java:44)
	... 6 more

I have set up the connection both manually and by retrieving the connection attributes through Ambari, and I am using the "hdfs" user name. I can provide whatever further information is needed. Thanks!
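In case it helps narrow things down, a raw TCP check from the client (plain JDK, no Hadoop libraries needed) can separate firewall problems from protocol-level failures. A minimal sketch follows; the hostname is taken from the stack trace above, and port 8050 for the Resource Manager is only an assumed HDP default, so substitute your actual values:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        String host = "hadoopctrl.dev.local"; // taken from the stack trace above
        int[] ports = {8020, 8050}; // 8020 = NameNode RPC; 8050 = assumed HDP ResourceManager port
        for (int port : ports) {
            try (Socket socket = new Socket()) {
                // A 5-second connect timeout distinguishes "refused/reset" from "timed out"
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println(host + ":" + port + " is reachable");
            } catch (IOException e) {
                System.out.println(host + ":" + port + " failed: " + e);
            }
        }
    }
}

If 8020 connects here but the Talend check still dies with a reset, the problem is more likely at the Hadoop RPC or configuration level than in the network path.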

6 Replies
Anonymous

Hello,

Could you please post a screenshot of your cluster connection settings on the forum? That will help us address your issue. Which Hortonworks Hadoop cluster version are you using?

Can you connect to your Hadoop cluster and read a sample HDFS file through a client, without using Talend?
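For example, a minimal standalone test might look like the sketch below, assuming the hadoop-client libraries are on the classpath and reusing the NameNode URI and "hdfs" user from your post. It issues the same directory-listing call that fails in your trace:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsListCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode URI from the stack trace; replace with your cluster's value
        URI nameNode = URI.create("hdfs://hadoopctrl.dev.local:8020");
        // Connect as the "hdfs" user, as in the original post
        try (FileSystem fs = FileSystem.get(nameNode, conf, "hdfs")) {
            // Exercises the same getListing RPC that fails in the "Check services" trace
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}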

Best regards

Sabrina

Anonymous

Here are the connection screenshots. I'm using HDP version 2.6.0.3-8. I first retrieve the connection properties through Ambari, though I have also manually verified that they are correct according to the Talend setup document. I can connect to the cluster from my client machine via SSH (and read HDFS), and I can also reach all of the ports in the connection settings. I've also added those addresses to my hosts file.

First from import wizard to get the connection settings:

[screenshot: Hadoop cluster import wizard]

Then the connection settings:

[screenshot: Hadoop cluster connection settings]

Thanks!

Anonymous

Hello,

Are you using Talend 6.4? Could you please take a look at the online documentation: TalendHelpCenter: Supported Hadoop distribution versions?

Best regards

Sabrina

Anonymous

Hi, yes, we are using Talend 6.4. From the link you sent, it seems that HDP 2.6 is not yet supported by Talend. I am also trying to connect from Windows Server 2008 R2, which is not on your list of supported Windows versions. Does this necessarily mean Talend will not work between these two systems?

Anonymous

Hello,

The documentation lists the platforms that are supported in the sense that we provide an SLA and technical support for them. This doesn't mean that other (non-listed) platforms will not work, but simply that we won't necessarily be equipped to help you with any issues you may face on them.

Best regards

Sabrina

Anonymous

Uncheck the "Use custom Hadoop configuration" checkbox. This is actually a bug that the Support team has reproduced as well.