Hi,
I am trying to set up a Hadoop cluster connection from the metadata in TOS for Big Data 6.3.0, but I am getting the exception below...
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for xxx-xxxx_xx_xxd@xxxx.xx.COM to xxxxxxxxxx-hdp.xxx.xxxxxxx.com/xx.xxx.xx.xx:8020; Host Details : local host is: "xxxxxxxxxxx/xx.xx.xxx.xxx"; destination host is: "xxxxxxxxx-hdp.xxx.xxxxxx.com":8020;
Talend: TOS for Big Data 6.3.0 (local machine) - free version for a POC
Hadoop cluster: Hortonworks 2.5
TOS for Big Data 6.3.0 is installed on my local machine, and I am trying to connect to the Hadoop cluster using Kerberos authentication. I have copied the keytab to my local machine and am using it to connect, but it is unable to establish a connection from the local machine to the NameNode.
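For reference, here is a minimal standalone sketch of what I understand the connection should be doing (a keytab login followed by an HDFS listing), which I can run outside the Studio to check whether the Kerberos setup itself works. The hostnames, principal, and paths are placeholders for the masked values above.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosHdfsCheck {
        public static void main(String[] args) throws Exception {
            // Point the JVM at a krb5.conf copied to a user-writable folder (placeholder path).
            System.setProperty("java.security.krb5.conf", "C:\\kerberos\\krb5.conf");

            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020"); // placeholder NameNode URI
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Placeholder principal and keytab path; the real values are masked above.
            UserGroupInformation.loginUserFromKeytab("myuser@EXAMPLE.COM", "C:\\kerberos\\myuser.keytab");

            // If this listing works, the keytab and krb5.conf are fine and the problem is
            // in the Studio configuration rather than in Kerberos itself.
            try (FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf)) {
                for (FileStatus status : fs.listStatus(new Path("/"))) {
                    System.out.println(status.getPath());
                }
            }
        }
    }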
Could you please help me get this resolved as soon as possible?
Thanks
pcrao
Hello,
Please refer to this online document: TalendHelpCenter: Connecting to the cluster using keytab.
Let us know if this article is helpful.
Best regards
Sabrina
Hi Sabrina,
Thanks for the response.
I was unable to fix the issue using the linked article.
My issue is that the connection from my local machine to the NameNode server is not being established, even though I am also providing the keytab entries.
I am assuming the issue is with the krb5.conf/krb5.ini file; I am not sure where to place this file so that Talend picks it up and establishes the connection from my local machine to the Hadoop cluster.
Since it is my office laptop, I don't have permission to copy the krb5 file to C:\Windows.
For SQL Developer, we copied the krb5 file into the JRE it ships with (the sqldeveloper\jdk\jre\lib\security path) and were able to access Hive from my local machine.
But for Talend, I am not sure where to place this krb5 file.
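From what I understand, the JVM looks for the Kerberos configuration first in the java.security.krb5.conf system property, then in <java-home>\lib\security\krb5.conf, and only then in the Windows default location (C:\Windows\krb5.ini), which would explain why the SQL Developer trick worked. So one idea I am considering is to keep krb5.conf in a folder I can write to and point the JVM at it explicitly, for example from a tJava component at the start of the Job (the path below is just a placeholder), or by adding the equivalent -Djava.security.krb5.conf=... argument to the Studio's .ini file:

    // Hypothetical tJava snippet placed at the very start of the Job, before any
    // Hadoop/Hive component opens a connection. The path is a placeholder for a
    // user-writable folder, since C:\Windows is not writable on this laptop.
    System.setProperty("java.security.krb5.conf", "C:\\kerberos\\krb5.conf");

    // Optional: verbose Kerberos logging, to see which config file and KDC are used.
    System.setProperty("sun.security.krb5.debug", "true");

Would that be a supported approach, or is there a recommended location for the file?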
Note:
I am using the free version of Talend Open Studio for Big Data 6.3.0 (installed on my local machine) and connecting to the Hadoop cluster using Kerberos security.
My use case: to ingest files and Oracle table data into Hive tables.
Please let me know if there are any other ways to accomplish this use case.
Thanks,
pcrao.