<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Null Pointer Exception in tHiveConnection using HDP 2.0 sandbox in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244674#M30751</link>
    <description>Hi, 
&lt;BR /&gt;I am working through Hortonworks Sandbox Examples in TOS v5.4.1. The HDFS examples worked perfectly. I then moved on to the HIVE examples, and am encountering an issue. Each time I run the job it fails in the tHiveConnection component with the following error: 
&lt;BR /&gt;Starting job Simple_hive_row_input at 17:03 12/02/2014. 
&lt;BR /&gt; connecting to socket on port 4021 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path 
&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:293) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf$ConfVars.findHadoopBinary(HiveConf.java:917) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf$ConfVars.&amp;lt;clinit&amp;gt;(HiveConf.java:238) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf.&amp;lt;clinit&amp;gt;(HiveConf.java:74) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:122) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:95) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tHiveConnection_1Process(Simple_hive_row_input.java:1458) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tFixedFlowInput_1Process(Simple_hive_row_input.java:1336) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tCreateTemporaryFile_1Process(Simple_hive_row_input.java:658) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.runJobInTOS(Simple_hive_row_input.java:3295) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.main(Simple_hive_row_input.java:3112) 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implemenation class org.apache.hadoop.hive.metastore.ObjectStore 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called 
&lt;BR /&gt;: DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored 
&lt;BR /&gt;: DataNucleus.Connection - BoneCP specified but not present in CLASSPATH (or one of dependencies) 
&lt;BR /&gt;: DataNucleus.Connection - BoneCP specified but not present in CLASSPATH (or one of dependencies) 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order" 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - Initialized ObjectStore 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;Exception in component tHiveConnection_1 
&lt;BR /&gt;java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:137) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:122) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:95) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tHiveConnection_1Process(Simple_hive_row_input.java:1458) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tFixedFlowInput_1Process(Simple_hive_row_input.java:1336) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tCreateTemporaryFile_1Process(Simple_hive_row_input.java:658) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.runJobInTOS(Simple_hive_row_input.java:3295) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.main(Simple_hive_row_input.java:3112) 
&lt;BR /&gt;Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:368) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:278) 
&lt;BR /&gt; ... 11 more 
&lt;BR /&gt;Caused by: java.lang.NullPointerException 
&lt;BR /&gt; at java.lang.ProcessBuilder.start(Unknown Source) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.runCommand(Shell.java:404) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.run(Shell.java:379) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.execCommand(Shell.java:678) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.execCommand(Shell.java:661) 
&lt;BR /&gt; at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:83) 
&lt;BR /&gt; at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52) 
&lt;BR /&gt; at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:50) 
&lt;BR /&gt; at org.apache.hadoop.security.Groups.getGroups(Groups.java:89) 
&lt;BR /&gt; at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1352) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:62) 
&lt;BR /&gt; at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) 
&lt;BR /&gt; at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365) 
&lt;BR /&gt; ... 12 more 
&lt;BR /&gt;Job Simple_hive_row_input ended at 17:05 12/02/2014. 
&lt;BR /&gt;My settings in the tHiveConnection component are as follows: 
&lt;BR /&gt;Distribution: Hortonworks 
&lt;BR /&gt;Hive Version: Hortonworks Data Platform V2.0.0(BigWheel) 
&lt;BR /&gt;Connection Mode: Embedded (only choice) 
&lt;BR /&gt;Hive Server: Hive1 
&lt;BR /&gt;Host: context.hive_host (context states this is sandbox) 
&lt;BR /&gt;Port: context.hive_port (context states this is 9083; have also tried 9933 and 10000) 
&lt;BR /&gt;Database: "" (have also tried "default") 
&lt;BR /&gt;Username: "" (have also tried "hue" and "hdp") 
&lt;BR /&gt;Password: context.mysql_passwd (context states this is "hdp") 
&lt;BR /&gt;Set Resource Manager: "sandbox:8032" (have also tried localhost:8032, which was the default) 
&lt;BR /&gt;Set Namenode URI: "hdfs://"+ context.namenode_host +":" + context.namenode_port 
&lt;BR /&gt;Any advice on why I am receiving this error, and if there are other values I need to be using in this component to get it to work, would be most appreciated. Thanks in advance for your help!</description>
    <pubDate>Sat, 16 Nov 2024 11:45:15 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2024-11-16T11:45:15Z</dc:date>
    <item>
      <title>Null Pointer Exception in tHiveConnection using HDP 2.0 sandbox</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244674#M30751</link>
      <description>Hi, 
&lt;BR /&gt;I am working through Hortonworks Sandbox Examples in TOS v5.4.1. The HDFS examples worked perfectly. I then moved on to the HIVE examples, and am encountering an issue. Each time I run the job it fails in the tHiveConnection component with the following error: 
&lt;BR /&gt;Starting job Simple_hive_row_input at 17:03 12/02/2014. 
&lt;BR /&gt; connecting to socket on port 4021 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path 
&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:293) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf$ConfVars.findHadoopBinary(HiveConf.java:917) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf$ConfVars.&amp;lt;clinit&amp;gt;(HiveConf.java:238) 
&lt;BR /&gt; at org.apache.hadoop.hive.conf.HiveConf.&amp;lt;clinit&amp;gt;(HiveConf.java:74) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:122) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:95) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tHiveConnection_1Process(Simple_hive_row_input.java:1458) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tFixedFlowInput_1Process(Simple_hive_row_input.java:1336) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tCreateTemporaryFile_1Process(Simple_hive_row_input.java:658) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.runJobInTOS(Simple_hive_row_input.java:3295) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.main(Simple_hive_row_input.java:3112) 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implemenation class org.apache.hadoop.hive.metastore.ObjectStore 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called 
&lt;BR /&gt;: DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored 
&lt;BR /&gt;: DataNucleus.Connection - BoneCP specified but not present in CLASSPATH (or one of dependencies) 
&lt;BR /&gt;: DataNucleus.Connection - BoneCP specified but not present in CLASSPATH (or one of dependencies) 
&lt;BR /&gt;: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order" 
&lt;BR /&gt;: org.apache.hadoop.hive.metastore.ObjectStore - Initialized ObjectStore 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;Exception in component tHiveConnection_1 
&lt;BR /&gt;java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:137) 
&lt;BR /&gt; at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&amp;lt;init&amp;gt;(HiveServer.java:122) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:95) 
&lt;BR /&gt; at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at java.sql.DriverManager.getConnection(Unknown Source) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tHiveConnection_1Process(Simple_hive_row_input.java:1458) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tFixedFlowInput_1Process(Simple_hive_row_input.java:1336) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.tCreateTemporaryFile_1Process(Simple_hive_row_input.java:658) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.runJobInTOS(Simple_hive_row_input.java:3295) 
&lt;BR /&gt; at bigdatademos.simple_hive_row_input_0_1.Simple_hive_row_input.main(Simple_hive_row_input.java:3112) 
&lt;BR /&gt;Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:368) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:278) 
&lt;BR /&gt; ... 11 more 
&lt;BR /&gt;Caused by: java.lang.NullPointerException 
&lt;BR /&gt; at java.lang.ProcessBuilder.start(Unknown Source) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.runCommand(Shell.java:404) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.run(Shell.java:379) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.execCommand(Shell.java:678) 
&lt;BR /&gt; at org.apache.hadoop.util.Shell.execCommand(Shell.java:661) 
&lt;BR /&gt; at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:83) 
&lt;BR /&gt; at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52) 
&lt;BR /&gt; at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:50) 
&lt;BR /&gt; at org.apache.hadoop.security.Groups.getGroups(Groups.java:89) 
&lt;BR /&gt; at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1352) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:62) 
&lt;BR /&gt; at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) 
&lt;BR /&gt; at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) 
&lt;BR /&gt; at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365) 
&lt;BR /&gt; ... 12 more 
&lt;BR /&gt;Job Simple_hive_row_input ended at 17:05 12/02/2014. 
&lt;BR /&gt;My settings in the tHiveConnection component are as follows: 
&lt;BR /&gt;Distribution: Hortonworks 
&lt;BR /&gt;Hive Version: Hortonworks Data Platform V2.0.0(BigWheel) 
&lt;BR /&gt;Connection Mode: Embedded (only choice) 
&lt;BR /&gt;Hive Server: Hive1 
&lt;BR /&gt;Host: context.hive_host (context states this is sandbox) 
&lt;BR /&gt;Port: context.hive_port (context states this is 9083; have also tried 9933 and 10000) 
&lt;BR /&gt;Database: "" (have also tried "default") 
&lt;BR /&gt;Username: "" (have also tried "hue" and "hdp") 
&lt;BR /&gt;Password: context.mysql_passwd (context states this is "hdp") 
&lt;BR /&gt;Set Resource Manager: "sandbox:8032" (have also tried localhost:8032, which was the default) 
&lt;BR /&gt;Set Namenode URI: "hdfs://"+ context.namenode_host +":" + context.namenode_port 
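&lt;BR /&gt;For reference, the first stack trace above (the winutils.exe IOException) appears to come from Hadoop's Shell class concatenating an unset hadoop.home.dir into the path, which is why the log literally says "null\bin\winutils.exe". A minimal sketch of that path construction and the usual workaround follows; the C:\hadoop location is a placeholder assumption, and winutils.exe would need to actually exist under its bin folder.

```java
// Sketch of the path Hadoop builds for winutils.exe on Windows, and the
// system property that controls it. In Talend Studio the property can be
// passed as a JVM argument (Run view, Advanced settings):
//   -Dhadoop.home.dir=C:\hadoop        <- placeholder path, adjust to yours
public class WinutilsWorkaround {

    // Mirrors the concatenation Hadoop performs: home + "\bin\winutils.exe".
    // When hadoop.home.dir is unset, home is null, and Java string
    // concatenation turns that into the "null\bin\winutils.exe" seen in the log.
    public static String expectedWinutilsPath(String home) {
        return home + "\\bin\\winutils.exe";
    }

    public static void main(String[] args) {
        // Must be set before any Hadoop class is loaded.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(
            expectedWinutilsPath(System.getProperty("hadoop.home.dir")));
    }
}
```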
&lt;BR /&gt;Any advice on why I am receiving this error, and if there are other values I need to be using in this component to get it to work, would be most appreciated. Thanks in advance for your help!</description>
      <pubDate>Sat, 16 Nov 2024 11:45:15 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244674#M30751</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T11:45:15Z</dc:date>
    </item>
    <item>
      <title>Re: Null Pointer Exception in tHiveConnection using HDP 2.0 sandbox</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244675#M30752</link>
      <description>Hi,
&lt;BR /&gt;This time, you are facing a known issue of HDP 2.0.
&lt;BR /&gt;Here is the YARN JIRA: 
&lt;A href="https://issues.apache.org/jira/browse/YARN-1298" rel="nofollow noopener noreferrer"&gt;https://issues.apache.org/jira/browse/YARN-1298&lt;/A&gt;
&lt;BR /&gt;It is currently not possible to submit jobs from Windows when your cluster is installed on Linux, and the reverse is also true.
&lt;BR /&gt;You would need to execute your Talend job from a Linux machine.
&lt;BR /&gt;Additionally, the resource manager port for HDP 2.0 is not 8032 but 8050.
&lt;BR /&gt;Regards,</description>
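&lt;BR /&gt;An easy way to check which of the two ports the ResourceManager is actually listening on is a plain TCP probe from the machine running Talend; a minimal sketch (the host name "sandbox" and the timeout value are assumptions to adjust for your setup):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal TCP reachability probe: returns true only if a connection to
// host:port succeeds within timeoutMs, false on refusal or timeout.
public class PortProbe {
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "sandbox" and these ports follow the thread above; adjust as needed.
        System.out.println("8032 reachable: " + isReachable("sandbox", 8032, 2000));
        System.out.println("8050 reachable: " + isReachable("sandbox", 8050, 2000));
    }
}
```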
      <pubDate>Thu, 13 Feb 2014 09:22:20 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244675#M30752</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-02-13T09:22:20Z</dc:date>
    </item>
    <item>
      <title>Re: Null Pointer Exception in tHiveConnection using HDP 2.0 sandbox</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244676#M30753</link>
      <description>Hi Remy, 
&lt;BR /&gt;Thanks for the response. I couldn't quite tell whether the YARN JIRA describes exactly the same issue, as it doesn't mention a null pointer exception, which is what I received. However, if you are correct that this is the cause, would it then affect all YARN-enabled Hadoop distributions? That is, can I not connect to Hive and submit jobs from Talend running on Windows to a Linux cluster based on another distribution such as Cloudera? In other words, is this a YARN issue across all Hadoop distributions rather than a Hortonworks problem? 
&lt;BR /&gt;I am trying to identify what options I have, as the use case I am evaluating would very likely rely on Windows-based clients connecting and submitting jobs to a Linux-based cluster, and I'd like to determine whether this is currently possible with Talend regardless of the specific Hadoop distribution. 
&lt;BR /&gt;Thanks!</description>
      <pubDate>Thu, 13 Feb 2014 17:18:29 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Null-Pointer-Exception-in-tHiveConnection-using-HDP-2-0-sandbox/m-p/2244676#M30753</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2014-02-13T17:18:29Z</dc:date>
    </item>
  </channel>
</rss>

