<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Big Data Spark Job - Load data into Hive in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278194#M53740</link>
    <description>&lt;P&gt;Which version are you using? With&amp;nbsp;6.2.1, none of these tHive components are available; only tHiveConfiguration, tHiveInput, and tHiveOutput are there.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Data loads fine using the Spark job, but I am facing a problem when the Hive table is dynamically partitioned.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks.&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 16 May 2017 06:43:26 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2017-05-16T06:43:26Z</dc:date>
    <item>
      <title>Big Data Spark Job - Load data into Hive</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278192#M53738</link>
      <description>&lt;P&gt;I&amp;nbsp;am creating a Big Data Spark job and want to load data into &lt;STRONG&gt;dynamically partitioned&lt;/STRONG&gt; Hive tables.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Which component can I use to load data into Hive, and what would the workflow be?&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 09:47:07 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278192#M53738</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T09:47:07Z</dc:date>
    </item>
    <item>
      <title>Re: Big Data Spark Job - Load data into Hive</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278193#M53739</link>
      <description>&lt;P&gt;tHDFSConnection --&amp;gt; HIVE --&amp;gt; tHiveRow --&amp;gt; tFileInputDelimited --&amp;gt; tHDFSOutput&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Hive load data" style="width: 999px;"&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009LuUI.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/147522i60E1CFFECBE5D1FA/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009LuUI.png" alt="0683p000009LuUI.png" /&gt;&lt;/span&gt;&lt;SPAN class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;Hive load data&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;tHiveRow&amp;nbsp;configuration&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;&lt;SPAN class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="tHiveRow1" style="width: 999px;"&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009LuAm.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/139163i1A7BABDA80ACA453/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009LuAm.png" alt="0683p000009LuAm.png" /&gt;&lt;/span&gt;&lt;SPAN class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;tHiveRow1&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;tHiveRow2&lt;/P&gt; 
&lt;P&gt;&lt;SPAN class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="tHiveRow2.PNG" style="width: 999px;"&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009LuUR.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/156747iC0687ACD02E6A7CF/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009LuUR.png" alt="0683p000009LuUR.png" /&gt;&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;tFileInputDelimited&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="tFile.PNG" style="width: 999px;"&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009Ltxv.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/132959i43FDCC8001E97B76/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009Ltxv.png" alt="0683p000009Ltxv.png" /&gt;&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;tHDFSOutput&lt;/SPAN&gt;&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;&lt;SPAN class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="hdfs.PNG" style="width: 996px;"&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009LuUb.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/140580iCE5FC328C7B4CFF8/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009LuUb.png" alt="0683p000009LuUb.png" /&gt;&lt;/span&gt;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt; 
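&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;For the dynamically partitioned tables asked about in the question, the HiveQL issued through tHiveRow would also need Hive's dynamic-partition settings enabled. A minimal sketch; the table, column, and path names below are placeholders, not taken from the job shown here:&lt;/P&gt; 

```sql
-- Allow dynamic partitioning in this session
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- External table over the HDFS path the job writes to
-- (hypothetical names: sales, staging_sales, /user/talend/sales)
CREATE EXTERNAL TABLE IF NOT EXISTS sales (id INT, amount DOUBLE)
PARTITIONED BY (country STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ';'
LOCATION '/user/talend/sales';

-- The partition value is taken from the last column of the SELECT
INSERT OVERWRITE TABLE sales PARTITION (country)
SELECT id, amount, country FROM staging_sales;
```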
&lt;P&gt;1. tHDFSConnection: set up the connection to your Hadoop cluster.&lt;/P&gt; 
&lt;P&gt;2. HIVE component: set up the connection to Hive.&lt;/P&gt; 
&lt;P&gt;3. tHiveRow: drop the table if it already exists.&lt;/P&gt; 
&lt;P&gt;4. tHiveRow: create the external table.&lt;/P&gt; 
&lt;P&gt;5. tFileInputDelimited: read the source delimited file.&lt;/P&gt; 
&lt;P&gt;6. tHDFSOutput: write the data to the external table's HDFS path.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 15 May 2017 08:25:48 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278193#M53739</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-05-15T08:25:48Z</dc:date>
    </item>
    <item>
      <title>Re: Big Data Spark Job - Load data into Hive</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278194#M53740</link>
      <description>&lt;P&gt;Which version are you using? With&amp;nbsp;6.2.1, none of these tHive components are available; only tHiveConfiguration, tHiveInput, and tHiveOutput are there.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Data loads fine using the Spark job, but I am facing a problem when the Hive table is dynamically partitioned.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 16 May 2017 06:43:26 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278194#M53740</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-05-16T06:43:26Z</dc:date>
    </item>
    <item>
      <title>Re: Big Data Spark Job - Load data into Hive</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278195#M53741</link>
      <description>&lt;P&gt;I am using Talend Open Studio for Big Data, version TOS_BD-20150508_1414-V5.6.2. Try downloading those components from&amp;nbsp;&lt;A href="https://exchange.talend.com/" target="_blank" rel="nofollow noopener noreferrer"&gt;https://exchange.talend.com/&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 17 May 2017 13:13:58 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-Data-Spark-Job-Load-data-into-Hive/m-p/2278195#M53741</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-05-17T13:13:58Z</dc:date>
    </item>
  </channel>
</rss>

