<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Reading a json external table into spark job issue in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330448#M99524</link>
    <description>&lt;P&gt;use the row name row1.filename&lt;/P&gt;</description>
    <pubDate>Tue, 03 Jul 2018 04:54:03 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2018-07-03T04:54:03Z</dc:date>
    <item>
      <title>Reading a json external table into spark job issue</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330447#M99523</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have a Spark job with a Hive query that reads an external table (JSON format), but when I try to select the column INPUT__FILE__NAME I get this error:&lt;/P&gt;
&lt;P&gt;hive org.apache.spark.sql.AnalysisException: cannot resolve '`INPUT__FILE__NAME`'&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;My job&lt;/P&gt;
&lt;P&gt;tHiveInput -&amp;gt; tMap -&amp;gt; tHiveOutput&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;How can I get the file name?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;My config:&lt;/P&gt;
&lt;P&gt;Talend Big Data Batch Platform 6.5.1&lt;/P&gt;
&lt;P&gt;Cloudera 5.12&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks&lt;/P&gt;
&lt;P&gt;smer&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 08:00:48 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330447#M99523</guid>
      <dc:creator>smer</dc:creator>
      <dc:date>2024-11-16T08:00:48Z</dc:date>
    </item>
    <item>
      <title>Re: Reading a json external table into spark job issue</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330448#M99524</link>
      <description>&lt;P&gt;use the row name row1.filename&lt;/P&gt;</description>
      <pubDate>Tue, 03 Jul 2018 04:54:03 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330448#M99524</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-07-03T04:54:03Z</dc:date>
    </item>
    <item>
      <title>Re: Reading a json external table into spark job issue</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330449#M99525</link>
      <description>Hello jcruie, 
&lt;BR /&gt; 
&lt;BR /&gt;This solution is not working, because row1.filename is unknown. 
&lt;BR /&gt;my job is 
&lt;BR /&gt;tHiveInput -&amp;gt; (row1) -&amp;gt; tMap -&amp;gt; (row2) -&amp;gt; tHiveOutput 
&lt;BR /&gt;I use row1.filename in the tMap and in the Hive query, but it is not working. 
&lt;BR /&gt; 
&lt;BR /&gt;Regards 
&lt;BR /&gt;smer 
&lt;BR /&gt; 
&lt;BR /&gt; 
&lt;BR /&gt;</description>
      <pubDate>Tue, 03 Jul 2018 11:22:58 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330449#M99525</guid>
      <dc:creator>smer</dc:creator>
      <dc:date>2018-07-03T11:22:58Z</dc:date>
    </item>
    <item>
      <title>Re: Reading a json external table into spark job issue</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330450#M99526</link>
      <description>&lt;P&gt;If filename is in the tMap, you need to use row2.filename.&lt;/P&gt;</description>
      <pubDate>Tue, 03 Jul 2018 12:28:36 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330450#M99526</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-07-03T12:28:36Z</dc:date>
    </item>
    <item>
      <title>Re: Reading a json external table into spark job issue</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330451#M99527</link>
      <description>&lt;P&gt;The solution is that INPUT__FILE__NAME is a virtual Hive column that Spark SQL cannot resolve; I just use Spark's function&amp;nbsp;&lt;SPAN&gt;&amp;nbsp;input_file_name() instead, and that's it. &lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009MACn.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/154443iC5B8CACEF3D12C6A/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009MACn.png" alt="0683p000009MACn.png" /&gt;&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt; 
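A minimal sketch of that workaround, usable as the query in tHiveInput. The table name my_json_table and the output alias filename are placeholders for illustration; Spark SQL's input_file_name() function returns the path of the file backing each row, which stands in for Hive's virtual INPUT__FILE__NAME column that Spark SQL does not expose:

```sql
-- Spark SQL: add the source file path as a regular column so it can
-- flow through tMap as part of the row schema (row1.filename).
SELECT
  t.*,
  input_file_name() AS filename
FROM my_json_table t
```

The alias must also be declared in the tHiveInput schema so that downstream components can reference it.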
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 11 Feb 2019 15:30:46 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Reading-a-json-external-table-into-spark-job-issue/m-p/2330451#M99527</guid>
      <dc:creator>smer</dc:creator>
      <dc:date>2019-02-11T15:30:46Z</dc:date>
    </item>
  </channel>
</rss>

