smer
Contributor II

Issue reading a JSON external table in a Spark job

Hello,

 

I have a Spark job with a Hive query that reads an external table (JSON format), but when I try to get the column INPUT__FILE__NAME I get this error:

org.apache.spark.sql.AnalysisException: cannot resolve '`INPUT__FILE__NAME`'

 

My job:

tHiveInput -> tMap -> tHiveOutput

 

How can I get the file name?

 

My config:

Talend Big Data Batch Platform 6.5.1

Cloudera 5.12


Thanks

smer

 

1 Solution

Accepted Solutions
smer
Contributor II
Author

The solution: INPUT__FILE__NAME is a Hive virtual column, which Spark cannot resolve. I just used Spark's input_file_name() function instead, and that's it.
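For reference, a minimal sketch of the fix (the table name my_json_table and the alias source_file are hypothetical): instead of Hive's virtual column, Spark SQL provides the built-in input_file_name() function, which returns the path of the file each row was read from:

```sql
-- INPUT__FILE__NAME is a Hive-only virtual column and fails under Spark SQL;
-- the built-in input_file_name() function gives the same information:
SELECT input_file_name() AS source_file, t.*
FROM my_json_table t
```

In a Talend Big Data Batch job, this query goes into the tHiveInput component, with source_file mapped as an extra string column in the schema.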

 


4 Replies
Anonymous
Not applicable

Use the row name: row1.filename

smer
Contributor II
Contributor II
Author

Hello jcruie,

That solution is not working, because row1.filename is unknown.
My job is:
tHiveInput -> (row1) -> tMap -> (row2) -> tHiveOutput
I used row1.filename in the tMap and in the Hive query, but it is not working.

Regards
smer


Anonymous
Not applicable

If filename is in the tMap, you need to use row2.filename.
