badri-nair
Contributor

Talend Spark job: join using tMap fails at runtime

Hi,

 

I have created a simple Big Data Spark job that uses a tMap to look up against a few files.

The job compiles fine, and the lookup HDFS files have contents in them.

But the job fails during execution with the error message below.

If anyone has faced a similar issue, please let me know how it was resolved.

 

 

org.talend.bigdata.dataflow.SpecException: Invalid input accessor: clsn_62.null
at org.talend.bigdata.dataflow.hmap.HMapSpec$JoinDef.deserialize(HMapSpec.java:1206)
at org.talend.bigdata.dataflow.hmap.HMapSpec$JoinDef.access$1800(HMapSpec.java:1122)
at org.talend.bigdata.dataflow.hmap.HMapSpec.joinKey(HMapSpec.java:563)
at org.talend.bigdata.dataflow.hmap.HMapSpecBuilder.joinKey(HMapSpecBuilder.java:171)
at t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec.tHiveInput_2Process(tdata_wh_BD_spec.java:2557)
at t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec.tFileInputDelimited_6_HDFSInputFormatProcess(tdata_wh_BD_spec.java:4458)
at t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec.run(tdata_wh_BD_spec.java:4856)
at t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec.runJobInTOS(tdata_wh_BD_spec.java:4672)
at t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec.main(tdata_wh_BD_spec.java:4554)
[ERROR]: t_data_wh.tdata_wh_bd_spec_0_1.tdata_wh_BD_spec - TalendJob: 'tdata_wh_BD_spec' - Failed with exit code: 1.

1 Solution

Accepted Solutions
manodwhb
Champion II

@badri-nair, check the link below. In short:

1. Make sure the input for this flow (row1) is not empty; this input can be a source file or a DB query.

2. Make sure you are not joining on multiple keys in one tMap; if you need to do so, use one tMap per key join.

http://talendexpert.com/talend-spark-error-2/
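The second check amounts to chaining the joins: one lookup per step rather than several keys in a single step. This is not Talend-generated code, just a minimal, hypothetical sketch of the "one join per key" idea in plain Python, with dicts standing in for the lookup files (all names invented):

```python
# Main flow rows, as a tFileInputDelimited or tHiveInput might deliver them.
main_rows = [
    {"id": 1, "dept_id": 10},
    {"id": 2, "dept_id": 20},
]

# One lookup per join key, mirroring one tMap per key join.
dept_lookup = {10: "Sales", 20: "Finance"}   # first join: on dept_id
name_lookup = {1: "Alice", 2: "Bob"}         # second join: on id

def join_one_key(rows, lookup, key, out_field):
    """Inner-join rows against a single-key lookup (one tMap's worth of work)."""
    return [
        {**row, out_field: lookup[row[key]]}
        for row in rows
        if row[key] in lookup   # rows with no match are dropped, like an inner join
    ]

# Chain the joins: each step resolves exactly one key.
step1 = join_one_key(main_rows, dept_lookup, "dept_id", "dept")
result = join_one_key(step1, name_lookup, "id", "name")
```

Each step takes the output of the previous one as its main flow, so no single join ever has to resolve more than one key.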


4 Replies

badri-nair
Contributor
Author

Thank you very much, Manohar.

I had to use 4 tMaps for the 4 joins, whereas the previous Standard job did it all in a single tMap. But it does work.

Thanks

Badri Nair

 

manodwhb
Champion II

@badri-nair, can you show your job design? You can first load the data into a tHashOutput and read it back with a tHashInput; if you need multiple lookups, you can use multiple tHashInputs.

badri-nair
Contributor
Author

Hi Manohar,

There are no tHash components for a Spark job, only tCache.

In the Standard job I had 4 tHashOutputs, with a tHashInput linked to each one of them. The 4 inputs were used as lookups in a single tMap.

It looks like that can't be done in a Spark job.

Thanks

Badri Nair

[Attachment: STD-job.JPG]