Naveenkumar3
Contributor

Unable to build Spark standalone job (Custom - Unsupported distribution)

Hi Team,

I am unable to build a Spark standalone job and get the error message below. Can anyone help me configure the Spark standalone cluster details in Talend and fix this issue?

Spark configuration details:

0693p000009J0cfAAC.png

Error message when building the job.

0693p000009J0czAAC.png

5 Replies
manodwhb
Champion II

@Naveenkumar Murugesan, you have a configuration issue in some component. Do you have a tFileInput component in your job? If possible, can you show the job?

Naveenkumar3
Contributor
Author

@Manohar B, yes, I have tFileInput components. Please find the screenshots below.

The job builds successfully in "local" mode, but the same job fails to build in "standalone" mode.

0693p000009J7zqAAC.png

 

tFileInputDelimited:

 

0693p000009J7zvAAC.png

manodwhb
Champion II

@Naveenkumar Murugesan, you need to define a storage configuration component: use an S3 connection or HDFS connection component, and read the files from that file system.

 

Thanks,

Manohar

Naveenkumar3
Contributor
Author

 

@Manohar B, we don't have a Hadoop cluster or a distributed file system such as S3 or HDFS. We are using an open-source Spark standalone cluster, and the file system is NFS (Network File System).

 

I have set the distribution to "Custom - unsupported". Do you have any idea what type of .zip file needs to be uploaded in the custom-unsupported wizard?

 

0693p000009J0cfAAC.png​ 

manodwhb
Champion II

@Naveenkumar Murugesan, have you looked at the link below? If not, please take a look:

 

https://help.talend.com/reader/Ey7t5jMFEKou4HLney5XcQ/x9D~iohZ3_lbTIoDcJDOaQ
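As a side note: since every node in your standalone cluster can see the same NFS mount, Spark itself can read the input with a plain file:// URI, with no HDFS or S3 needed. A minimal sketch, assuming the share is mounted at the same path (/mnt/nfs here) on the driver and on every worker, and that the standalone master listens on spark-master:7077 (the host, paths, and job class are placeholders, not taken from your job):

```shell
# Submit a job to the standalone master; the input is read via a
# file:// URI, which only works because /mnt/nfs is mounted at the
# same path on all worker nodes.
spark-submit \
  --master spark://spark-master:7077 \
  --class example.MyJob \
  /mnt/nfs/jobs/myjob.jar \
  file:///mnt/nfs/input/data.csv
```

The key point is that a file:// path must resolve identically on every executor; a shared NFS mount satisfies that, while a path that exists only on the driver machine would fail in standalone mode even though it works in local mode.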

 

Thanks,

Manohar