
java.io.FileNotFoundException: spark job on HDP
Failure when executing a Spark job on the HDP 2.5 sandbox. The job is submitted to YARN and fails with the error below. I tried changing the Spark "scratch" directory, but no luck.
Is there any way to override the C: default or skip it?
According to https://help.talend.com/reader/PJjYRkeHQCiEoH8eIxqbCw/R16fQAWCPE5UGCJK7~LOYA, the default is C: unless it is modified.
Application application_1600700049240_0026 failed 2 times due to AM Container for appattempt_1600700049240_0026_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://sandbox.hortonworks.com:8088/cluster/app/application_1600700049240_0026 Then click on links to logs of each attempt.
Diagnostics: File file:/C:/tmp/spark-eeaf58bf-c0e4-4a95-b87c-2564d59c0609/__spark_conf__2267956587040359783.zip does not exist
java.io.FileNotFoundException: File file:/C:/tmp/spark-eeaf58bf-c0e4-4a95-b87c-2564d59c0609/__spark_conf__2267956587040359783.zip does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:624)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:850)
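
The Diagnostics line suggests the __spark_conf__ archive was registered under a local Windows file: URI (file:/C:/tmp/...) rather than an HDFS path, which the Linux node managers cannot resolve. Below is a minimal sketch of how the scratch and staging directories could be pointed away from C: when building the Spark context by hand. The property names spark.local.dir and spark.yarn.stagingDir are standard Spark settings (the latter requires Spark 2.x), but the paths shown (D:/spark-tmp, hdfs:///user/spark/staging) are placeholders, and in Talend the equivalent knob would be the "scratch" directory field in the Spark configuration; treat this as an illustration under those assumptions, not a verified fix.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ScratchDirOverride {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("scratch-dir-override")
                .setMaster("yarn")
                // Driver-side scratch directory; replaces the C:\tmp default
                // on a Windows client (placeholder path).
                .set("spark.local.dir", "D:/spark-tmp")
                // HDFS directory where __spark_conf__*.zip is staged so the
                // YARN node managers read it from HDFS instead of a local
                // file:/C: path (Spark 2.x property; placeholder HDFS path).
                .set("spark.yarn.stagingDir", "hdfs:///user/spark/staging");

        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Trivial action just to force the context to start.
            System.out.println(sc.parallelize(Arrays.asList(1, 2, 3)).count());
        }
    }
}
```

Submitting in yarn mode like this still requires HADOOP_CONF_DIR to point at the cluster's client configs, so that fs.defaultFS resolves to HDFS rather than the local filesystem.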
