Hi, I cannot find the physical folder in which my job is writing temp data. I searched all my YARN datanodes but could not find it anywhere!
I'm running a Spark job with a tHDFSConfiguration component.
Can someone tell me where I should look for the "/tmp/test_latam_hdfs" folder?
Thanks
Hello,
Selecting the "Use datanode hostname" check box allows the Job to access datanodes via their hostnames rather than their IP addresses.
This sets the dfs.client.use.datanode.hostname property to true. When connecting to an S3N filesystem, you must select this check box.
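For reference, outside of the Job's check box the same behavior can be enabled directly in the HDFS client configuration. A minimal sketch, assuming you are editing the client-side hdfs-site.xml (file location and surrounding configuration vary by cluster):

```xml
<!-- Client-side hdfs-site.xml fragment (sketch): equivalent of ticking
     "Use datanode hostname" in the Job, so the client resolves datanodes
     by hostname instead of IP address. -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```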
Best regards
Sabrina