<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Exception in component tSparkConnection_1 java.io.FileNotFoundExceptio in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243062#M29690</link>
    <description>I figured it out myself.&lt;BR /&gt;I had to change the JVM temp location to a different directory, as below, using a tJava component, and I was then able to connect to Spark.</description>
    <pubDate>Tue, 01 Sep 2015 10:38:18 GMT</pubDate>
    <dc:creator>_AnonymousUser</dc:creator>
    <dc:date>2015-09-01T10:38:18Z</dc:date>
    <item>
      <title>Exception in component tSparkConnection_1 java.io.FileNotFoundExceptio</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243061#M29689</link>
      <description>Hi, 
&lt;BR /&gt;I am getting an error when trying to establish a Spark connection. 
&lt;BR /&gt;Starting job SparkDemoJob at 11:42 01/09/2015. 
&lt;BR /&gt; connecting to socket on port 4010 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.spark.SecurityManager - Changing view acls to: GRM5KOR 
&lt;BR /&gt;: org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(GRM5KOR) 
&lt;BR /&gt;: akka.event.slf4j.Slf4jLogger - Slf4jLogger started 
&lt;BR /&gt;: Remoting - Starting remoting 
&lt;BR /&gt;: Remoting - Remoting started; listening on addresses : 
&lt;BR /&gt;: Remoting - Remoting now listens on addresses: 
&lt;BR /&gt;: org.apache.spark.SparkEnv - Registering MapOutputTracker 
&lt;BR /&gt;: org.apache.spark.SparkEnv - Registering BlockManagerMaster 
&lt;BR /&gt;: org.apache.spark.storage.DiskBlockManager - Created local directory at C:\Users\GRM5KOR\AppData\Local\Temp\spark-local-20150901114228-6989 
&lt;BR /&gt;: org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 546.3 MB. 
&lt;BR /&gt;: org.apache.spark.network.ConnectionManager - Bound socket to port 60526 with id = ConnectionManagerId(BMHE1056048.BMH.APAC.BOSCH.COM,60526) 
&lt;BR /&gt;: org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager 
&lt;BR /&gt;: org.apache.spark.storage.BlockManagerInfo - Registering block manager BMHE1056048.BMH.APAC.BOSCH.COM:60526 with 546.3 MB RAM 
&lt;BR /&gt;: org.apache.spark.storage.BlockManagerMaster - Registered BlockManager 
&lt;BR /&gt;: org.apache.spark.HttpServer - Starting HTTP Server 
&lt;BR /&gt;: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT 
&lt;BR /&gt;: org.eclipse.jetty.server.AbstractConnector - Started 
&lt;A href="mailto:SocketConnector@0.0.0.0:60527" target="_blank" rel="nofollow noopener noreferrer"&gt;SocketConnector@0.0.0.0:60527&lt;/A&gt; 
&lt;BR /&gt;: org.apache.spark.broadcast.HttpBroadcast - Broadcast server started at 
&lt;BR /&gt;: org.apache.spark.HttpFileServer - HTTP File server directory is C:\Users\GRM5KOR\AppData\Local\Temp\spark-8266001c-03e7-40c4-9c5f-eb4895fcd977 
&lt;BR /&gt;: org.apache.spark.HttpServer - Starting HTTP Server 
&lt;BR /&gt;: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT 
&lt;BR /&gt;: org.eclipse.jetty.server.AbstractConnector - Started 
&lt;A href="mailto:SocketConnector@0.0.0.0:60528" target="_blank" rel="nofollow noopener noreferrer"&gt;SocketConnector@0.0.0.0:60528&lt;/A&gt; 
&lt;BR /&gt;: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT 
&lt;BR /&gt;: org.eclipse.jetty.server.AbstractConnector - Started 
&lt;A href="mailto:SelectChannelConnector@0.0.0.0:4040" target="_blank" rel="nofollow noopener noreferrer"&gt;SelectChannelConnector@0.0.0.0:4040&lt;/A&gt; 
&lt;BR /&gt;: org.apache.spark.ui.SparkUI - Started SparkUI at 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path 
&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:324) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:339) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:332) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.StringUtils.&amp;lt;clinit&amp;gt;(StringUtils.java:78) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:77) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:256) 
&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:284) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.deploy.SparkHadoopUtil.&amp;lt;init&amp;gt;(SparkHadoopUtil.scala:36) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.deploy.SparkHadoopUtil$.&amp;lt;init&amp;gt;(SparkHadoopUtil.scala:109) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.deploy.SparkHadoopUtil$.&amp;lt;clinit&amp;gt;(SparkHadoopUtil.scala) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:228) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.StreamingContext.&amp;lt;init&amp;gt;(StreamingContext.scala:75) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.api.java.JavaStreamingContext.&amp;lt;init&amp;gt;(JavaStreamingContext.scala:130) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.tSparkConnection_1Process(SparkDemoJob.java:666) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.runJobInTOS(SparkDemoJob.java:1023) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.main(SparkDemoJob.java:880) 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087951886 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087953230 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087953754 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954321 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954818 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954979 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087955401 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087955966 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087956631 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957092 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957256 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957623 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958014 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958280 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958434 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958445 
&lt;BR /&gt;: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958656 
&lt;BR /&gt;Exception in component tSparkConnection_1 
&lt;BR /&gt;java.io.FileNotFoundException: \Users\GRM5KOR\AppData\Local\Temp\routines_SPARKPROJECTTALEND_SparkDemoJob_1.jar (The system cannot find the path specified) 
&lt;BR /&gt;&amp;nbsp;at java.io.FileInputStream.open(Native Method) 
&lt;BR /&gt;&amp;nbsp;at java.io.FileInputStream.&amp;lt;init&amp;gt;(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;at com.google.common.io.Files$FileByteSource.openStream(Files.java:124) 
&lt;BR /&gt;&amp;nbsp;at com.google.common.io.Files$FileByteSource.openStream(Files.java:114) 
&lt;BR /&gt;&amp;nbsp;at com.google.common.io.ByteSource.copyTo(ByteSource.java:202) 
&lt;BR /&gt;&amp;nbsp;at com.google.common.io.Files.copy(Files.java:436) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.HttpFileServer.addFileToDir(HttpFileServer.scala:62) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.HttpFileServer.addJar(HttpFileServer.scala:57) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.SparkContext.addJar(SparkContext.scala:944) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:265) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:265) 
&lt;BR /&gt;&amp;nbsp;at scala.collection.immutable.List.foreach(List.scala:318) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:265) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.StreamingContext.&amp;lt;init&amp;gt;(StreamingContext.scala:75) 
&lt;BR /&gt;&amp;nbsp;at org.apache.spark.streaming.api.java.JavaStreamingContext.&amp;lt;init&amp;gt;(JavaStreamingContext.scala:130) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.tSparkConnection_1Process(SparkDemoJob.java:666) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.runJobInTOS(SparkDemoJob.java:1023) 
&lt;BR /&gt;&amp;nbsp;at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.main(SparkDemoJob.java:880) 
&lt;BR /&gt;Job SparkDemoJob ended at 11:42 01/09/2015. 
&lt;BR /&gt;In short, the error is a file-not-found: 
&lt;BR /&gt;Exception in component tSparkConnection_1 
&lt;BR /&gt;java.io.FileNotFoundException: \Users\GRM5KOR\AppData\Local\Temp\routines_SPARKPROJECTTALEND_SparkDemoJob_1.jar 
&lt;BR /&gt;Please help me resolve this.</description>
      <pubDate>Sat, 16 Nov 2024 11:04:11 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243061#M29689</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2024-11-16T11:04:11Z</dc:date>
    </item>
    <item>
      <title>Re: Exception in component tSparkConnection_1 java.io.FileNotFoundExceptio</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243062#M29690</link>
      <description>I figured it out myself.&lt;BR /&gt;I had to change the JVM temp location to a different directory, as below, using a tJava component, and I was then able to connect to Spark.</description>
      <pubDate>Tue, 01 Sep 2015 10:38:18 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243062#M29690</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2015-09-01T10:38:18Z</dc:date>
    </item>
    <item>
      <title>Re: Exception in component tSparkConnection_1 java.io.FileNotFoundExceptio</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243063#M29691</link>
      <description>Code used in tJava:
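For context, a hedged, self-contained sketch of what the one-liner does (the class name TmpDirWorkaround and the use of the working directory are illustrative, not from the job; the actual fix uses "d:/temp"): repointing java.io.tmpdir before tSparkConnection runs makes Spark stage the routines JAR in a directory that exists and is writable.

```java
// Illustrative sketch only: class name and target path are hypothetical.
// The idea from this thread: repoint the JVM temp directory before the
// Spark connection is created, so the routines JAR can be staged there.
public class TmpDirWorkaround {
    public static void main(String[] args) {
        // The working directory is used here so the sketch runs anywhere;
        // the original post uses "d:/temp". The path must already exist.
        String target = System.getProperty("user.dir");
        System.setProperty("java.io.tmpdir", target);
        // Anything reading java.io.tmpdir from now on sees the new path.
        System.out.println(System.getProperty("java.io.tmpdir"));
    }
}
```

In the Talend job itself only the single System.setProperty line is needed, placed in a tJava component wired to execute before tSparkConnection_1.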
&lt;BR /&gt;System.setProperty("java.io.tmpdir", "d:/temp"); // redirect the JVM temp dir to an existing, writable path</description>
      <pubDate>Tue, 01 Sep 2015 10:39:43 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Exception-in-component-tSparkConnection-1-java-io/m-p/2243063#M29691</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2015-09-01T10:39:43Z</dc:date>
    </item>
  </channel>
</rss>

