Hi,
I'm getting the error below when adding the tRedshiftConfiguration component and submitting the job as a Spark job.
Is there any resolution or workaround?
[ERROR]: org.apache.spark.SparkContext - Error initializing SparkContext.
java.lang.NullPointerException
at scala.collection.mutable.ArrayOps$ofRef$.newBuilder$extension(ArrayOps.scala:190)
at scala.collection.mutable.ArrayOps$ofRef.newBuilder(ArrayOps.scala:186)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:246)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.mutable.ArrayOps$ofRef.filter(ArrayOps.scala:186)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6.apply(Client.scala:529)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$6.apply(Client.scala:525)
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:525)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:863)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at .ld2parquet_0_1.LdParquet.runJobInTOS(LdParquet.java:1048)
at .ld2parquet_0_1.LdParquet.main(LdParquet.java:941)
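For context, the trace shows the NullPointerException is thrown inside Client.prepareLocalResources while the YARN client filters the list of local jar paths. One possible cause (my assumption, not confirmed) is an empty or missing entry in the jar list the job hands to YARN. Below is a minimal pre-submission check, assuming the standard spark.jars and spark.yarn.dist.jars keys; the keys and the class are illustrative only, not taken from the generated Talend code.

import java.io.File;
import org.apache.spark.SparkConf;

// Sketch: flag empty or missing entries in the jar lists before submitting to YARN.
public class JarListCheck {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf(); // picks up -Dspark.* properties / spark-defaults
        for (String key : new String[] { "spark.jars", "spark.yarn.dist.jars" }) {
            String value = conf.get(key, "");
            if (value.isEmpty()) {
                System.out.println(key + " is not set");
                continue;
            }
            for (String path : value.split(",")) {
                String trimmed = path.trim();
                if (trimmed.isEmpty()) {
                    System.out.println(key + " contains an empty entry");
                } else if (trimmed.startsWith("/") || trimmed.startsWith("file:")) {
                    File jar = new File(trimmed.replaceFirst("^file:(//)?", ""));
                    if (!jar.exists()) {
                        System.out.println("Local jar not found: " + trimmed);
                    }
                }
            }
        }
    }
}

Running this with the same properties as the failing job would show whether one of the distributed jars is blank or points to a file that does not exist.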
Hi,
The job is throwing a NullPointerException. Could you please double-check whether all the values have been provided in the Redshift and Spark configuration?
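As a minimal sketch of the kind of check I mean, assuming the Redshift values end up in plain strings (the parameter names below are placeholders, not the component's real property names):

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: report any Redshift connection value that is null or empty.
public class RedshiftParamCheck {
    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("host", System.getProperty("redshift.host"));
        params.put("port", System.getProperty("redshift.port"));
        params.put("database", System.getProperty("redshift.database"));
        params.put("user", System.getProperty("redshift.user"));
        params.put("password", System.getProperty("redshift.password"));
        params.put("schema", System.getProperty("redshift.schema"));
        params.put("s3TempDir", System.getProperty("redshift.s3TempDir"));

        params.forEach((name, value) -> {
            if (value == null || value.trim().isEmpty()) {
                System.out.println("Missing or empty Redshift parameter: " + name);
            }
        });
    }
}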
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved.
Hi Nikhil,
The job runs fine with the current Spark configuration; I get the above error only when adding the tRedshiftConfiguration component. All the Redshift parameters are provided, and none of the connection parameters are empty or null.
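To narrow it down, I can also dump the effective Spark configuration for both runs (with and without tRedshiftConfiguration) and diff the output. A minimal sketch, where the SparkConf below stands in for the one the generated job actually builds:

import org.apache.spark.SparkConf;
import scala.Tuple2;

// Sketch: print every effective SparkConf entry so the two runs can be diffed.
public class DumpSparkConf {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf(); // stand-in for the job's real configuration
        for (Tuple2<String, String> entry : conf.getAll()) {
            // Note: mask passwords/credentials before sharing the output.
            System.out.println(entry._1() + " = " + entry._2());
        }
    }
}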
Hi,
I am not able to identify the root cause from the current error message. I would recommend creating a support ticket so that the support team can check the possible root cause remotely in a screen-sharing session.
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved.
I had the same issue. It is fixed in the latest patch:
Patch_20200407_TPS-3939_v1-7.1.1.zip