sakura99
Contributor III

How to set a parameter for java.lang.ClassCastException: scala.None$ cannot be cast to scala.collection.Seq

Hello, I have been facing a Talend issue like this:

[WARN ]: org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 0.0 (TID 0, xxx.com, executor 1): java.io.IOException: java.lang.ClassCastException: scala.None$ cannot be cast to scala.collection.Seq
	at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1381)
	at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2177)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2068)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2286)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2068)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1572)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:375)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: scala.None$ cannot be cast to scala.collection.Seq
	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1$$anonfun$apply$mcV$sp$2.apply(ParallelCollectionRDD.scala:80)
	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1$$anonfun$apply$mcV$sp$2.apply(ParallelCollectionRDD.scala:80)
	at org.apache.spark.util.Utils$.deserializeViaNestedStream(Utils.scala:211)
	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:80)
	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
	at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1374)
	... 20 more

[ERROR]: org.apache.spark.scheduler.TaskSetManager - Task 0 in stage 0.0 failed 4 times; aborting job

Does anyone have an idea of what is happening here? Can we fix this problem by adding a parameter in Talend, or something similar?
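For context: a ClassCastException of scala.None$ to scala.collection.Seq inside ParallelCollectionPartition.readObject is often reported as a symptom of mismatched Spark or Scala library versions between the driver (the Talend job's classpath) and the cluster's executors, rather than something a single job parameter controls. Below is a minimal diagnostic sketch, assuming a plain Scala Spark job submitted with the same libraries as the Talend job; the object name and app name are illustrative and not from the original post. It prints the Spark and Scala versions seen by the driver and by the executors so the two can be compared.

import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("version-check"))

    // Driver side: the Spark and Scala versions the job was built and launched with.
    println(s"Driver:   Spark ${sc.version}, Scala ${scala.util.Properties.versionString}")

    // Executor side: sc.parallelize serializes ParallelCollectionRDD partitions,
    // the same code path that fails in the stack trace above. If the versions
    // printed here differ from the driver's, the ClassCastException is expected.
    val executorVersions = sc.parallelize(1 to sc.defaultParallelism)
      .map(_ => (org.apache.spark.SPARK_VERSION, scala.util.Properties.versionString))
      .distinct()
      .collect()

    executorVersions.foreach { case (sparkVersion, scalaVersion) =>
      println(s"Executor: Spark $sparkVersion, Scala $scalaVersion")
    }

    sc.stop()
  }
}

If the sketch fails with the same ClassCastException before it can print the executor versions, that in itself points at the executor classpath rather than at the Talend components.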

3 Replies
Anonymous
Not applicable

Hello,

Are there any detailed steps to reproduce your issue? What does your job look like? Could you upload a screenshot of your job?

Best regards

Sabrina

sakura99
Contributor III
Author

Dear Sabrina,

Thank you for your time. I reinstalled my Talend and the issue was resolved.

Anonymous
Not applicable

Hello,

Thanks for your feedback and for letting us know you have resolved this issue yourself.

Best regards

Sabrina