SI3
Contributor

Getting a runtime exception in studio - Unsupported class file major version 55

My Talend Studio runs on Java 11.0.9 2020-10-20 LTS. While running a Big Data batch job using tJava/JavaRDD, I get the error below. Any ideas on how to fix this?

Note: I have plenty of Maven dependencies, since I use the Azure SDK (azure-storage-file-datalake, azure-core, azure-identity, etc.).

java.lang.IllegalArgumentException: Unsupported class file major version 55
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
    at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
    at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
    at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
    at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
    at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
    at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
    at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
    at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
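For context, "major version 55" refers to the class-file format: the Java release that produced a class is the major version minus 44, so 52 is Java 8 and 55 is Java 11. A minimal sketch (the class name ClassVersionCheck is just illustrative) that reads the class-file header directly, to confirm which release your compiled classes target:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersionCheck {

    // A class-file major version maps to the Java release as
    // release = major - 44 (52 -> Java 8, 55 -> Java 11).
    static int javaRelease(int majorVersion) {
        return majorVersion - 44;
    }

    // Reads the class-file header: 4-byte magic (0xCAFEBABE),
    // 2-byte minor version, then the 2-byte major version.
    static int readMajorVersion(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        if (data.readInt() != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        data.readUnsignedShort(); // minor version
        return data.readUnsignedShort(); // major version
    }

    public static void main(String[] args) throws IOException {
        // Inspect a class shipped with the running JVM; .class resources
        // remain accessible through the class loader on Java 9+.
        try (InputStream in = ClassLoader
                .getSystemResourceAsStream("java/lang/Object.class")) {
            int major = readMajorVersion(in);
            System.out.println("major version " + major
                    + " -> Java " + javaRelease(major));
        }
    }
}
```

Running this under your Studio JVM should report the Java 11 class-file level, matching the version Spark's bundled ASM 6 is rejecting.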

1 Reply
Anonymous

This looks like it may be connected to Spark 2.4: its bundled ASM 6 cannot read Java 11 class files (major version 55). Can you give a bit more detail? Reports online suggest upgrading to Spark 3.0, which supports Java 11.
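If upgrading Spark is not an option right away, a common workaround is to keep the job's own compiled classes at the Java 8 bytecode level (major version 52), which Spark 2.4's ASM 6 can read. A sketch assuming a Maven-driven build outside Studio (Talend Studio manages its own compiler settings, so check the project's JDK compliance level there instead):

```xml
<!-- pom.xml: compile user code to Java 8 bytecode (major version 52) -->
<properties>
  <maven.compiler.release>8</maven.compiler.release>
</properties>
```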