<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Getting a runtime exception in studio - Unsupported class file major version 55 in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Getting-a-runtime-exception-in-studio-Unsupported-class-file/m-p/2326415#M95904</link>
    <description>&lt;P&gt;My Talend Studio is on Java 11.0.9 2020-10-20 LTS. While running a big data batch job that uses tJava/JavaRDD, I get the following error. Any ideas on how to fix this?&lt;/P&gt;&lt;P&gt;Note: I have plenty of Maven dependencies, since I use the Azure SDK (azure-storage-file-datalake, azure-core, azure-identity, etc.).&lt;/P&gt;&lt;P&gt;java.lang.IllegalArgumentException: Unsupported class file major version 55&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:166)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:148)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:136)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:237)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)&lt;/P&gt;&lt;P&gt;	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)&lt;/P&gt;&lt;P&gt;	at scala.collection.immutable.List.foreach(List.scala:381)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD.collect(RDD.scala:944)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)&lt;/P&gt;</description>
    <pubDate>Sat, 16 Nov 2024 00:35:56 GMT</pubDate>
    <dc:creator>SI3</dc:creator>
    <dc:date>2024-11-16T00:35:56Z</dc:date>
    <item>
      <title>Getting a runtime exception in studio - Unsupported class file major version 55</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Getting-a-runtime-exception-in-studio-Unsupported-class-file/m-p/2326415#M95904</link>
      <description>&lt;P&gt;My Talend Studio is on Java 11.0.9 2020-10-20 LTS. While running a big data batch job that uses tJava/JavaRDD, I get the following error. Any ideas on how to fix this?&lt;/P&gt;&lt;P&gt;Note: I have plenty of Maven dependencies, since I use the Azure SDK (azure-storage-file-datalake, azure-core, azure-identity, etc.).&lt;/P&gt;&lt;P&gt;java.lang.IllegalArgumentException: Unsupported class file major version 55&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:166)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:148)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:136)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.&amp;lt;init&amp;gt;(ClassReader.java:237)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)&lt;/P&gt;&lt;P&gt;	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)&lt;/P&gt;&lt;P&gt;	at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)&lt;/P&gt;&lt;P&gt;	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)&lt;/P&gt;&lt;P&gt;	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)&lt;/P&gt;&lt;P&gt;	at scala.collection.immutable.List.foreach(List.scala:381)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.rdd.RDD.collect(RDD.scala:944)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 00:35:56 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Getting-a-runtime-exception-in-studio-Unsupported-class-file/m-p/2326415#M95904</guid>
      <dc:creator>SI3</dc:creator>
      <dc:date>2024-11-16T00:35:56Z</dc:date>
    </item>
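    <!--
      Class file major version 55 is Java 11 bytecode (52 is Java 8, 53 is Java 9, 54 is
      Java 10). The stack trace shows Spark 2.4's ClosureCleaner parsing job classes through
      the shaded org.apache.xbean.asm6 ClassReader, an ASM 6.x build that predates Java 11
      and therefore rejects major version 55. A minimal Java sketch, assuming only the JDK,
      for finding which compiled class (the job's own classes or a dependency's) carries
      Java 11 bytecode; the class name ClassVersionCheck is illustrative, not part of Talend
      or Spark:

        import java.io.DataInputStream;
        import java.io.FileInputStream;
        import java.io.IOException;

        public class ClassVersionCheck {
            public static void main(String[] args) throws IOException {
                try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                    if (in.readInt() != 0xCAFEBABE) {      // every class file starts with this magic number
                        System.err.println(args[0] + " is not a class file");
                        return;
                    }
                    int minor = in.readUnsignedShort();    // minor version
                    int major = in.readUnsignedShort();    // major version: 55 means Java 11
                    System.out.println(args[0] + ": major version " + major + "." + minor);
                }
            }
        }

      Running it against the generated job class (e.g. "java ClassVersionCheck MyJob.class",
      where MyJob.class is a placeholder) prints 55 when Studio compiled the job with
      Java 11 compliance.
    -->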
    <item>
      <title>Re: Getting a runtime exception in studio - Unsupported class file major version 55</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Getting-a-runtime-exception-in-studio-Unsupported-class-file/m-p/2326416#M95905</link>
      <description>&lt;P&gt;This looks like it may be connected to Spark 2.4. Can you give a bit more detail? Search results online suggest upgrading to Spark 3.0.&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 12:11:40 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Getting-a-runtime-exception-in-studio-Unsupported-class-file/m-p/2326416#M95905</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2021-05-28T12:11:40Z</dc:date>
    </item>
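    <!--
      Upgrading to Spark 3.x resolves this because Spark 3 bundles an ASM release that
      understands Java 11 class files. If staying on Spark 2.4, the usual workaround is to
      keep the job bytecode at major version 52 by compiling for Java 8, e.g. setting the
      Studio project's Java compiler compliance level to 1.8 (or "javac -source 1.8
      -target 1.8" outside Studio). Below is a hypothetical pre-flight check for a tJava
      routine that fails fast with a clearer message than the ASM exception;
      SparkBytecodeGuard and MAX_ASM6_MAJOR are illustrative names, not Talend or Spark API:

        import java.io.DataInputStream;
        import java.io.InputStream;

        public class SparkBytecodeGuard {
            // ASM 6.x reads class files up to roughly Java 10 (major 54);
            // the exact ceiling depends on the minor ASM release.
            private static final int MAX_ASM6_MAJOR = 54;

            public static void check(Class<?> jobClass) throws Exception {
                String resource = "/" + jobClass.getName().replace('.', '/') + ".class";
                InputStream raw = jobClass.getResourceAsStream(resource);
                if (raw == null) {
                    throw new IllegalStateException("no bytecode found for " + jobClass);
                }
                try (DataInputStream in = new DataInputStream(raw)) {
                    in.readInt();                  // skip the 0xCAFEBABE magic number
                    in.readUnsignedShort();        // skip the minor version
                    int major = in.readUnsignedShort();
                    if (major > MAX_ASM6_MAJOR) {
                        throw new IllegalStateException(jobClass + " compiled at major version "
                                + major + "; recompile for Java 8 or move to Spark 3.x");
                    }
                }
            }
        }

      Calling SparkBytecodeGuard.check(MyJob.class) at the start of the job surfaces the
      bytecode mismatch before Spark's ClosureCleaner does.
    -->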
  </channel>
</rss>