albert1706
Contributor

tKafkaInput error when executing the job in TAC

Hi everyone,

As you can see in the attached picture kafka1.png, I have a very simple job: just a tKafkaInput and a log component. I deployed it to TAC and ran it. For several seconds the log looks fine and some Kafka messages appear, but then the job stops with the error below. I noticed the error only occurs once tKafkaInput has consumed at least one message; if there are no messages at all, the job keeps running fine. I also tested with a simple message like "hello world": the message appears in the log and then the same error follows. Does anybody know a solution?

[INFO ] 02:48:43 org.apache.spark.streaming.scheduler.JobScheduler- Added jobs for time 1671738523000 ms
[ERROR] 02:48:43 org.apache.spark.executor.Executor- Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NullPointerException: null
at anteraja_prd.kafka_consumer_log_0_5.KAFKA_CONSUMER_LOG$tKafkaInput_1_Function.call(KAFKA_CONSUMER_LOG.java:393) ~[kafka_consumer_log_0_5.jar:?]
at anteraja_prd.kafka_consumer_log_0_5.KAFKA_CONSUMER_LOG$tKafkaInput_1_Function.call(KAFKA_CONSUMER_LOG.java:1) ~[kafka_consumer_log_0_5.jar:?]
at org.apache.spark.api.java.JavaPairRDD$.$anonfun$pairFunToScalaFun$1(JavaPairRDD.scala:1044) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459) ~[scala-library-2.12.10.jar:?]
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459) ~[scala-library-2.12.10.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:941) ~[scala-library-2.12.10.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:941) ~[scala-library-2.12.10.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1429) ~[scala-library-2.12.10.jar:?]
at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62) ~[scala-library-2.12.10.jar:?]
at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53) ~[scala-library-2.12.10.jar:?]
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105) ~[scala-library-2.12.10.jar:?]
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313) ~[scala-library-2.12.10.jar:?]
at scala.collection.AbstractIterator.to(Iterator.scala:1429) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307) ~[scala-library-2.12.10.jar:?]
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1429) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294) ~[scala-library-2.12.10.jar:?]
at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288) ~[scala-library-2.12.10.jar:?]
at scala.collection.AbstractIterator.toArray(Iterator.scala:1429) ~[scala-library-2.12.10.jar:?]
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1004) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2154) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.scheduler.Task.run(Task.scala:127) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:463) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377) ~[spark-core_2.12-3.0.3.jar:3.0.3]
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:466) [spark-core_2.12-3.0.3.jar:3.0.3]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_322]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_322]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_322]
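
To check whether the records themselves are the problem, I can also read the same topic outside of Talend with a plain kafka-clients consumer. This is only a diagnostic sketch; the broker address, topic name and String deserializers below are placeholders for my actual setup. It simply prints each record's key and whether its value is null:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class TopicProbe {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "topic-probe");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            for (int i = 0; i < 10; i++) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(2))) {
                    // A record with a key but a null value would be suspicious here.
                    System.out.printf("offset=%d key=%s valueIsNull=%b%n",
                            record.offset(), record.key(), record.value() == null);
                }
            }
        }
    }
}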

4 Replies
Anonymous
Not applicable

Hello,

We need a little bit more information to address your issue.

Is it a standard job or a Big Data streaming job? Could you please confirm whether you are using Apache Kafka or Confluent Kafka?

Are there any more error messages in the TAC technical or execution logs?

Best regards

Sabrina

albert1706
Contributor
Author

Hi,

Thanks for replying,

It turned out to be a bug in Talend Studio (a record with a null body but an existing key on the topic triggers the NullPointerException). I have already created a ticket for this issue.
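
For anyone who hits the same stack trace: my understanding (I have not inspected the generated KAFKA_CONSUMER_LOG.java, so this is only an illustrative sketch, not Talend's actual code) is that the generated tKafkaInput function converts the record payload without checking for null, so a record that has a key but a null body fails in exactly this way:

import java.nio.charset.StandardCharsets;

// Illustration only; this is NOT Talend's generated code.
public class NullBodyDemo {
    static String toLine(byte[] payload) {
        // Throws NullPointerException when payload is null.
        return new String(payload, StandardCharsets.UTF_8);
    }

    static String toLineSafe(byte[] payload) {
        // The kind of guard that would avoid the crash.
        return payload == null ? null : new String(payload, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(toLine("hello world".getBytes(StandardCharsets.UTF_8))); // fine
        System.out.println(toLineSafe(null)); // prints "null"
        System.out.println(toLine(null));     // NullPointerException, like the job
    }
}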

albert1706
Contributor
Author

It is a Big Data streaming job built with Talend Studio v8.0.1, and the messages are written to Kafka by the Debezium MongoDB CDC connector.
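
If the null-body records are the tombstones Debezium emits after a delete, and no downstream consumer relies on them for log compaction, one possible workaround while the Studio bug is open is to stop the connector from producing them via the tombstones.on.delete property. The fragment below shows only the relevant part of a Kafka Connect properties file; the connector name is assumed, all connection settings are omitted, and property names can vary between Debezium versions:

# Fragment of a Debezium MongoDB source connector configuration (Kafka Connect properties form).
# Connection settings omitted; this is not my actual configuration.
name=mongodb-cdc-source
connector.class=io.debezium.connector.mongodb.MongoDbConnector
tombstones.on.delete=false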

Anonymous
Not applicable

Hello,

Thanks for your prompt reply. Have you already raised a JIRA issue on the Talend bug tracker or a support case on the Talend support portal?

Best regards

Sabrina