MoeyE
Partner - Creator III

Databricks CDC errors and latency

Hi guys, 
 
I just want to check if anyone has seen this issue before.
 
We've been having CDC issues and high latency. We checked the logs and saw the error below repeatedly for insert, update, delete, and merge operations; it has been occurring for about a month. We've restarted the cluster and refreshed the tables as suggested in the error message, but the error still persists. Source and handling latency are about 8 hours each. As a next step I might increase the source_capture, target_apply, sorter and stream logging levels to try to find more information about the cause of these errors.
 
Failed (retcode -1) to execute statement: 'INSERT INTO `table`
 
RetCode: SQL_ERROR  SqlState: HY000 NativeError: 35 Message: [Simba][Hardy] (35) Error from server: error code: '0' error message: 'org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3583754.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3583754.0 (TID 6325042) (10.30.144.133 executor 560): com.databricks.sql.io.FileReadException: Error while reading file abfss:REDACTED_LOCAL_PART@someplace.dfs.core.windows.net/someplaceelse/attrep_changes6305620283B82FB1/CDC00000002.csv.gz. [DEFAULT_FILE_NOT_FOUND] It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved. If disk cache is stale or the underlying files have been removed, you can invalidate disk cache manually by restarting the cluster.
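 
For reference, the cache invalidation that the error message suggests looks roughly like the sketch below. This is only an illustration run from a Databricks notebook or SQL editor; `my_schema`.`my_table` is a placeholder, not the actual target table name.
 
-- Refresh Spark's cached metadata and file listing for the target table
-- (placeholder name; substitute the real Replicate target table).
REFRESH TABLE `my_schema`.`my_table`;
-- If the disk cache itself is stale, the error message indicates the manual
-- way to invalidate it is to restart the cluster, which we have also done.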
 
Regards,
Mohammed
1 Reply
john_wang
Support

Hello @MoeyE ,

Thanks for reaching out to Qlik Community!

What Qlik Replicate version are you running, please? Also, please share more lines around the error; I'd like to see what SQL or operations Qlik Replicate is trying to perform at that point.

Thank you,

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!