Chirag_
Partner - Contributor III

Issue Using Talend to Load Data into Delta Lake (Databricks) via JDBC: Auto-Commit Error

I want to ingest data from SQL Server into a Delta Lake table in Databricks, loading it via JDBC from Talend Studio. What I've done so far:

1. Configured Databricks and gathered the credentials needed for connectivity (SQL Warehouse, JDBC URL, and access token).

2. In Talend Studio, configured tJDBCConnection with the Databricks JDBC driver and the access token; the test connection was successful.

3. Created the matching target table in Databricks.
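
For context, tJDBCConnection is building a standard JDBC connection to the SQL Warehouse under the hood. Below is a minimal plain-Java sketch of the same connectivity test, with placeholder host, HTTP path, and token (recent Databricks drivers use the jdbc:databricks:// URL prefix; older Simba Spark builds use jdbc:spark://):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksConnectSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values: substitute your workspace host, the SQL Warehouse
        // HTTP path, and a personal access token. AuthMech=3 with UID=token is
        // the token-based auth mode of the Databricks JDBC driver.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;"
                + "httpPath=<sql-warehouse-http-path>;AuthMech=3;UID=token;PWD=<access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            rs.next();
            // Equivalent of Talend's "test connection" check.
            System.out.println("Connected: " + rs.getInt(1));
        }
    }
}
```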

Errors encountered when running the job:

1. JDBC output commit error: the job fails with "Cannot use commit while Connection is in auto-commit mode." (A minimal reproduction of this JDBC rule is sketched below.)

2. Commit logic in tJDBCOutput: there is no option to disable the component's commit logic, which is causing the issue.
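
For reference, error 1 is the JDBC contract itself: Connection.commit() is not allowed while auto-commit is enabled, whichever component issues it. A minimal reproduction sketch (placeholder URL):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class AutoCommitRepro {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute real workspace host, HTTP path, and token.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;"
                + "httpPath=<sql-warehouse-http-path>;AuthMech=3;UID=token;PWD=<access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            // Fresh JDBC connections start with auto-commit enabled.
            System.out.println("auto-commit = " + conn.getAutoCommit()); // prints: true

            try {
                conn.commit(); // illegal while auto-commit is on
            } catch (SQLException e) {
                // The Databricks driver reports error 10040 here, matching the Talend log.
                System.out.println("Expected failure: " + e.getMessage());
            }
        }
    }
}
```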

Has anyone faced a similar issue, and how did you solve it?

 

Detailed Error: [WARN ] 12:42:30 org.talend.components.jdbc.output.JDBCOutputInsertWriter- [Databricks][JDBC](10040) Cannot use commit while Connection is in auto-commit mode.


4 Replies
quentin-vigne
Partner - Creator II

Hello @Chirag_

In the tJDBCConnection component, go to the advanced parameters tab and uncheck "Use auto-commit".

Then, in tJDBCOutput, don't forget to check "Use an existing connection" so it reuses the connection you previously created without auto-commit.

 

Once that's done, you can use a tJDBCCommit to commit the changes.
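
In plain JDBC terms, this is roughly the pattern (placeholder URL; my_delta_table is a hypothetical table name for illustration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ManualCommitSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute real workspace details.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;"
                + "httpPath=<sql-warehouse-http-path>;AuthMech=3;UID=token;PWD=<access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            // tJDBCConnection with "Use auto-commit" unchecked:
            conn.setAutoCommit(false);

            // tJDBCOutput with "Use an existing connection" checked: rows are
            // written through this shared connection.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO my_delta_table (id, name) VALUES (?, ?)")) {
                ps.setInt(1, 1);
                ps.setString(2, "example");
                ps.executeUpdate();
            }

            // tJDBCCommit: legal now, because auto-commit is off.
            conn.commit();
        }
    }
}
```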

 

- Quentin.

Chirag_
Partner - Contributor III
Author

Hi, we tried that, but the job still issues the commit on its own.

I am attaching screenshots for your reference.

Job overview:

[screenshot: Chirag__0-1746447320494.png]

tJDBCConnection:

[screenshot: Chirag__1-1746447352465.png]

Error:

[screenshot: Chirag__2-1746447391706.png]

 

Error:

[FATAL] 17:42:22 megh_dev.databricks_etl_0_1.Databricks_ETL- tDBOutput_1 (java.sql.SQLException) [Databricks][JDBC](10040) Cannot use commit while Connection is in auto-commit mode.
org.talend.sdk.component.api.exception.ComponentException: (java.sql.SQLException) [Databricks][JDBC](10040) Cannot use commit while Connection is in auto-commit mode.
    at com.databricks.client.exceptions.ExceptionConverter.toSQLException(Unknown Source) ~[SparkJDBC42-2.6.32.1054.jar:?]
    at com.databricks.client.jdbc.common.SConnection.commit(Unknown Source) ~[SparkJDBC42-2.6.32.1054.jar:?]
    at org.talend.components.jdbc.output.JDBCOutputWriter.commitAndCloseAtLast(JDBCOutputWriter.java:216) ~[newjdbc-1.67.1.jar:?]
    at org.talend.components.jdbc.output.JDBCOutputInsertWriter.close(JDBCOutputInsertWriter.java:143) ~[newjdbc-1.67.1.jar:?]
    at org.talend.components.jdbc.output.OutputProcessor.release(OutputProcessor.java:261) ~[newjdbc-1.67.1.jar:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at org.talend.sdk.component.runtime.base.LifecycleImpl.doInvoke(LifecycleImpl.java:87) ~[component-runtime-impl-1.78.1.jar:?]
    at org.talend.sdk.component.runtime.base.LifecycleImpl.lambda$invoke$1(LifecycleImpl.java:62) ~[component-runtime-impl-1.78.1.jar:?]
    at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183) ~[?:?]
    at java.util.stream.ReferencePipeline$15$1.accept(ReferencePipeline.java:541) ~[?:?]
    at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179) ~[?:?]
    at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992) ~[?:?]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
    at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150) ~[?:?]
    at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173) ~[?:?]
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
    at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596) ~[?:?]
    at org.talend.sdk.component.runtime.base.LifecycleImpl.invoke(LifecycleImpl.java:62) ~[component-runtime-impl-1.78.1.jar:?]
    at org.talend.sdk.component.runtime.base.LifecycleImpl.stop(LifecycleImpl.java:58) ~[component-runtime-impl-1.78.1.jar:?]
    at org.talend.sdk.component.runtime.manager.chain.AutoChunkProcessor.stop(AutoChunkProcessor.java:58) ~[component-runtime-manager-1.78.1.jar:?]
    at megh_dev.databricks_etl_0_1.Databricks_ETL.tDBInput_1Process(Databricks_ETL.java:1871) [classes/:?]
    at megh_dev.databricks_etl_0_1.Databricks_ETL.runJobInTOS(Databricks_ETL.java:3005) [classes/:?]
    at megh_dev.databricks_etl_0_1.Databricks_ETL.main(Databricks_ETL.java:2630) [classes/:?]

quentin-vigne
Partner - Creator II
Accepted Solution

Did you select "Use an existing connection" inside the tJDBCOutput component?

 

If you don't, the component opens its own connection and applies the "Commit every" option from the advanced parameters, which produces exactly this error.
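
For anyone hitting this later, here is a rough sketch of the failing path, pieced together from the stack trace above (simplified; not the actual Talend-generated code, and the URL is a placeholder):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class FailingPathSketch {
    public static void main(String[] args) throws Exception {
        // Without "Use an existing connection", tJDBCOutput opens its own
        // connection, which starts in auto-commit mode (the JDBC default).
        Connection own = DriverManager.getConnection(
                "jdbc:databricks://<workspace-host>:443/default;transportMode=http;ssl=1;"
                + "httpPath=<sql-warehouse-http-path>;AuthMech=3;UID=token;PWD=<access-token>");

        // ... the component performs its batched inserts here ...

        // When the writer closes, its "Commit every" logic calls commit()
        // (see JDBCOutputWriter.commitAndCloseAtLast in the stack trace).
        // With auto-commit still on, the Databricks driver throws
        // SQLException 10040 instead of committing.
        own.commit(); // throws: Cannot use commit while Connection is in auto-commit mode
        own.close();
    }
}
```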

Chirag_
Partner - Contributor III
Author

This resolved the error:

[screenshot: Chirag__0-1746505346721.png]

 

Thanks for the help, Quentin!