lguevara
Partner - Contributor III

File to Microsoft Azure Databricks Delta: error when copying data

Hi,

I configured my source as a file and my target as Azure Databricks Delta; the test connection works fine.

When the task runs, it creates the table in Databricks but no data is inserted.

Error

00011360: 2023-02-15T17:26:14 [TARGET_LOAD ]E: Failed (retcode -1) to execute statement: 'COPY INTO `default`.`regions` FROM(SELECT cast(_c0 as TINYINT) as `ID`, _c1 as `NAME` from 'abfss://myfilesystem1@datalake123134.dfs.core.windows.net') FILEFORMAT = CSV FILES = ('/staging1/File to Databricks/1/LOAD00000001.csv.gz') FORMAT_OPTIONS('nullValue' = 'attrep_null', 'multiLine'='true') COPY_OPTIONS('force' = 'true')' [1022502] (ar_odbc_stmt.c:4985)
00011360: 2023-02-15T17:26:14 [TARGET_LOAD ]E: RetCode: SQL_ERROR SqlState: 42000 NativeError: 80 Message: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: org.apache.hive.service.cli.HiveSQLException: Error running query: Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key
at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:53)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:435)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:257)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:123)
at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties [1022502] (ar_odbc_stmt.c:4992)

 


Thanks


Accepted Solutions
SwathiPulagam
Support

Hi @lguevara ,

 

Make sure you have added the key/value pair below under the Databricks cluster's Advanced Options --> Spark config:
fs.azure.account.key.datalake123134.dfs.core.windows.net xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==

The value has to be collected from your storage account's access keys.
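If it helps, the same setting can also be applied per session from a notebook. Below is a minimal Python sketch, assuming a Databricks session; the secret scope and key names are hypothetical placeholders, and the storage account name is taken from the error message in this thread:

```python
# Build the ABFS account-key config name; the storage account name
# ("datalake123134") comes from the error message in this thread.
storage_account = "datalake123134"
conf_key = f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"
print(conf_key)

# In a Databricks notebook you could then set it for the current session,
# pulling the key from a secret scope instead of pasting it in plain text
# ("my-scope" and "storage-key" are hypothetical names):
# spark.conf.set(conf_key, dbutils.secrets.get(scope="my-scope", key="storage-key"))
```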

Let us know how it goes.

Thanks,

Swathi


2 Replies
kmadhavi2
Employee

Hi,

Please cross-check the value set for fs.azure.account.key.
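One way to do that cross-check from a notebook (a sketch, assuming the `spark` session object is available) is to read the setting back under the exact name the failing `abfss://` URI implies:

```python
# The config name must match the storage account in the failing URI exactly.
expected = "fs.azure.account.key.datalake123134.dfs.core.windows.net"
print(expected)

# In a notebook, reading it back will raise if it was never set, which is
# one way the "Invalid configuration value detected for fs.azure.account.key"
# error above can arise:
# spark.conf.get(expected)
```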

Thanks,

Madhavi
