SaidSayyad
Contributor II

Databricks Lakehouse (Delta) Target connection not working in DRP server.

Hello Team,

We are using Databricks Lakehouse (Delta) as a target. While setting up the connection, we are observing the error below.

We tried to uninstall and re-install the Simba Spark driver version simbaspark-2.6.26.1045-1.x86_64.rpm. We also updated the Simba file, but we are still facing the same error.

 

Update file /opt/simba/spark/Setup/odbcinst.ini

Add lines:

# Driver from the SimbaSparkODBC-2.6.29.1049-LinuxRPM-64bit package
# Setup from the unixODBC package
[Simba Spark ODBC Driver]
Description     = Simba Apache Spark ODBC Connector (64-bit)
Driver          = /opt/simba/spark/lib/64/libsparkodbc_sb64.so

 

Test connection Error: 

  • Failed prepare Cloud component
  • Cannot connect to Cloud server RetCode: SQL_ERROR SqlState: 01000 NativeError: 0 Message: [unixODBC][Driver Manager]Can't open lib 'Simba Spark ODBC Driver' : file not found

Logfile error:

01958621: 2023-09-29T10:32:27:706457 [STREAM_COMPONEN ]I: Going to connect to server westeurope.azuredatabricks.net database (null) (cloud_imp.c:3883)
01958621: 2023-09-29T10:32:27:706497 [STREAM_COMPONEN ]I: Target endpoint 'Databricks Lakehouse (Delta)' is using provider syntax 'DatabricksDelta' (provider_syntax_manager.c:894)
01958621: 2023-09-29T10:32:27:719774 [STREAM_COMPONEN ]E: RetCode: SQL_ERROR SqlState: 01000 NativeError: 0 Message: [unixODBC][Driver Manager]Can't open lib 'Simba Spark ODBC Driver' : file not found [1022502] (ar_odbc_conn.c:584)
01958621: 2023-09-29T10:32:27:719788 [STREAM_COMPONEN ]E: Cannot connect to Cloud server [1022501] (cloud_imp.c:3918)
01958621: 2023-09-29T10:32:27:719790 [STREAM_COMPONEN ]E: Failed prepare Cloud component [1022501] (cloud_metadata.c:239)
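
For reference, the "file not found" message from the unixODBC Driver Manager usually means that the driver name used for the connection does not exactly match a section name in the odbcinst.ini the Driver Manager is reading, or that the .so it points to cannot be loaded. A rough way to verify the driver registration outside of Replicate (assuming the unixODBC tools are installed) is:

odbcinst -q -d                                        # list the driver names registered in odbcinst.ini
ls -l /opt/simba/spark/lib/64/libsparkodbc_sb64.so    # confirm the library file actually exists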
 
Thanks,
Said Sayyad
22 Replies
nareshkumar
Contributor III

Hi John,

We tried the steps provided but are still facing the same error:

  • Failed prepare Cloud component
  • Cannot connect to Cloud server RetCode: SQL_ERROR SqlState: 01000 NativeError: 0 Message: [unixODBC][Driver Manager]Can't open lib 'Simba Spark ODBC Driver 64-bit' : file not found

We tried with both of the following driver entries:

[Simba Spark ODBC Drivert]
Description=Amazon Hive ODBC Driver (64-bit)
Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so

[Simba Spark ODBC Driver 64-bit]
Description=Amazon Hive ODBC Driver (64-bit)
Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so

We also added the internal parameter "driver" ...

We also added the paths to site_arep_login.sh:


#Delta Databricks
export ODBCINI=/etc/odbc.ini
export ODBCSYSINI=/etc/odbc
export SPARKINI=/etc/simba.sparkodbc.ini
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/simba/spark/lib/64/libsparkodbc_sb64.so

 

We copied the simba.sparkodbc.ini file to the /etc location.

 

We restarted the services but are still getting the same error. Please help.
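
In case it helps to diagnose this, unixODBC can report which configuration files the Driver Manager is actually reading with these environment variables set. A quick sanity check (assuming the unixODBC tools are installed, and with the path to site_arep_login.sh adjusted to the installation) could be:

source /opt/attunity/replicate/bin/site_arep_login.sh    # path assumed; adjust to your install
odbcinst -j                                              # prints the odbcinst.ini / odbc.ini locations in use

Note that ODBCSYSINI points at a directory, so with ODBCSYSINI=/etc/odbc the driver entry would be expected in /etc/odbc/odbcinst.ini rather than /etc/odbcinst.ini.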

john_wang
Support

Hello @nareshkumar ,

I noticed this line:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/simba/spark/lib/64/libsparkodbc_sb64.so

This is incorrect, as LD_LIBRARY_PATH needs the folder that contains the library rather than the library file itself. Please change it to:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/simba/spark/lib/64/
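
For example, after sourcing site_arep_login.sh again, you can confirm that the directory is on the path and that the driver's own dependencies resolve (a rough check, using the paths from your configuration above):

echo $LD_LIBRARY_PATH                                 # /opt/simba/spark/lib/64/ should appear here
ldd /opt/simba/spark/lib/64/libsparkodbc_sb64.so      # every dependency should resolve; nothing should say "not found"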

Hope it works for you.

John.

nareshkumar
Contributor III

Thank you so much for the quick response. We tried updating the export path to

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/simba/spark/lib/64/

We sourced the site_arep_login.sh file and restarted the service, but we are still getting the same error.

john_wang
Support

Hello @nareshkumar ,

Let's make things simple. Please use the "isql" command to troubleshoot the ODBC DSN connectivity issue rather than doing so in Qlik Replicate.

Let's try the steps below:

1. Switch to the "attunity" user (or whichever account you chose).

2. Make sure "/etc/odbcinst.ini" is there and configured correctly.

3. Make sure "/etc/odbc.ini" is there and configured correctly.

4. Run "source ./arep_login.sh".

5. Run the "isql" command to access the DSN (see the example below). If unixODBC is not installed or not configured correctly, please follow the steps to install it (as root or another privileged account).

In general this error originates outside of Qlik Replicate; if isql works, then Replicate should work too.
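
For example, something like the following (the DSN name is a placeholder; with the Simba Spark driver and a Databricks personal access token, the user name is usually the literal string "token" and the password is the token itself, unless they are already stored in the DSN entry):

isql -v DSN_DELTA token dapiXXXXXXXXXXXX    # -v prints full ODBC diagnostics on failure
isql -v DSN_DELTA                           # if UID/PWD are already defined in odbc.ini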

Regards,

John.

SushilKumar
Support

Hello team,

 

If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer.

 

Regards,

Sushil Kumar

nareshkumar
Contributor III

Apologies for the delayed response. The suggested solution is not working for us. In a previous reply, John stated that the issue is outside of Replicate, so we did not respond.

nareshkumar
Contributor III

Could you please confirm whether Databricks Lakehouse (Delta) on AWS is supported by Qlik, or is only Azure supported?

Dana_Baldwin
Support

Hi @nareshkumar 

Unfortunately, our documentation does not specify which platform is supported for Databricks Lakehouse Delta. This suggests it is supported for either, but I cannot say with certainty.

Supported target endpoints | Qlik Replicate Help

I'll ask and follow up. We don't have a way to elevate questions/issues to our internal support team via the Community, so we may need a support case for this question.

Thanks,

Dana

Dana_Baldwin
Support

Hi @nareshkumar 

Yes, we support Databricks Lakehouse Delta on both AWS and Azure.

Thanks,

Dana

nareshkumar
Contributor III

In order to check isql with the DSN, what parameters need to be part of the DSN file? We are using a token in the Replicate endpoint.

We are on the Linux platform. Please help us by sharing sample parameters to set up a DSN in the odbc.ini file.

For AS/400 we are passing the following:
[CAMS]
Description = DSN for CAMS.TMS.TOYOTA.COM
Driver = IBM i Access ODBC Driver
System = camsdb.localhost
SSL=1

For DB2 as follows:

[DSN_TRG]
Description = DSN for DB2 cp1fa.vipa.tmm.toyota.com
Driver = IBM DB2 ODBC DRIVER

 

Please help us with the Databricks Lakehouse (Delta) DSN details. So far we have:

[DSN_DELTA]
SSLTrustedCertsPath=/opt/simba/spark/lib/64/databriks.pem
SSL=1
UseSystemTrustStore=1
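
For reference, a minimal Databricks DSN for the Simba Spark ODBC driver generally looks something like the sketch below. The host, HTTP path, and token are placeholders, the parameter names are the ones documented for the Simba Spark driver but should be verified against your driver version, and the SSL trust-store settings you already have can be kept as they are:

[DSN_DELTA]
Description=Databricks Lakehouse (Delta) via Simba Spark ODBC
Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so
# Workspace host, e.g. xxxx.cloud.databricks.com (AWS) or adb-xxxx.azuredatabricks.net (Azure)
Host=<workspace-host>
Port=443
# HTTP path of the cluster or SQL warehouse, from the Databricks connection details page
HTTPPath=<http-path>
SSL=1
ThriftTransport=2
SparkServerType=3
# AuthMech=3 means user name / password; for a personal access token use UID=token and the token as PWD
AuthMech=3
UID=token
PWD=<personal-access-token>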