Hello Team,
We are using Databricks Lakehouse (Delta) as a target. While setting up the connection, we are observing the error below.
We tried to uninstall and re-install the Simba Spark driver, version simbaspark-2.6.26.1045-1.x86_64.rpm. We also updated the Simba file, but we are still facing the same error.
Update file /opt/simba/spark/Setup/odbcinst.ini
Add the following lines:
# Driver from the SimbaSparkODBC-2.6.29.1049-LinuxRPM-64bit package
# Setup from the unixODBC package
[Simba Spark ODBC Driver]
Description = Simba Apache Spark ODBC Connector (64-bit)
Driver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so
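As a quick sanity check, the driver names registered in an odbcinst.ini file can be listed directly. This is a sketch: with unixODBC installed, `odbcinst -q -d` prints every registered driver name; a portable fallback is to extract the `[Section]` headers with grep. A sample file stands in here for /opt/simba/spark/Setup/odbcinst.ini.

```shell
# Sketch: check which driver names are registered in an odbcinst.ini file.
# With unixODBC installed you can simply run:  odbcinst -q -d
# Portable fallback: extract the [Section] headers with grep.
# A sample file stands in for /opt/simba/spark/Setup/odbcinst.ini.
cat > /tmp/odbcinst_sample.ini <<'EOF'
[Simba Spark ODBC Driver]
Description = Simba Apache Spark ODBC Connector (64-bit)
Driver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so
EOF
grep '^\[' /tmp/odbcinst_sample.ini
```

The name printed between the brackets is what the Replicate endpoint must match exactly.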
Test connection Error:
Logfile error:
Hello @SaidSayyad ,
Thanks for reaching out to Qlik Community!
This looks to me like an ODBC configuration issue. Let's troubleshoot it step by step:
1. Please merge the lines below into the "/etc/odbcinst.ini" file rather than using "/opt/simba/spark/Setup/odbcinst.ini":
[Simba Spark ODBC Driver]
Description = Simba Apache Spark ODBC Connector (64-bit)
Driver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so
2. Add the path "/opt/simba/spark/lib/64/" to the LD_LIBRARY_PATH environment variable in the file "/opt/Attunity/replicate/bin/site_arep_login.sh" (default location). Make sure the Replicate account (named "attunity" by default) has sufficient privileges to access and execute the Simba files.
3. Restart the Qlik Replicate service after step 2 is done.
4. If the problem persists, it is better to create a 64-bit ODBC DSN in the file "/etc/odbc.ini" and then use the command "isql" to test connectivity; that is much easier than testing through Replicate. If "isql" works fine, then Replicate should work as well.
5. Link from Databricks: Troubleshooting JDBC and ODBC connections
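For step 4, a minimal DSN sketch is shown below, assuming token authentication over HTTPS. The host, HTTP path, and token values are placeholders you must replace with your workspace's values; the file is written to /tmp here, and for real use the entry should be merged into /etc/odbc.ini (as root).

```shell
# Sketch of a 64-bit DSN for testing with "isql". Host, HTTPPath, and
# the token are placeholders -- substitute your workspace's values.
# Written to /tmp here; merge into /etc/odbc.ini (as root) for real use.
cat > /tmp/odbc_dsn_sample.ini <<'EOF'
[DatabricksTest]
Driver          = Simba Spark ODBC Driver
Host            = your-workspace.cloud.databricks.com
Port            = 443
SSL             = 1
ThriftTransport = 2
AuthMech        = 3
UID             = token
PWD             = <personal-access-token>
HTTPPath        = <http-path-of-your-endpoint>
EOF
# With the DSN merged into /etc/odbc.ini, test connectivity with:
#   isql -v DatabricksTest
grep '^Driver' /tmp/odbc_dsn_sample.ini
```

Note the Driver value here must match the driver name registered in odbcinst.ini.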
Hope this helps.
Regards,
John.
Hi @SaidSayyad ,
Thanks for the information.
The default driver name is "Simba Spark ODBC Driver", but in your "odbcinst.ini" file the driver name is "Simba Spark ODBC Driver 64-bit". You can either rename the driver in the "odbcinst.ini" file to the default name, or use the predefined name in the Databricks target endpoint, as below:
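The rename option can be sketched with sed. This operates on a sample copy; the section names match the ones discussed above, and for real use the same substitution would be applied to your odbcinst.ini file.

```shell
# Sketch: rename the driver section in odbcinst.ini to the default name.
# Operates on a sample copy; apply to your real odbcinst.ini for effect.
cat > /tmp/odbcinst_rename.ini <<'EOF'
[Simba Spark ODBC Driver 64-bit]
Description = Simba Apache Spark ODBC Connector (64-bit)
Driver = /opt/simba/spark/lib/64/libsparkodbc_sb64.so
EOF
sed -i 's/^\[Simba Spark ODBC Driver 64-bit\]$/[Simba Spark ODBC Driver]/' /tmp/odbcinst_rename.ini
grep '^\[' /tmp/odbcinst_rename.ini
```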
If the problem persists, please try to use "isql" to test the connectivity issue.
By the way, it seems the Simba folder was not added to the LD_LIBRARY_PATH environment variable correctly: the line is commented out. Please use "env | grep LD_LIBRARY_PATH" to check whether it is set correctly.
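A sketch of what the uncommented line and the check look like, assuming the default Simba install location:

```shell
# Sketch: ensure the Simba library directory is on LD_LIBRARY_PATH.
# The export line belongs (uncommented) in site_arep_login.sh;
# the grep afterwards confirms the variable is actually set.
export LD_LIBRARY_PATH="/opt/simba/spark/lib/64:${LD_LIBRARY_PATH:-}"
env | grep LD_LIBRARY_PATH
```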
Regards,
John.
Hello team,
If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer.
Regards,
Sushil Kumar
Hello @SaidSayyad ,
Thank you so much for your feedback.
Do you mind sharing with us the final solution? It's very helpful for all of us.
Best Regards,
John.
Hello John,
As per your suggestion, we changed the driver name in the "odbcinst.ini" file and it is working now.
Thanks,
Said Sayyad
Hi @SaidSayyad ,
Glad to hear that and thanks for your update.
Regards,
John.