My application sometimes needs to write back to tables on Databricks. I previously got code working that does this, using LIB CONNECT and SQL INSERT OVERWRITE statements. Specifically, I batch-write via a txt file:
[outputtable]:
REPLACE ONLY LOAD
// one row holding the entire VALUES list; the field name
// "set data =" becomes the header line of the stored txt file
concat('("' & column1 & '","' & column2 & '","' & column3 & '","' & column4 & '")', ',') as [set data =]
resident [table_where_the_data_is];
if IsPartialReload() then
store [set data =] from [outputtable] into "$(vQVDPath)command.txt" (txt);
//include the txt file; this executes the SET statement it contains, putting the string into the variable "data"
$(include=$(vQVDPath)command.txt);
LIB CONNECT TO 'Databricks_path (username)';
sql insert overwrite `database_name`.`table_name` (column1, column2, column3, column4) values $(data) !EXECUTE_NON_SELECT_QUERY;
end if
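For context on the store/include trick: after the STORE, command.txt holds the field name on its first line and the concatenated row on the second, so its contents look roughly like this (hypothetical values, just to illustrate):

set data =
("a1","b1","c1","d1"),("a2","b2","c2","d2")

The $(include=...) splices that text into the script, the semicolon after the include terminates the SET statement, and $(data) then expands inside the VALUES clause.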
So I know the code works. But as written, it does not write to the database during a partial reload. No error; the data just never arrives. To eliminate unknowns, I took just the last three lines and put them in a new Module (the command.txt file already exists from running the partial reload):
$(include=$(vQVDPath)command.txt);
LIB CONNECT TO 'Databricks_path (username)';
sql insert overwrite `database_name`.`table_name` (column1, column2, column3, column4) values $(data) !EXECUTE_NON_SELECT_QUERY;
EXIT SCRIPT;
And ran it as a full reload. The table gets written on Databricks just fine. So now I suspect that some part of those three lines does not run during a partial reload: either the LIB CONNECT or the SQL INSERT. Is this true, and is there a known way around the problem?
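To narrow down which statement is being skipped, my next test is to put TRACE markers around the three lines and run a partial reload (as far as I can tell, TRACE should still write to the progress log in partial mode):

$(include=$(vQVDPath)command.txt);
if IsPartialReload() then
trace >>> partial reload branch reached;
LIB CONNECT TO 'Databricks_path (username)';
trace >>> connected, sending INSERT;
sql insert overwrite `database_name`.`table_name` (column1, column2, column3, column4) values $(data) !EXECUTE_NON_SELECT_QUERY;
trace >>> INSERT statement returned;
end if

Whichever marker is the last one to appear in the log should show where execution stops, but I would still like to know if this is expected behavior.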
-Dave