Hello,
We are using HDFS as the target. The endpoint connection tested successfully, but while uploading a file we get the following error:
"Handling new table 'db.tablename' failed
Failed to create table 'tablename'
RetCode: SQL_ERROR SqlState: HY000 NativeError: 35 Message: [Cloudera][Hardy] (35)
Error from server: error code: '40000' error message: 'Error while compiling statement:
FAILED: Execution Error, return code 40000 from
org.apache.hadoop.hive.ql.ddl.DDLTask. MetaException(message:Illegal location for managed table, it has to be within database's managed location)'
Failed (retcode -1) to execute statement: 'CREATE TABLE "
I would also like to know a few more points regarding the same issue:
1. Is it mandatory to use a managed table location when using HDFS as the target? If yes, why?
2. What if we want to replicate data to an unmanaged location, and how do we configure it?
Regards,
Pranita
Hello @Pranita123 ,
Thanks for reaching out to Qlik Community!
If I understood correctly, you are trying to create an external table with the Parquet file format, is that correct?
How about trying to create the target table manually before running the Replicate task, and configuring the task not to drop existing tables, e.g. TRUNCATE before loading?
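The manual pre-creation suggested above could look like the following sketch, assuming Hive on a Cloudera cluster; the database name `db`, table name `tablename`, column list, and HDFS path here are all hypothetical placeholders you would replace with your actual source DDL:

```sql
-- Hypothetical example: an EXTERNAL table is not bound to the
-- database's managed warehouse location, so an arbitrary HDFS
-- path can be used, avoiding the "Illegal location for managed
-- table" MetaException seen above.
CREATE EXTERNAL TABLE db.tablename (
  id   INT,
  name STRING
)
STORED AS PARQUET
LOCATION '/user/replicate/db/tablename';

-- Confirm the table type (EXTERNAL_TABLE) and its location:
DESCRIBE FORMATTED db.tablename;
```

For background: in Hive 3 (CDP), managed tables must reside within the database's managed location, while EXTERNAL tables may point to any HDFS path, which is why pre-creating the table as EXTERNAL can work around the error.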
When it comes to the questions:
>> 1. Is it mandatory to use a managed table location when using HDFS as the target? If yes, why?
No.
>> 2. What if we want to replicate data to an unmanaged location, and how do we configure it?
Please try the workaround above. We'd also like to understand more about your task and target endpoint settings; you may need to open a support ticket later.
Hope this helps.
John.
Hi @john_wang ,
Currently we have configured the target file format as Parquet, but when we pass an unmanaged target folder location we are not able to load data and get the error mentioned above.
However, when we configure the target file format as text and pass an unmanaged target folder location, we are able to load data successfully.
So can we conclude that with Parquet we can only handle managed tables, while with text we can handle both managed and unmanaged locations?
For reference, please find the attached image.
Hello @Pranita123 ,
Thanks for your patience; I was on public holiday these past weeks. Please open a support ticket with the source table creation DDL; I'd like to work together with the support team on this issue.
Best Regards,
John.
Hello Team,
Thanks for reaching out via the Qlik Community Support page. Such issues require thorough analysis of the task settings, so we request that you contact Technical Support via a case and provide a Diagnostics package.
Please refer to the Replicate article: https://community.qlik.com/t5/Knowledge/How-to-collect-Diagnostics-Package-from-Qlik-Replicate/ta-p/...
Regards,
Sushil Kumar