Hello, I have a question: can I extract data from a table in Hadoop and insert it into an Oracle table?
Thanks!
Hello,
You can use tSqoopExport to export data from HDFS to a relational database management system (RDBMS).
Here is the online component reference:
https://help.talend.com/r/en-US/8.0/sqoop/tsqoopexport
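Under the hood, tSqoopExport builds a Sqoop export command. As a rough sketch of what that looks like (the hostname, credentials, table and directory names below are placeholders, not values from your environment):

```shell
# Illustrative only: all connection details and paths are placeholders.
sqoop export \
  --connect "jdbc:oracle:thin:@//oracle-host:1521/SERVICE_NAME" \
  --username scott \
  --password-file /user/me/oracle.password \
  --table TARGET_TABLE \
  --export-dir /user/hive/warehouse/source_table \
  --input-fields-terminated-by '\001'
```

The component's Basic settings map onto these arguments, so understanding the raw command can help when reading error logs.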
Best regards
Sabrina
Thanks Sabrina. In the Export control arguments section I have to indicate the destination table name and the Export Dir. If the target table is in Oracle, do I have to add a connection component to the job?
This is the first time I have worked with Hadoop.
Thanks a lot.
Hello,
You don't need to use the tOracleConnection component in your workflow.
The tSqoopExport component's Basic settings include a JDBC property and a Connection section.
In the Connection field, enter the URI of the Oracle database where the target table is stored.
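For an Oracle target, the Connection URI typically follows the Oracle thin-driver format (host, port and service name below are placeholders):

```
jdbc:oracle:thin:@//db-host:1521/SERVICE_NAME
```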
Don't hesitate to post your issue here.
Best regards
Sabrina
Hi! Thanks,
I have already configured the connection to Oracle, but at execution time I get an error and I don't know why it is occurring. This is the info:
"Starting job hadoop_latam_load_core at 13:43 21/12/2022.
[statistics] connecting to socket on port 3382
[statistics] connected
log4j:WARN No appenders could be found for logger (org.apache.htrace.core.Tracer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in component tSqoopExport_1 (hadoop_latam_load_core)
java.lang.Exception: The Sqoop export job has failed. Please check the logs.
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.tSqoopExport_1Process(hadoop_latam_load_core.java:606)
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.runJobInTOS(hadoop_latam_load_core.java:1460)
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.main(hadoop_latam_load_core.java:1208)
[FATAL] 13:43:25 or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core- tSqoopExport_1 The Sqoop export job has failed. Please check the logs.
java.lang.Exception: The Sqoop export job has failed. Please check the logs.
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.tSqoopExport_1Process(hadoop_latam_load_core.java:606) [classes/:?]
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.runJobInTOS(hadoop_latam_load_core.java:1460) [classes/:?]
at or_gtm_cgp.hadoop_latam_load_core_0_1.hadoop_latam_load_core.main(hadoop_latam_load_core.java:1208) [classes/:?]
[statistics] disconnected
job hadoop_latam_load_core ended at 13:43 21/12/2022. [Exit code = 1]"
In the log I didn't see anything.
Hello,
Could you please confirm your Hadoop cluster distribution and version, and run the same job with debug log level so that we can investigate your issue?
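Note that the message "The Sqoop export job has failed. Please check the logs." only reports that the underlying MapReduce job failed; the real cause is usually in the task logs on the cluster. Assuming YARN, they can be retrieved with (the application ID below is a placeholder):

```shell
# Replace with the application ID printed in the job output
# or shown in the ResourceManager web UI.
yarn logs -applicationId application_1671624000000_0001
```

Common causes surfaced there include field-delimiter mismatches, type conversion errors, and an Export Dir that does not exist or is empty.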
Best regards
Sabrina
Hello!
I have checked what you told me and it is correct. Now I suspect that what I am not setting correctly is the Export Dir, i.e. that the path of the Hive table is wrong. I have used the path shown in the table info in the HUE editor.
Best regards!
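For reference, the HDFS location that Export Dir should point to can be confirmed from Hive itself (the database and table names below are placeholders); for a managed table it is usually under the warehouse directory:

```shell
# Show the table's Location property as Hive records it.
hive -e "DESCRIBE FORMATTED my_db.my_table;" | grep -i location

# Then verify the directory actually contains data files.
hdfs dfs -ls /user/hive/warehouse/my_db.db/my_table
```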