Error while using tSqoopImport component in Open Studio for Big Data
Hi there, I'm trying to use the tSqoopImport component in Talend Open Studio for Big Data to extract a SQL Server table into HDFS using Sqoop. However, when executing the job I get the following error messages. Can someone advise what I need to fix?

Note: I'm running this on an Ubuntu 14 client and my JAVA_HOME points to /usr/lib/jvm/java-8-oracle/jre.

------------------------------------------------------------------------------------------------------------------------------------------------
Starting job SqoopTest at 10:23 07/09/2016.
connecting to socket on port 3446
connected
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
Note: /tmp/sqoop-ubuntu/compile/59f62dd0a823faed91b88fda407b9f9b/cp_std_shift_amit.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
Error: cp_std_shift_amit : Unsupported major.minor version 52.0
(the line above is repeated 12 times)
: mapreduce.Counters - Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
Exception in component tSqoopImport_1
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
	at edw_poc.sqooptest_0_1.SqoopTest.tSqoopImport_1Process(SqoopTest.java:375)
	at edw_poc.sqooptest_0_1.SqoopTest.runJobInTOS(SqoopTest.java:649)
	at edw_poc.sqooptest_0_1.SqoopTest.main(SqoopTest.java:506)
: org.apache.sqoop.tool.ImportTool - Error during import: Import job failed!
disconnected
Job SqoopTest ended at 10:24 07/09/2016.
----------------------------------------------------------------------------------------------------------------------
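For context on the repeated failure above: "Unsupported major.minor version 52.0" means the generated class was compiled for Java 8 (class-file major version 52), but the JVMs running the map tasks are older. The arithmetic behind that mapping can be sketched as below (the helper name is mine, purely for illustration):

```shell
# Class-file major versions map linearly to Java releases:
# 51 -> Java 7, 52 -> Java 8, 53 -> Java 9, ...
# For modern releases the minimum Java version is (major - 44).
class_major_to_java() {
  echo $(( $1 - 44 ))
}

class_major_to_java 52   # prints 8: the class needs at least Java 8
class_major_to_java 51   # prints 7
```

So a version-52 class produced by a Java 8 client cannot run on a Java 7 cluster; either the client must compile for the older target or the cluster JVM must be upgraded.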
@amula, @amitjn1 :
Could you please help me out, as I am facing the same error.
I'm using TOS_BD 6.3.0 (which uses Java 1.8) on my local machine and have connected it to a sandbox (virtual machine): a single-node cluster (Quickstart for CDH 5.8). This Hadoop's Java version is 1.7.0_67, so I installed the same version locally and pointed the Talend preferences to it (Java 1.7.0_67), making both Talend and Hadoop run the same Java version, but it still doesn't work. Any idea where I am going wrong?
I have also tried setting SQOOP_HOME and SQOOP_CONF_DIR in the .bashrc file on the Hadoop machine.
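For reference, the .bashrc entries would look roughly like this (the install path below is a placeholder for a typical layout, not taken from my setup; adjust it to wherever Sqoop actually lives):

```shell
# Example ~/.bashrc entries for Sqoop; /usr/lib/sqoop is a placeholder path
export SQOOP_HOME=/usr/lib/sqoop
export SQOOP_CONF_DIR=$SQOOP_HOME/conf
export PATH=$PATH:$SQOOP_HOME/bin
```

After editing, run `source ~/.bashrc` (or open a new shell) so the variables take effect for the session that launches the job.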
Error:
[WARN ]: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
Note: \tmp\sqoop-Aj\compile\4f5bcdd65eb0ae3cc483d4e0f51f3a6d\table1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.manager.MySQLManager - It looks like you are importing from mysql.
: org.apache.sqoop.manager.MySQLManager - This transfer can be faster! Use the --direct
: org.apache.sqoop.manager.MySQLManager - option to exercise a MySQL-specific fast path.
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception