Anonymous
Not applicable

Error in working with tSqoopImport component

Hi All,

Can anyone help me figure out the root cause of the following exception?
: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
: org.apache.sqoop.manager.MySQLManager - Preparing to use a MySQL streaming resultset.
: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
: org.apache.sqoop.orm.CompilationManager - $HADOOP_HOME is not set
Note: \tmp\sqoop-vengat.maran\compile\249e3ff4119cce039d52b482ec31cf50\employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.orm.CompilationManager - Writing jar file: \tmp\sqoop-vengat.maran\compile\249e3ff4119cce039d52b482ec31cf50\employee.jar
: org.apache.sqoop.manager.MySQLManager - It looks like you are importing from mysql.
: org.apache.sqoop.manager.MySQLManager - This transfer can be faster! Use the --direct
: org.apache.sqoop.manager.MySQLManager - option to exercise a MySQL-specific fast path.
: org.apache.sqoop.manager.MySQLManager - Setting zero DATETIME behavior to convertToNull (mysql)
: org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of employee
Exception in component tSqoopImport_1
java.lang.Exception: The Sqoop import job has failed
at sample.tableloadcheck_0_1.tableLoadCheck.tFileInputDelimited_1Process(tableLoadCheck.java:1228)
at sample.tableloadcheck_0_1.tableLoadCheck.tLibraryLoad_2Process(tableLoadCheck.java:746)
at sample.tableloadcheck_0_1.tableLoadCheck.tLibraryLoad_1Process(tableLoadCheck.java:635)
at sample.tableloadcheck_0_1.tableLoadCheck.tHDFSDelete_1Process(tableLoadCheck.java:524)
at sample.tableloadcheck_0_1.tableLoadCheck.runJobInTOS(tableLoadCheck.java:2176)
at sample.tableloadcheck_0_1.tableLoadCheck.main(tableLoadCheck.java:2035)
: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Could not load jar \tmp\sqoop-vengat.maran\compile\249e3ff4119cce039d52b482ec31cf50\employee.jar into JVM. (Could not find class employee.)
at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
at org.apache.sqoop.mapreduce.JobBase.loadJars(JobBase.java:230)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:192)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:465)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:108)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at sample.tableloadcheck_0_1.tableLoadCheck.tFileInputDelimited_1Process(tableLoadCheck.java:1226)
at sample.tableloadcheck_0_1.tableLoadCheck.tLibraryLoad_2Process(tableLoadCheck.java:746)
at sample.tableloadcheck_0_1.tableLoadCheck.tLibraryLoad_1Process(tableLoadCheck.java:635)
at sample.tableloadcheck_0_1.tableLoadCheck.tHDFSDelete_1Process(tableLoadCheck.java:524)
at sample.tableloadcheck_0_1.tableLoadCheck.runJobInTOS(tableLoadCheck.java:2176)
at sample.tableloadcheck_0_1.tableLoadCheck.main(tableLoadCheck.java:2035)
Caused by: java.lang.ClassNotFoundException: employee
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:789)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:88)
... 14 more
Thanks in Advance.
Regards,
Vengat Maran.
4 Replies
Anonymous
Not applicable
Author

Hi vengat,
This looks like an issue with your Hadoop and Sqoop configuration. The Sqoop component uses MapReduce, so you need to set the environment variables HADOOP_HOME, HADOOP_CONF_DIR, SQOOP_HOME, and SQOOP_CONF_DIR (in your .bashrc file on Ubuntu).
Please try it and let us know whether that resolves the issue.
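For reference, a minimal sketch of those exports for ~/.bashrc. The install paths below are placeholders, not the poster's actual layout; adjust them to wherever Hadoop and Sqoop live on your machine:

```shell
# Placeholder paths -- point these at your actual Hadoop and Sqoop installs.
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SQOOP_HOME=/usr/lib/sqoop
export SQOOP_CONF_DIR=$SQOOP_HOME/conf
# Make the binaries resolvable from the shell that launches the job.
export PATH=$PATH:$HADOOP_HOME/bin:$SQOOP_HOME/bin
```

After editing, run `source ~/.bashrc` (or open a new shell) so the job actually inherits the variables.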
Anonymous
Not applicable
Author

 
I'm getting this error while exporting data from MySQL to an HDFS environment using tSqoopExport.
Can anyone help me solve this issue?
Anonymous
Not applicable
Author

I'm getting this error while exporting data from MySQL to an HDFS environment using tSqoopExport.
Can anyone help me solve this issue?

: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
Note: \tmp\sqoop-David\compile\b4f030dff85e4c59d42ba6efe04c061a\store.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.mapreduce.ExportJobBase - IOException checking input file header: java.io.EOFException
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.hadoop.mapreduce.JobSubmitter - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
The Sqoop export job has failed. Please check the logs.
: org.apache.sqoop.tool.ExportTool - Encountered IOException running export job: java.io.IOException: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request, requested memory < 0, or requested memory > max configured, requestedMemory=1536, maxMemory=1403
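The last line points to a YARN memory mismatch: the job requests 1536 MB per container, but the cluster caps containers at 1403 MB (the `yarn.scheduler.maximum-allocation-mb` setting). One way around it is to lower the job's request below the cap, either as Hadoop properties on the tSqoopExport component or as generic `-D` options on the command line. A sketch only, with placeholder connection string, table, and directory:

```shell
# Sketch: request containers below the cluster's 1403 MB cap.
# The generic -D options must come before the tool-specific arguments.
sqoop export \
  -Dmapreduce.map.memory.mb=1024 \
  -Dyarn.app.mapreduce.am.resource.mb=1024 \
  --connect jdbc:mysql://dbhost/testdb \
  --username dbuser -P \
  --table store \
  --export-dir /user/david/store
```

Alternatively, a cluster administrator can raise `yarn.scheduler.maximum-allocation-mb` in yarn-site.xml, but lowering the job's request is usually the quicker fix.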
shashidas
Contributor

Hi,

 

I am also facing the same error: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.

 

I have also verified that my local machine and the cluster edge node both have the same Java version.

I defined the four attributes below in ~/.bash_profile as well, but I still get the same error. The only thing I have not followed from the blogs is the /tmp/sqoop-XXX/compile check.

 

export HADOOP_HOME=/usr/hdp/2.6.4.0-91/hadoop
export HADOOP_CONF_DIR=/usr/hdp/2.6.4.0-91/hadoop/conf
export SQOOP_HOME=/usr/hdp/2.6.4.0-91/sqoop
export SQOOP_CONF_DIR=/usr/hdp/2.6.4.0-91/sqoop/conf

 

Can someone please guide me here?
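One thing worth checking (a quick sanity test, not a guaranteed fix): ~/.bash_profile is only read by login shells, so a job launched from a desktop session or a non-login shell may never see these exports. Running something like the following in the same shell that launches the job confirms whether the variable actually made it into the environment. The path mirrors the exports above; substitute your own:

```shell
# Mirror of the export from ~/.bash_profile above.
export SQOOP_CONF_DIR=/usr/hdp/2.6.4.0-91/sqoop/conf
# Print the value the job would inherit; empty output means the export
# never reached this shell.
env | grep '^SQOOP_CONF_DIR='
```

If the variable prints here but the job still complains, the process running the job (e.g. Talend Studio) is likely being started without a login shell and needs the variables set in its own launch environment.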