Anonymous
Not applicable

Problem with Sqoop import into Cloudera

I am not able to use Talend Community Edition (version 6.1.20xxx) to import a MySQL table into a Hive table on Cloudera (CDH 5.4). Attached are a screenshot of the components and a screenshot of the error.
I have three components: 1) Sqoop library load, 2) MySQL library load, 3) Sqoop import.
Please let me know what I did wrong.
Separately, do I need to install Sqoop and Hadoop on the box where Talend is installed?
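For context, my understanding is that tSqoopImport runs Sqoop's Java API in-process, roughly like the sketch below (all connection values are placeholders), so I would expect the JARs loaded by the tLibraryLoad components to be enough without a full local install, but please correct me if that is wrong:

    import org.apache.sqoop.Sqoop;

    public class SqoopImportSketch {
        public static void main(String[] args) {
            // tSqoopImport assembles an argument list like this and runs
            // Sqoop in-process via its Java API, so the Sqoop and Hadoop
            // client JARs must be on the job's classpath.
            String[] sqoopArgs = {
                "import",
                "--connect", "jdbc:mysql://mysql-host:3306/mydb", // placeholder
                "--username", "user",                             // placeholder
                "--password", "secret",                           // placeholder
                "--table", "mytable",                             // placeholder
                "--target-dir", "/user/hdfs/mytable"              // placeholder
            };
            System.exit(Sqoop.runTool(sqoopArgs));
        }
    }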
6 Replies
Anonymous
Not applicable
Author

Hi,

Could you upload the screenshots you wanted to show again, please? For some reason they didn't make it into your post.

Best regards
Sabrina

Anonymous
Not applicable
Author

Hi Sabrina, something is wrong with the upload applet in the forum. I tried both IE and Chrome, and I cannot see the two gray-bordered boxes depicted in your instructions, so I have uploaded the screenshots to a shared Dropbox.
link: https://www.dropbox.com/s/y74mbeczkeh42qu/error_msg.png?dl=0
link: https://www.dropbox.com/s/lf33jkfhi0in1wv/sqoop.png?dl=0
Only in Firefox can I see the two gray boxes.
Starting job hive at 10:48 23/11/2015.
connecting to socket on port 3346
connected
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
Note: /tmp/sqoop-root/compile/70981af7c84953fa16b5ce744f241a69/agent_type.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception
java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:101)
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600)
: org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: DataStreamer Exception:
: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: DataStreamer Exception:
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708)
Caused by: java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:101)
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600)
Exception in component tSqoopImport_1
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
    at poc.hive_0_1.hive.tSqoopImport_1Process(hive.java:632)
    at poc.hive_0_1.hive.tLibraryLoad_2Process(hive.java:485)
    at poc.hive_0_1.hive.tLibraryLoad_1Process(hive.java:387)
    at poc.hive_0_1.hive.runJobInTOS(hive.java:906)
    at poc.hive_0_1.hive.main(hive.java:763)
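For what it's worth, the java.nio.channels.UnresolvedAddressException inside the DataStreamer usually means the machine running the job cannot resolve the hostname the NameNode reports for a datanode, which is common when the cluster is a quickstart VM. A sketch of the client-side workaround, assuming a CDH quickstart setup; the host name is a placeholder, and in Talend the same key/value pair can likely be set in the component's Hadoop properties table instead:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class HdfsClientConfigSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://quickstart.cloudera:8020"); // placeholder
            // Have the HDFS client connect to datanodes by hostname instead
            // of the cluster-internal IP the NameNode returns; the hostname
            // must then resolve on this machine (e.g. via an /etc/hosts entry).
            conf.setBoolean("dfs.client.use.datanode.hostname", true);
            FileSystem fs = FileSystem.get(conf);
            System.out.println("Connected to " + fs.getUri());
        }
    }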
_AnonymousUser
Specialist III

I am struggling with the same issue. Please share the solution if you have one.
Anonymous
Not applicable
Author

Please share the solution for the "org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: DataStreamer Exception" issue.
Anonymous
Not applicable
Author

Post your screenshot. If you are having the same issue as "david", then, judging by the screenshot, the JDBC driver JAR and class name are missing.
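For reference, a minimal sketch of those two missing values, assuming the MySQL Connector/J 5.x JAR is the one loaded by tLibraryLoad; host, port, database, and credentials are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class MySqlDriverCheck {
        public static void main(String[] args) throws Exception {
            // Driver class name to enter in tSqoopImport
            // (Connector/J 5.x; newer 8.x drivers use com.mysql.cj.jdbc.Driver).
            Class.forName("com.mysql.jdbc.Driver");
            // Connection URL of the same shape Sqoop's --connect option expects.
            Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://mysql-host:3306/mydb", "user", "secret"); // placeholders
            System.out.println("Driver OK, connected: " + !conn.isClosed());
            conn.close();
        }
    }

Running this standalone is a quick way to confirm the JAR and class name before wiring them into the job.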
_AnonymousUser
Specialist III

Hi Amula,
I am facing the error below while importing data from MySQL (version 5) into the CDH 5.8 sandbox. Please advise which JARs and class name I should use for my job. I have Java 1.8.0 on both the Hadoop and Talend machines.
Job: tLibraryLoad --> tSqoopImport
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:381)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:396)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:389)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:130)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:94)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:74)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:303)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2859)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2851)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2714)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:382)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181)
    at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:483)
    at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370)
    at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801)
    at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658)
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
Note: \tmp\sqoop-Aj\compile\f0f413757a34cc91cf81734f4d7d39b8\table1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.sqoop.mapreduce.db.TextSplitter - Generating splits for a textual index column.
: org.apache.sqoop.mapreduce.db.TextSplitter - If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
: org.apache.sqoop.mapreduce.db.TextSplitter - You are strongly encouraged to choose an integral split column.
: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Job status not available
    at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:334)
    at org.apache.hadoop.mapreduce.Job.isComplete(Job.java:621)
    at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1366)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1328)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
    at org.apache.sqoop.manager.DirectMySQLManager.importTable(DirectMySQLManager.java:92)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:523)
    at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370)
    at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801)
    at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658)
Exception in component tSqoopImport_1
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
    at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:527)
    at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370)
    at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801)
    at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658)
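In case it helps anyone landing here: the "Could not locate executable null\bin\winutils.exe" part of this log is a separate, Windows-only problem. The Hadoop client code looks for %HADOOP_HOME%\bin\winutils.exe on the machine running the job. A sketch of the usual workaround, assuming a winutils.exe matching your Hadoop version has been placed in C:\hadoop\bin (a placeholder path); it must run before any Hadoop class is touched:

    public class WinutilsSetup {
        public static void main(String[] args) {
            // Point Hadoop at a directory whose bin\ folder contains
            // winutils.exe; setting the HADOOP_HOME environment variable
            // before launching Talend has the same effect.
            System.setProperty("hadoop.home.dir", "C:\\hadoop"); // placeholder path
        }
    }

The later "Job status not available" IOException is different again: it usually means the job was submitted but the client could not reach the ResourceManager or JobHistory server to poll its status, which points at missing or wrong yarn-site/mapred-site settings on the Talend side rather than at the JDBC driver.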