Hello,
I am getting the error "package org.apache.hadoop.io does not exist" when trying to import with tSqoopImport from the Talend Open Studio for Big Data Windows client to the Hive database on the CDH 5.13 QuickStart VM. Could you please help me resolve this?
I tried setting the environment variables in CDH, but the issue persists. $JAVA_HOME is also set to the JDK 1.7 path.
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME
export CLASSPATH=/usr/lib/hadoop/client-0.20:/usr/lib/hive/lib
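As a sanity check, here is a minimal sketch (the class name ClasspathCheck is just for illustration, not a Talend component) that tries to resolve the org.apache.hadoop.io classes the generated Sqoop record imports and prints which JAR each one comes from:

// ClasspathCheck.java - illustrative only: verifies that the Hadoop classes
// the Sqoop-generated record class needs are resolvable, and shows their source JAR.
public class ClasspathCheck {
    public static void main(String[] args) {
        // Print the effective classpath this JVM was started with.
        System.out.println("java.class.path = " + System.getProperty("java.class.path"));
        String[] needed = {
            "org.apache.hadoop.io.BytesWritable",
            "org.apache.hadoop.io.Text",
            "org.apache.hadoop.io.Writable"
        };
        for (String name : needed) {
            try {
                Class<?> c = Class.forName(name);
                // getCodeSource() shows which JAR the class was loaded from.
                System.out.println(name + " -> " + c.getProtectionDomain().getCodeSource().getLocation());
            } catch (ClassNotFoundException e) {
                System.out.println(name + " NOT FOUND on this classpath");
            }
        }
    }
}

Compiling and running it with the same classpath the job sees (for example, on the cluster side: javac ClasspathCheck.java && java -cp "$(hadoop classpath):." ClasspathCheck) should show whether the hadoop-common JAR is actually reachable; any "NOT FOUND" line points to the missing entry.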
Error log below:
Starting job Sqoop_DB2_to_Hive at 15:05 09/10/2018.
[statistics] connecting to socket on port 3768
[statistics] connected
[WARN ]: org.ap
Hi Sabrina,
I have exactly the same problem here. Could you please provide a step-by-step way to check the classpath that Talend and Hadoop use? My Hadoop server is on a separate instance, not local. I'm using Talend Big Data and have installed JDK 1.8.
[WARN ]: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[WARN ]: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:7: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.BytesWritable;
^
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:8: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Text;
^
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:9: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:37: error: cannot access Writable
public class STG1_KN_F_PRODUCT extends SqoopRecord implements DBWritable, Writable {
^
class file for org.apache.hadoop.io.Writable not found
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:645: error: cannot find symbol
public void parse(Text __record) throws RecordParser.ParseError {
^
symbol: class Text
location: class STG1_KN_F_PRODUCT
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:124: error: cannot find symbol
this.LOB = LOB;
^
symbol: variable this
location: class STG1_KN_F_PRODUCT
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:127: error: cannot find symbol
this.LOB = LOB;
^
symbol: variable this
location: class STG1_KN_F_PRODUCT
\tmp\sqoop-monica.baniwijaya\compile\f39416108dab4711b50081d45ac7281d\STG1_KN_F_PRODUCT.java:128: error: cannot find symbol
return this;
Thank you
Hi fransiskakd,
Were you able to solve this problem?
Thanks.
Hello,
We didn't find any related JIRA issue on the Talend bug tracker.
We have run into this issue before when using Sqoop to import data from a local database to HDFS:
package org.apache.hadoop.io does not exist
It seems Talend Studio is installed under the C:\Program Files (x86) path, so the path contains a space and the Job can't find the correct JAR files.
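If you want to confirm that, here is a small sketch (the class name PathSpaceCheck is made up for illustration) that flags classpath entries containing spaces, the usual symptom of an installation under C:\Program Files (x86):

import java.io.File;

// PathSpaceCheck.java - illustrative only: lists classpath entries that contain spaces,
// since an unquoted path with a space can break the command line built for code generation.
public class PathSpaceCheck {
    public static void main(String[] args) {
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            if (entry.contains(" ")) {
                System.out.println("Entry contains a space (quote it or move the install): " + entry);
            }
        }
    }
}

If that is the cause here, installing the Studio under a path without spaces (for example C:\Talend) should avoid it.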
Are you using Talend in a Kerberos environment?
Best regards
Sabrina