<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: problem with sqoop into cloudera in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199587#M2437</link>
    <description>I am struggling with the same issue. Please share the solution if you have one.</description>
    <pubDate>Mon, 17 Oct 2016 06:10:34 GMT</pubDate>
    <dc:creator>_AnonymousUser</dc:creator>
    <dc:date>2016-10-17T06:10:34Z</dc:date>
    <item>
      <title>problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199584#M2434</link>
      <description>&lt;FONT color="#333333"&gt;&lt;FONT size="2"&gt;&lt;FONT face="Verdana, Arial, Helvetica, sans-serif"&gt;I am not able to use Talend Community Edition (version 6.1.20xxx) to import a MySQL table into a Hive table in Cloudera (54). Attached are a component screenshot and an error screenshot.&lt;/FONT&gt;&lt;/FONT&gt;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#333333"&gt;&lt;FONT size="2"&gt;&lt;FONT face="Verdana, Arial, Helvetica, sans-serif"&gt;I have 3 components: 1) Sqoop library load, 2) MySQL library load, 3) Sqoop.&lt;/FONT&gt;&lt;/FONT&gt;&lt;/FONT&gt;&lt;BR /&gt;Please let me know what I did wrong.&lt;BR /&gt;Separately, do I need to install Sqoop and Hadoop on the box where Talend is installed?</description>
      <pubDate>Sat, 16 Nov 2024 10:56:19 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199584#M2434</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T10:56:19Z</dc:date>
    </item>
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199585#M2435</link>
      <description>Hi, 
&lt;BR /&gt; 
&lt;BR /&gt; 
&lt;FONT size="2"&gt;&lt;FONT face="" calibri=""&gt;Could you upload the screenshots you wanted to show again, please? For some reason they didn't make it into your post.&lt;/FONT&gt;&lt;/FONT&gt; 
&lt;BR /&gt; 
&lt;BR /&gt; 
&lt;FONT size="2"&gt;&lt;FONT face="" calibri=""&gt;Best regards&lt;/FONT&gt;&lt;/FONT&gt; 
&lt;BR /&gt; 
&lt;FONT size="2"&gt;&lt;FONT face="" calibri=""&gt;Sabrina&lt;BR /&gt;&lt;/FONT&gt;&lt;/FONT&gt; 
&lt;BR /&gt; 
&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009MCTQ.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/134223iDB8FAADDA8FD91F4/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009MCTQ.png" alt="0683p000009MCTQ.png" /&gt;&lt;/span&gt;</description>
      <pubDate>Mon, 23 Nov 2015 06:46:46 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199585#M2435</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2015-11-23T06:46:46Z</dc:date>
    </item>
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199586#M2436</link>
      <description>Hi Sabrina, there is something wrong with the upload applet in the forum. I tried both IE and Chrome, and I am not able to see the two gray-bordered boxes depicted in your instructions, so I have uploaded the screenshots to the shared Dropbox: 
&lt;BR /&gt;link: 
&lt;A href="https://www.dropbox.com/s/y74mbeczkeh42qu/error_msg.png?dl=0" target="_blank" rel="nofollow noopener noreferrer"&gt;https://www.dropbox.com/s/y74mbeczkeh42qu/error_msg.png?dl=0&lt;/A&gt; 
&lt;BR /&gt;link: 
&lt;A href="https://www.dropbox.com/s/lf33jkfhi0in1wv/sqoop.png?dl=0" target="_blank" rel="nofollow noopener noreferrer"&gt;https://www.dropbox.com/s/lf33jkfhi0in1wv/sqoop.png?dl=0&lt;/A&gt; 
&lt;BR /&gt;Only in Firefox can I see the two gray boxes. 
&lt;BR /&gt;Starting job hive at 10:48 23/11/2015. 
&lt;BR /&gt; connecting to socket on port 3346 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration. 
&lt;BR /&gt;Note: /tmp/sqoop-root/compile/70981af7c84953fa16b5ce744f241a69/agent_type.java uses or overrides a deprecated API. 
&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details. 
&lt;BR /&gt;: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies. 
&lt;BR /&gt;: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception 
&lt;BR /&gt;java.nio.channels.UnresolvedAddressException 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.Net.checkAddress(Net.java:101) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600) 
&lt;BR /&gt;: org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: DataStreamer Exception: 
&lt;BR /&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009M9p6.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/134116iFBD5D7F21624A744/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009M9p6.png" alt="0683p000009M9p6.png" /&gt;&lt;/span&gt; 
&lt;BR /&gt;: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: DataStreamer Exception: 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708) 
&lt;BR /&gt;Caused by: java.nio.channels.UnresolvedAddressException 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.Net.checkAddress(Net.java:101) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600) 
&lt;BR /&gt;Exception in component tSqoopImport_1 
&lt;BR /&gt;java.lang.Exception: The Sqoop import job has failed. Please check the logs. 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at poc.hive_0_1.hive.tSqoopImport_1Process(hive.java:632) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at poc.hive_0_1.hive.tLibraryLoad_2Process(hive.java:485) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at poc.hive_0_1.hive.tLibraryLoad_1Process(hive.java:387) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at poc.hive_0_1.hive.runJobInTOS(hive.java:906) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at poc.hive_0_1.hive.main(hive.java:763) 
&lt;BR /&gt; 
&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009MCCN.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/144865i42A7BAF151186CE9/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009MCCN.png" alt="0683p000009MCCN.png" /&gt;&lt;/span&gt;</description>
      <pubDate>Mon, 23 Nov 2015 18:22:17 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199586#M2436</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2015-11-23T18:22:17Z</dc:date>
    </item>
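<!--
The java.nio.channels.UnresolvedAddressException in the trace above typically means the NameNode returned a DataNode hostname that the machine running the Talend job cannot resolve. A minimal diagnostic sketch in Python (the hostnames below are illustrative placeholders, not taken from this thread): verify that each cluster hostname resolves on the client before launching the Sqoop job; if one does not, add it to the hosts file or configure HDFS to report addresses the client can reach.

```python
# Sketch: check that a cluster hostname resolves on the client machine.
# An unresolvable DataNode hostname is the usual cause of
# java.nio.channels.UnresolvedAddressException during an HDFS write.
import socket

def resolvable(hostname):
    """Return True if the local resolver maps hostname to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# "localhost" resolves on any normally configured machine; a cluster-internal
# name such as "quickstart.cloudera" (placeholder) often does not resolve
# from outside the cluster, which reproduces the exception above.
print(resolvable("localhost"))  # True
```

Run this for every hostname Cloudera Manager reports for the DataNodes; any False result must be fixed before the import can succeed.
-->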
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199587#M2437</link>
      <description>I am struggling with the same issue. Please share the solution if you have one.</description>
      <pubDate>Mon, 17 Oct 2016 06:10:34 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199587#M2437</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2016-10-17T06:10:34Z</dc:date>
    </item>
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199588#M2438</link>
      <description>&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;Please share the solution to the "&lt;/FONT&gt;&lt;FONT size="1"&gt;org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: DataStreamer Exception" issue.&lt;/FONT&gt;&lt;/FONT&gt;</description>
      <pubDate>Mon, 17 Oct 2016 06:21:06 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199588#M2438</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-10-17T06:21:06Z</dc:date>
    </item>
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199589#M2439</link>
      <description>Please post your screenshot. If you are having the same issue as "david", then judging from the screenshot, the JDBC driver JAR and class name are missing.</description>
      <pubDate>Mon, 17 Oct 2016 18:20:59 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199589#M2439</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-10-17T18:20:59Z</dc:date>
    </item>
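<!--
The reply above points at a missing JDBC driver JAR and class name in the tSqoopImport settings. Before configuring the component, it can help to confirm that the JAR handed to tLibraryLoad actually contains the driver class you intend to name. A small sketch (the JAR file name in the usage note is an illustrative assumption; a JAR is an ordinary ZIP archive, so the standard zipfile module can inspect it):

```python
# Sketch: verify that a JAR file contains a given driver class.
import zipfile

def jar_contains_class(jar_path, class_name):
    """Return True if class_name (dotted form) has a .class entry in the JAR."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

For example, jar_contains_class("mysql-connector-java-5.1.x-bin.jar", "com.mysql.jdbc.Driver") should return True for a MySQL Connector/J 5.x JAR; if it returns False, the class name configured in tSqoopImport will fail at runtime.
-->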
    <item>
      <title>Re: problem with sqoop into cloudera</title>
      <link>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199590#M2440</link>
      <description>Hi Amula, 
&lt;BR /&gt;I am facing the error below while importing data from MySQL (version 5) into the CDH 5.8 sandbox.&amp;nbsp; Please advise which 
&lt;B&gt;JARs and class name&lt;/B&gt; I should use for my job. I have Java version 1.8.0 on both the Hadoop and Talend machines. 
&lt;BR /&gt;Job: tLibraryLoad&amp;nbsp; --&amp;gt; tSqoopImport 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path 
&lt;BR /&gt;java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:381) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:396) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:389) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.util.StringUtils.&amp;lt;clinit&amp;gt;(StringUtils.java:79) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:130) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:94) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:74) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:303) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem$Cache$Key.&amp;lt;init&amp;gt;(FileSystem.java:2859) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem$Cache$Key.&amp;lt;init&amp;gt;(FileSystem.java:2851) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2714) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:382) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:483) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658) 
&lt;BR /&gt;: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration. 
&lt;BR /&gt;Note: \tmp\sqoop-Aj\compile\f0f413757a34cc91cf81734f4d7d39b8\table1.java uses or overrides a deprecated API. 
&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details. 
&lt;BR /&gt;: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies. 
&lt;BR /&gt;: org.apache.sqoop.mapreduce.db.TextSplitter - Generating splits for a textual index column. 
&lt;BR /&gt;: org.apache.sqoop.mapreduce.db.TextSplitter - If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records. 
&lt;BR /&gt;: org.apache.sqoop.mapreduce.db.TextSplitter - You are strongly encouraged to choose an integral split column. 
&lt;BR /&gt;: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Job status not available 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:334) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.mapreduce.Job.isComplete(Job.java:621) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1366) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1328) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.manager.DirectMySQLManager.importTable(DirectMySQLManager.java:92) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.Sqoop.run(Sqoop.java:143) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:523) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658) 
&lt;BR /&gt;Exception in component tSqoopImport_1 
&lt;BR /&gt;java.lang.Exception: The Sqoop import job has failed. Please check the logs. 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tSqoopImport_1Process(poc_1.java:527) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.tLibraryLoad_1Process(poc_1.java:370) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.runJobInTOS(poc_1.java:801) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at ajay_talend.poc_1_0_1.poc_1.main(poc_1.java:658)</description>
      <pubDate>Tue, 21 Feb 2017 14:37:51 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/problem-with-sqoop-into-cloudera/m-p/2199590#M2440</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2017-02-21T14:37:51Z</dc:date>
    </item>
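<!--
The "Could not locate executable null\bin\winutils.exe" line in the last trace indicates the job ran on Windows with HADOOP_HOME unset: Hadoop's Shell class concatenates the (Java) null HADOOP_HOME value with \bin\winutils.exe, which is exactly the broken path in the message. A sketch of that path construction, usable as a pre-flight check before running the Talend job; the usual fix is to set HADOOP_HOME to a directory whose bin folder contains winutils.exe.

```python
# Sketch: reproduce the winutils.exe path Hadoop's Shell class builds.
# When HADOOP_HOME is unset, Java string concatenation turns null into the
# literal text "null", producing the exact path seen in the error message.
def winutils_path(hadoop_home):
    base = hadoop_home if hadoop_home is not None else "null"
    return base + "\\bin\\winutils.exe"

print(winutils_path(None))          # null\bin\winutils.exe
print(winutils_path("C:\\hadoop"))  # C:\hadoop\bin\winutils.exe
```

If the first form ever appears in your logs, set HADOOP_HOME (and add %HADOOP_HOME%\bin to PATH) before retrying the import.
-->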
  </channel>
</rss>