<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic HDFSPut in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216646#M12562</link>
    <description>Thread about copying files into HDFS from a Windows workstation with the tHDFSPut component in Talend Open Studio against a Cloudera Hadoop VMware server, and the DataStreamer / UnresolvedAddressException errors encountered.</description>
    <pubDate>Sat, 16 Nov 2024 10:55:47 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2024-11-16T10:55:47Z</dc:date>
    <item>
      <title>HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216646#M12562</link>
      <description>I have an environment with a Cloudera Hadoop VMware server and a Windows laptop with Talend Open Studio. I want to copy files into HDFS from my Windows environment and have tried to use the tHDFSPut component. The connection works fine, and I'm able to browse HDFS from my workstation. But when I try to copy files in, I get this error:
&lt;BR /&gt; connecting to socket on port 4024
&lt;BR /&gt; connected
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
&lt;BR /&gt;Exception in component tHDFSPut_1
&lt;BR /&gt;java.io.IOException: DataStreamer Exception: 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708)
&lt;BR /&gt;Caused by: java.nio.channels.UnresolvedAddressException
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.Net.checkAddress(Unknown Source)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutp
&lt;BR /&gt;Am I missing any library files on my workstation, or is it not possible to copy files to HDFS from my workstation?
&lt;BR /&gt;Br
&lt;BR /&gt;Petter</description>
      <pubDate>Sat, 16 Nov 2024 10:55:47 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216646#M12562</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T10:55:47Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216647#M12563</link>
      <description>I changed the port number in my connection from 8020 to 50010, and the file is created but it's empty. Here is the output:
&lt;BR /&gt; connecting to socket on port 3993 
&lt;BR /&gt; connected 
&lt;BR /&gt;: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
&lt;BR /&gt;Exception in component tHDFSPut_1 
&lt;BR /&gt;java.io.EOFException: End of File Exception between local host is: "NOHUSEBPET01/192.168.136.1"; destination host is: "192.168.181.128":50010; : java.io.EOFException; For more details see:&amp;nbsp; 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at java.lang.reflect.Constructor.newInstance(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.ipc.Client.call(Client.java:1472) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.ipc.Client.call(Client.java:1399) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at com.sun.proxy.$Proxy7.mkdirs(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539) 
&lt;BR /&gt; disconnected 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at java.lang.reflect.Method.invoke(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at com.sun.proxy.$Proxy8.mkdirs(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2758) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2729) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1817) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at local_project.uu_0_1.uu.tHDFSPut_1Process(uu.java:442) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at local_project.uu_0_1.uu.runJobInTOS(uu.java:819) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at local_project.uu_0_1.uu.main(uu.java:664) 
&lt;BR /&gt;Caused by: java.io.EOFException 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at java.io.DataInputStream.readInt(Unknown Source) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071) 
&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966) 
&lt;BR /&gt;Job uu ended at 22:16 28/11/2015.</description>
      <pubDate>Sat, 28 Nov 2015 21:16:59 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216647#M12563</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2015-11-28T21:16:59Z</dc:date>
    </item>
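A note on the port change above, as a sketch only (the thread itself does not state this): in stock Hadoop deployments 8020 is the NameNode RPC port and 50010 is the DataNode data-transfer port. Pointing the HDFS client connection at 50010 makes the client speak NameNode RPC to a DataNode, which typically surfaces as the EOFException seen here. The Talend HDFS connection should target the NameNode URI; the host below is the hypothetical VM address from the thread.

```
# Talend HDFS connection URI: NameNode RPC endpoint (default port 8020),
# not the DataNode transfer port (50010). Host address is hypothetical.
hdfs://192.168.181.128:8020
```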
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216648#M12564</link>
      <description>Is this solved?</description>
      <pubDate>Wed, 30 Dec 2015 10:27:52 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216648#M12564</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2015-12-30T10:27:52Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216649#M12565</link>
      <description>Hello,&lt;BR /&gt;Can you confirm that the machine you are running the job on can access the machine "192.168.181.128" on port 50010? It looks to be a firewall issue. Also, can you double-check that the machine you are running the job on can ping "192.168.181.128"?</description>
      <pubDate>Mon, 04 Jan 2016 12:16:57 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216649#M12565</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-01-04T12:16:57Z</dc:date>
    </item>
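Beyond the firewall check suggested above, the original error is a java.nio.channels.UnresolvedAddressException, which points at name resolution rather than a blocked port: the NameNode hands the client a datanode address that the Windows client cannot resolve. A minimal, editorial sketch of that check using Python's standard socket module (hostnames below are placeholders, not from the thread):

```python
import socket

def can_resolve(host: str) -> bool:
    """True if this machine's DNS/hosts configuration can resolve `host`.

    An UnresolvedAddressException on the HDFS client side usually means
    the datanode address handed back by the NameNode fails this check
    on the client machine.
    """
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

print(can_resolve("localhost"))             # a name every machine resolves
print(can_resolve("no-such-host.invalid"))  # the .invalid TLD never resolves
```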
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216650#M12566</link>
      <description>I have the same issue using Hortonworks instead of Cloudera, and yes, port 50010 is open; all ports have been opened in the firewall, and I can telnet to that port.</description>
      <pubDate>Fri, 15 Jan 2016 12:50:55 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216650#M12566</guid>
      <dc:creator>_AnonymousUser</dc:creator>
      <dc:date>2016-01-15T12:50:55Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216651#M12567</link>
      <description>Was this solved? I have the same issue.</description>
      <pubDate>Mon, 16 May 2016 20:40:48 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216651#M12567</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-05-16T20:40:48Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216652#M12568</link>
      <description>Check to see if the user account you are using has write permission on the folder you are writing to.</description>
      <pubDate>Fri, 27 May 2016 01:12:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216652#M12568</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-05-27T01:12:28Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216653#M12569</link>
      <description>&lt;FONT size="1"&gt;&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&amp;nbsp;i have the same issue&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;Exception in component tHDFSPut_1&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;java.io.IOException: DataStreamer Exception:&amp;nbsp;&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;java.nio.channels.UnresolvedAddressException&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at sun.nio.ch.Net.checkAddress(Net.java:101)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1752)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1530)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1483)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:668)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:796)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt;Caused by: java.nio.channels.UnresolvedAddressException&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at sun.nio.ch.Net.checkAddress(Net.java:101)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1752)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1530)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1483)&lt;/FONT&gt;&lt;/FONT&gt;
&lt;BR /&gt;
&lt;FONT face="Verdana, Helvetica, Arial, sans-serif"&gt;&lt;FONT size="1"&gt; at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:668)&lt;/FONT&gt;&lt;/FONT&gt;</description>
      <pubDate>Mon, 30 May 2016 08:00:22 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216653#M12569</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-05-30T08:00:22Z</dc:date>
    </item>
    <item>
      <title>Re: HDFSPut</title>
      <link>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216654#M12570</link>
      <description>Hi yepeng2007fei, your issue may be caused by this:
&lt;BR /&gt; 
&lt;A href="https://community.qlik.com/s/feed/0D53p00007vCpeZCAS" target="_blank" rel="nofollow noopener noreferrer"&gt;https://community.talend.com/t5/Design-and-Development/resolved-Simple-tHDFSPut-based-job-fails-on-6-1-but-not-6-0/td-p/103608&lt;/A&gt;</description>
      <pubDate>Tue, 07 Jun 2016 15:16:50 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/HDFSPut/m-p/2216654#M12570</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-06-07T15:16:50Z</dc:date>
    </item>
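The thread ends without a confirmed resolution. A common fix for this class of UnresolvedAddressException when a Windows client talks to a single-node Hadoop VM (offered here as an editorial sketch, not something the thread's participants verified) is to make the datanode's internal hostname resolvable on the client, for example with a hosts-file entry; the address and hostname below are hypothetical. An alternative in the same spirit is the HDFS client property dfs.client.use.datanode.hostname.

```
# C:\Windows\System32\drivers\etc\hosts  (hypothetical entry: map the VM's
# internal datanode hostname to the IP reachable from the Windows client)
192.168.181.128   quickstart.cloudera
```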
  </channel>
</rss>

