<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: OutofMemory Error When considering a huge XML File as source in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236011#M24857</link>
    <description>I have made the suggested changes and run the Job, but still no luck.
&lt;BR /&gt;Can someone please help me?
&lt;BR /&gt;Can we split a large XML file into smaller chunks (multiple XML files)? How can we do that?
&lt;BR /&gt;What would be the impact of doing that?</description>
    <pubDate>Mon, 16 Jul 2012 14:29:06 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2012-07-16T14:29:06Z</dc:date>
    <item>
      <title>OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236004#M24850</link>
      <description>Hi Shong, 
&lt;BR /&gt;I have an SDMX XML file as my source. Earlier the XML file was only a few KB and I was able to run the jobs successfully. 
&lt;BR /&gt;Now we have received a very large 1 GB XML file, and when I run the same job it fails with the error below. 
&lt;BR /&gt;Can you please suggest proper workarounds and the steps to apply them? 
&lt;BR /&gt;Please help me, as this is very urgent. 
&lt;BR /&gt;Error 1: 
&lt;BR /&gt;=========== 
&lt;BR /&gt;Starting job j500_ValidateLoad at 14:51 12/07/2012. 
&lt;BR /&gt; connecting to socket on port 3958 
&lt;BR /&gt; connected 
&lt;BR /&gt;Exception in thread "Thread-7" java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at java.util.LinkedList.listIterator(LinkedList.java:667) 
&lt;BR /&gt; at java.util.AbstractList.listIterator(AbstractList.java:284) 
&lt;BR /&gt; at java.util.AbstractSequentialList.iterator(AbstractSequentialList.java:222) 
&lt;BR /&gt; at routines.system.RunStat.sendMessages(RunStat.java:248) 
&lt;BR /&gt; at routines.system.RunStat.run(RunStat.java:212) 
&lt;BR /&gt; at java.lang.Thread.run(Thread.java:619) 
&lt;BR /&gt;Exception in thread "main" java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:4042) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.runJobInTOS(j500_ValidateLoad.java:5436) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.main(j500_ValidateLoad.java:5113) 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; disconnected 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2555) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJobInTOS(j100_Read_Files.java:2942) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJob(j100_Read_Files.java:2608) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:3375) 
&lt;BR /&gt; ... 2 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:956) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2546) 
&lt;BR /&gt; ... 5 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:1030) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJobInTOS(j102_Read_Dataset_XML.java:1498) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJob(j102_Read_Dataset_XML.java:1282) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:895) 
&lt;BR /&gt; ... 6 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:2345) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJobInTOS(j140_Import_Segments.java:2767) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJob(j140_Import_Segments.java:2572) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:762) 
&lt;BR /&gt; ... 9 more 
&lt;BR /&gt;Caused by: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at org.dom4j.DocumentFactory.createAttribute(DocumentFactory.java:156) 
&lt;BR /&gt; at org.dom4j.tree.AbstractElement.setAttributes(AbstractElement.java:549) 
&lt;BR /&gt; at org.dom4j.io.SAXContentHandler.addAttributes(SAXContentHandler.java:916) 
&lt;BR /&gt; at org.dom4j.io.SAXContentHandler.startElement(SAXContentHandler.java:249) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractSAXParser.startElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) 
&lt;BR /&gt; at org.dom4j.io.SAXReader.read(SAXReader.java:465) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:1601) 
&lt;BR /&gt; ... 12 more 
&lt;BR /&gt;Job j500_ValidateLoad ended at 14:58 12/07/2012. 
&lt;BR /&gt;Error 2: 
&lt;BR /&gt;========== 
&lt;BR /&gt;Starting job j500_ValidateLoad at 15:23 12/07/2012. 
&lt;BR /&gt; connecting to socket on port 3527 
&lt;BR /&gt; connected 
&lt;BR /&gt;Exception in thread "main" java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:4042) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.runJobInTOS(j500_ValidateLoad.java:5436) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.main(j500_ValidateLoad.java:5113) 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; disconnected 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2555) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJobInTOS(j100_Read_Files.java:2942) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJob(j100_Read_Files.java:2608) 
&lt;BR /&gt; at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:3375) 
&lt;BR /&gt; ... 2 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:956) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2546) 
&lt;BR /&gt; ... 5 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:1030) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJobInTOS(j102_Read_Dataset_XML.java:1498) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJob(j102_Read_Dataset_XML.java:1282) 
&lt;BR /&gt; at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:895) 
&lt;BR /&gt; ... 6 more 
&lt;BR /&gt;Caused by: java.lang.Error: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:2345) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJobInTOS(j140_Import_Segments.java:2767) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJob(j140_Import_Segments.java:2572) 
&lt;BR /&gt; at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:762) 
&lt;BR /&gt; ... 9 more 
&lt;BR /&gt;Caused by: java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at org.dom4j.DocumentFactory.createAttribute(DocumentFactory.java:156) 
&lt;BR /&gt; at org.dom4j.tree.AbstractElement.setAttributes(AbstractElement.java:549) 
&lt;BR /&gt; at org.dom4j.io.SAXContentHandler.addAttributes(SAXContentHandler.java:916) 
&lt;BR /&gt; at org.dom4j.io.SAXContentHandler.startElement(SAXContentHandler.java:249) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractSAXParser.startElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) 
&lt;BR /&gt; at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) 
&lt;BR /&gt; at org.dom4j.io.SAXReader.read(SAXReader.java:465) 
&lt;BR /&gt; at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:1601) 
&lt;BR /&gt; ... 12 more 
&lt;BR /&gt;Exception in thread "Thread-7" java.lang.OutOfMemoryError: Java heap space 
&lt;BR /&gt; at java.util.LinkedList.listIterator(LinkedList.java:667) 
&lt;BR /&gt; at java.util.AbstractList.listIterator(AbstractList.java:284) 
&lt;BR /&gt; at java.util.AbstractSequentialList.iterator(AbstractSequentialList.java:222) 
&lt;BR /&gt; at routines.system.RunStat.sendMessages(RunStat.java:248) 
&lt;BR /&gt; at routines.system.RunStat.run(RunStat.java:212) 
&lt;BR /&gt; at java.lang.Thread.run(Thread.java:619) 
&lt;BR /&gt;Job j500_ValidateLoad ended at 15:25 12/07/2012. 
&lt;BR /&gt;Thanks, 
&lt;BR /&gt;KPK</description>
      <pubDate>Fri, 13 Jul 2012 10:38:57 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236004#M24850</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-13T10:38:57Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236005#M24851</link>
      <description>Hi KPK,
&lt;BR /&gt;This is a common problem.
&lt;BR /&gt;How much memory does your machine have?
&lt;BR /&gt;What you can do now is reduce the job's memory consumption and increase the JVM arguments.
&lt;BR /&gt;If there is a tMap component in your job, use its "Store on disk" feature.
&lt;BR /&gt;You can increase the JVM arguments as shown in the following image.
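&lt;BR /&gt;For instance, in the Run view's Advanced settings you can pass JVM arguments such as the following (these values are only illustrative; size them to your machine's available memory):
&lt;PRE&gt;-Xms256M
-Xmx1024M&lt;/PRE&gt;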
&lt;BR /&gt;Regards,
&lt;BR /&gt;Pedro</description>
      <pubDate>Fri, 13 Jul 2012 11:04:38 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236005#M24851</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-13T11:04:38Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236006#M24852</link>
      <description>I have 10 child Jobs called from 1 parent Job, and all 10 Jobs contain tMaps. Do I need to use "Store on disk" in all 10 Jobs, or is it enough to use it in the parent Job?</description>
      <pubDate>Fri, 13 Jul 2012 16:24:45 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236006#M24852</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-13T16:24:45Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236007#M24853</link>
      <description>Hi all, 
&lt;BR /&gt; 
&lt;BR /&gt;It seems that you're using dom4j. 
&lt;BR /&gt;Try using the SAX parser (under Advanced settings). It could solve the problem for huge XML files, but it's slower. 
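&lt;BR /&gt;For background, dom4j builds the whole document tree in memory, while a SAX parser only streams events, so memory use stays roughly constant regardless of file size. A minimal plain-Java sketch of the streaming style (the "Series" element name is hypothetical, and Talend generates its own handler for you when the option is enabled):
&lt;PRE&gt;import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import java.io.File;

public class CountSeries {
    public static void main(String[] args) throws Exception {
        DefaultHandler handler = new DefaultHandler() {
            int count = 0;
            // Only the current event is held in memory, never the whole tree
            public void startElement(String uri, String local, String qName, Attributes atts) {
                if (qName.equals("Series")) { count++; }
            }
            public void endDocument() { System.out.println(count + " Series elements"); }
        };
        SAXParserFactory.newInstance().newSAXParser().parse(new File(args[0]), handler);
    }
}&lt;/PRE&gt;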
&lt;span class="lia-inline-image-display-wrapper" image-alt="0683p000009MACn.png"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/154443iC5B8CACEF3D12C6A/image-size/large?v=v2&amp;amp;px=999" role="button" title="0683p000009MACn.png" alt="0683p000009MACn.png" /&gt;&lt;/span&gt; 
&lt;BR /&gt;regards 
&lt;BR /&gt;laurent</description>
      <pubDate>Fri, 13 Jul 2012 21:43:11 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236007#M24853</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-13T21:43:11Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236008#M24854</link>
      <description>Hi,
&lt;BR /&gt;I agree with Laurent.
&lt;BR /&gt;Both of these features are used to reduce JVM memory consumption.
&lt;BR /&gt;Whether you use "Store on disk" in all child jobs depends on how you balance performance against memory consumption.
&lt;BR /&gt;Regards,
&lt;BR /&gt;Pedro</description>
      <pubDate>Mon, 16 Jul 2012 06:28:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236008#M24854</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T06:28:28Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236009#M24855</link>
      <description>Where can I find the SAX parser option? Can you write me detailed steps for doing that?</description>
      <pubDate>Mon, 16 Jul 2012 11:23:20 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236009#M24855</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T11:23:20Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236010#M24856</link>
      <description>Under the Advanced settings of your XML input component.&lt;BR /&gt;It's just an option to choose in a drop-down list.</description>
      <pubDate>Mon, 16 Jul 2012 11:35:35 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236010#M24856</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T11:35:35Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236011#M24857</link>
      <description>I have made the suggested changes and run the Job, but still no luck.
&lt;BR /&gt;Can someone please help me?
&lt;BR /&gt;Can we split a large XML file into smaller chunks (multiple XML files)? How can we do that?
&lt;BR /&gt;What would be the impact of doing that?</description>
      <pubDate>Mon, 16 Jul 2012 14:29:06 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236011#M24857</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T14:29:06Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236012#M24858</link>
      <description>As Pedro said, try changing the JVM parameters in the generated .bat or .sh launcher, depending on your OS:
&lt;BR /&gt;
&lt;PRE&gt;java -Xms256M -Xmx1024M ..... etc&lt;/PRE&gt;
&lt;BR /&gt;I can read some data from a 5 GB XML file with that configuration.
&lt;BR /&gt;Hope it helps,
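&lt;BR /&gt;As a sketch, an edited launcher script could look like this. The jar name and classpath are hypothetical (the exact layout of the generated script varies by Talend version); the job class name is taken from the stack trace above.
&lt;PRE&gt;#!/bin/sh
cd `dirname $0`
ROOT_PATH=`pwd`
# Raised heap: -Xms sets the initial size, -Xmx the maximum
java -Xms256M -Xmx2048M -cp "$ROOT_PATH/../lib/*:$ROOT_PATH/j500_ValidateLoad.jar" \
    datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad&lt;/PRE&gt;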
&lt;BR /&gt;laurent</description>
      <pubDate>Mon, 16 Jul 2012 14:57:44 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236012#M24858</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T14:57:44Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236013#M24859</link>
      <description>Can you please help me with the exact code?</description>
      <pubDate>Mon, 16 Jul 2012 16:14:55 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236013#M24859</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-16T16:14:55Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236014#M24860</link>
      <description>This didn't work for me. I am still getting the same error.</description>
      <pubDate>Tue, 17 Jul 2012 09:56:16 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236014#M24860</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-17T09:56:16Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236015#M24861</link>
      <description>Any help?</description>
      <pubDate>Wed, 18 Jul 2012 17:12:30 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236015#M24861</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-07-18T17:12:30Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236016#M24862</link>
      <description>I have a 300 MB XML file. As suggested, I changed all the JVM arguments to 256/512/1024 and 1024 and am using the SAX parser, but I am still facing the same issue. Can anyone please help me out with some workarounds?</description>
      <pubDate>Fri, 03 Aug 2012 09:30:08 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236016#M24862</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-08-03T09:30:08Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236017#M24863</link>
      <description>Hi, 
&lt;BR /&gt;So there is a tRunJob component in your parent job? 
&lt;BR /&gt;You might use the tRunJob feature "Use an independent process to run subjob". 
&lt;BR /&gt;It will reduce memory consumption. But make sure there is no context or data transmission between the parent job and this subjob, or you will get an error. 
&lt;BR /&gt;Regards, 
&lt;BR /&gt;Pedro</description>
      <pubDate>Fri, 03 Aug 2012 09:36:24 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236017#M24863</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-08-03T09:36:24Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236018#M24864</link>
      <description>Pedro, 
&lt;BR /&gt;Context is there and data transmission happens. 
&lt;BR /&gt;What could be the reason behind this? If it is not able to handle a 300 MB file, how will it handle large files of 5 or 10 GB? 
&lt;BR /&gt; 
&lt;BR /&gt;Thanks, 
&lt;BR /&gt;KPK</description>
      <pubDate>Fri, 03 Aug 2012 09:54:31 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236018#M24864</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-08-03T09:54:31Z</dc:date>
    </item>
    <item>
      <title>Re: OutofMemory Error When considering a huge XML File as source</title>
      <link>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236019#M24865</link>
      <description>Hi KPK, 
&lt;BR /&gt;The parent job and all the child jobs share a single JVM. 
&lt;BR /&gt;The maximum memory is set by the JVM argument -Xmx. 
&lt;BR /&gt;Try increasing the JVM arguments. If that doesn't work either, you will have to simplify your jobs. 
&lt;BR /&gt;The feature "Use an independent process to run subjob" will create a new JVM for the subjob. But you say you can't use it, because you need to transmit context variables. 
&lt;BR /&gt;Regards, 
&lt;BR /&gt;Pedro</description>
      <pubDate>Fri, 03 Aug 2012 09:59:33 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/OutofMemory-Error-When-considering-a-huge-XML-File-as-source/m-p/2236019#M24865</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2012-08-03T09:59:33Z</dc:date>
    </item>
  </channel>
</rss>

