<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to build schemas at runtime when writing data from Oracle tables into CSV files? in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/How-build-schemas-at-the-runtime-when-write-data-from-Oracle/m-p/2307218#M78673</link>
    <description>&lt;P&gt;My first recommendation would be to use the enterprise version of Talend, which gives you access to dynamic schema functionality and much more.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Another approach is to build a job with the complete input schema of one table and the target components.&lt;/P&gt; 
&lt;P&gt;Then work out how the .item file is generated and use Talend itself to generate it; that way you create a job generator. However, given the effort you will spend designing this job generator (without support, and with the risk of not getting it right), you can easily save time and money by just going enterprise and having a product you can use for far more than this one use case.&lt;/P&gt;</description>
    <pubDate>Sat, 01 Jul 2017 10:55:07 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2017-07-01T10:55:07Z</dc:date>
    <item>
      <title>How to build schemas at runtime when writing data from Oracle tables into CSV files?</title>
      <link>https://community.qlik.com/t5/Talend-Studio/How-build-schemas-at-the-runtime-when-write-data-from-Oracle/m-p/2307217#M78672</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;I have used the TOS product to write data from Oracle tables into CSV files.&lt;/P&gt; 
&lt;P&gt;I have an input CSV file with schema names and table names, as below:&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Schema_01,table_01&lt;/P&gt; 
&lt;P&gt;Schema_02,table_02&lt;/P&gt; 
&lt;P&gt;...&lt;/P&gt; 
&lt;P&gt;My job reads records from this file one by one, and then reads the Oracle tables to get the data to write to CSV files. In the end, I expect output CSV files that look like:&lt;/P&gt; 
&lt;P&gt;Schema_01_table_01_data.csv&lt;/P&gt; 
&lt;P&gt;Schema_02_table_02_data.csv&lt;/P&gt; 
&lt;P&gt;...&lt;/P&gt; 
&lt;P&gt;But after accessing and reading the correct tables, how can Talend understand and build the schemas at runtime? I have used the tOracleOutputBulk component to write data into CSV files, and I know I have to define these schemas manually at the design step in TOS.&lt;/P&gt; 
&lt;P&gt;I have found a useful link at &lt;A href="http://bigdatadimension.com/writing-arbitrary-database-tables-file-without-dynamic-schema-talend/" target="_self" rel="nofollow noopener noreferrer"&gt;http://bigdatadimension.com/writing-arbitrary-database-tables-file-without-dynamic-schema-talend/&lt;/A&gt;. The author used the Object type when defining the schema and the Java ResultSetMetaData API to write the data in the correct CSV format. However, this only works well for small tables, and it takes too much time and memory for large ones.&lt;/P&gt; 
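The idea behind that approach can be sketched roughly as follows. This is a minimal, hypothetical Java fragment, not the article's actual code: the class name and methods are made up for illustration. The assumption is that column count and names come from runtime metadata (e.g. java.sql.ResultSetMetaData) rather than a design-time schema, and each row is formatted and written one at a time so memory use stays flat regardless of table size.

```java
// Hypothetical sketch of a runtime-schema CSV writer. In a real job the
// field values and column count would come from a JDBC ResultSet and its
// ResultSetMetaData, discovered at run time instead of being defined in
// the Studio schema editor.
public class GenericCsvWriter {

    // Quote a single CSV field, doubling embedded quotes (RFC 4180 style).
    static String quote(String field) {
        if (field == null) return "";
        return "\"" + field.replace("\"", "\"\"") + "\"";
    }

    // Format one row of arbitrary width; the width is whatever the
    // runtime metadata reports, not a fixed design-time column list.
    static String row(String[] fields) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i != fields.length; i++) {
            if (i != 0) sb.append(',');
            sb.append(quote(fields[i]));
        }
        return sb.toString();
    }
}
```

In a Talend job, code along these lines would typically live in a tJava or tJavaFlex component, with the rows streamed straight to the output file instead of being collected in memory.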
&lt;P&gt;Do we have other solutions for this situation?&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Thank you very much.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 09:34:12 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/How-build-schemas-at-the-runtime-when-write-data-from-Oracle/m-p/2307217#M78672</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T09:34:12Z</dc:date>
    </item>
    <item>
      <title>Re: How to build schemas at runtime when writing data from Oracle tables into CSV files?</title>
      <link>https://community.qlik.com/t5/Talend-Studio/How-build-schemas-at-the-runtime-when-write-data-from-Oracle/m-p/2307218#M78673</link>
      <description>&lt;P&gt;My first recommendation would be to use the enterprise version of Talend, which gives you access to dynamic schema functionality and much more.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Another approach is to build a job with the complete input schema of one table and the target components.&lt;/P&gt; 
&lt;P&gt;Then work out how the .item file is generated and use Talend itself to generate it; that way you create a job generator. However, given the effort you will spend designing this job generator (without support, and with the risk of not getting it right), you can easily save time and money by just going enterprise and having a product you can use for far more than this one use case.&lt;/P&gt;</description>
      <pubDate>Sat, 01 Jul 2017 10:55:07 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/How-build-schemas-at-the-runtime-when-write-data-from-Oracle/m-p/2307218#M78673</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-07-01T10:55:07Z</dc:date>
    </item>
  </channel>
</rss>

