<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Float and Double mismatch in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230838#M21339</link>
    <description>&lt;P&gt;I have a CSV with a large number of columns that I am trying to upload to Snowflake.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;However, the problem I am running into is that when I create metadata for the CSV, the data type for all my columns is inferred as FLOAT.&lt;/P&gt; 
&lt;P&gt;The columns in the destination table in Snowflake are defined as FLOAT as well.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;When I create a tDBOutput (Snowflake) component for this destination table, the data type is inferred as Double within Talend.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;So within tMap, the source file columns have the FLOAT data type while the destination table columns have Double. Talend flags this mismatch and will not run my job.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Since there are around 700 columns, I would like to find a way to change the data type in bulk, either for all 700 source columns in the Talend metadata or for the destination table, so that the source and destination data types defined in Talend match.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Thoughts?&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Sat, 16 Nov 2024 04:45:44 GMT</pubDate>
    <dc:creator>talendstar</dc:creator>
    <dc:date>2024-11-16T04:45:44Z</dc:date>
    <item>
      <title>Float and Double mismatch</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230838#M21339</link>
      <description>&lt;P&gt;I have a CSV with a large number of columns that I am trying to upload to Snowflake.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;However, the problem I am running into is that when I create metadata for the CSV, the data type for all my columns is inferred as FLOAT.&lt;/P&gt; 
&lt;P&gt;The columns in the destination table in Snowflake are defined as FLOAT as well.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;When I create a tDBOutput (Snowflake) component for this destination table, the data type is inferred as Double within Talend.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;So within tMap, the source file columns have the FLOAT data type while the destination table columns have Double. Talend flags this mismatch and will not run my job.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Since there are around 700 columns, I would like to find a way to change the data type in bulk, either for all 700 source columns in the Talend metadata or for the destination table, so that the source and destination data types defined in Talend match.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Thoughts?&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 04:45:44 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230838#M21339</guid>
      <dc:creator>talendstar</dc:creator>
      <dc:date>2024-11-16T04:45:44Z</dc:date>
    </item>
    <item>
      <title>Re: Float and Double mismatch</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230839#M21340</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt; 
&lt;P&gt;In Talend, the mapping between database data types and Talend types can be customized to fit your needs.&lt;/P&gt; 
&lt;P&gt;For more information, please have a look at this article:&lt;A title="https://community.talend.com/t5/Migration-Configuration-and/Changing-the-default-data-type-mapping/ta-p/21668" href="https://community.qlik.com/s/article/ka03p0000006EZtAAM" target="_self"&gt;https://community.talend.com/t5/Migration-Configuration-and/Changing-the-default-data-type-mapping/ta-p/21668&lt;/A&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Best regards&lt;/P&gt; 
&lt;P&gt;Sabrina&lt;/P&gt;</description>
      <pubDate>Mon, 30 Sep 2019 08:34:35 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230839#M21340</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2019-09-30T08:34:35Z</dc:date>
    </item>
    <item>
      <title>Re: Float and Double mismatch</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230840#M21341</link>
      <description>&lt;P&gt;First of all, the database type FLOAT does not necessarily mean the Java Float type. For example, PostgreSQL FLOAT8 corresponds to Double.&lt;/P&gt;&lt;P&gt;If Talend reads the values as Double and writes them with that type, this should always work.&lt;/P&gt;&lt;P&gt;What kind of problems do you get while running this job?&lt;/P&gt;&lt;P&gt;A way to automatically convert the types (I would not recommend it, but if you think you should) is to use the tConvertType component.&lt;/P&gt;</description>
      <pubDate>Mon, 30 Sep 2019 08:40:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Float-and-Double-mismatch/m-p/2230840#M21341</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2019-09-30T08:40:28Z</dc:date>
    </item>
  </channel>
</rss>

