talendstar
Creator

Float and Double mismatch

I have a CSV with a large number of columns that I am trying to upload to Snowflake.

However, the problem I am running into is that when I create metadata for the CSV, the data type for all of my columns is inferred as Float.

The columns in the Snowflake destination table are defined as FLOAT as well.

When I create a tDBOutput (Snowflake) component for this destination table, the data type is inferred as Double within Talend.

 

So within tMap, the source file columns have the Float data type while the destination table columns have Double. Talend doesn't like this mismatch and will not run my job.

Since there are around 700 columns, I would really like to find a way to change either all 700 source data types in the Talend metadata or the data types for the destination table, so that the source and destination data types defined in Talend match.

 

Thoughts?

Thanks

2 Replies
Anonymous
Not applicable

Hello,

In Talend, the mapping between database data types and Talend types can be customized to fit your needs.

For more information, please have a look at this article: https://community.talend.com/t5/Migration-Configuration-and/Changing-the-default-data-type-mapping/t... 
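For a rough idea of what that customization involves: the mappings live in XML files shipped with the Studio (for Snowflake, a file along the lines of mapping_snowflake.xml under the Studio's configuration folder; the exact name and path depend on your Talend version). Below is a sketch of the kind of entry you would edit; the element names follow the general shape of these mapping files but should be verified against the file in your own install:

    <!-- Sketch only: verify element names against the mapping file
         shipped with your Talend version. -->
    <dbToTalendTypes>
      <!-- Make the database FLOAT type map to Talend's Double by default,
           so the source metadata and the tDBOutput schema agree. -->
      <dbType type="FLOAT">
        <talendType type="id_Double" default="true"/>
      </dbType>
    </dbToTalendTypes>

With one change like this, re-importing the metadata applies the new default to every column at once, which avoids editing 700 column types by hand.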

Best regards

Sabrina

Anonymous
Not applicable

First of all, the database type FLOAT does not necessarily mean the Java Float type. For example, PostgreSQL's FLOAT8 is really a Double.

If Talend reads the values as Double and writes them with this type, this should always work.
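To illustrate why Double is the safe direction: widening a Java float to a double is always lossless, while narrowing a double to a float can drop digits. A minimal standalone Java sketch (not Talend-generated code):

    public class FloatVsDouble {
        public static void main(String[] args) {
            // float keeps ~7 significant decimal digits, double ~15-16.
            float f = 1.23456789f;
            double d = 1.23456789;

            System.out.println(f); // prints 1.2345679  (trailing digits lost)
            System.out.println(d); // prints 1.23456789 (preserved)

            double widened = f;         // implicit widening, always lossless
            float narrowed = (float) d; // explicit narrowing, may lose precision
            System.out.println(widened);
            System.out.println(narrowed);
        }
    }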

What kind of problems do you get while running this job?

A way to convert the types automatically (I would not do that, but if you think you should) is to use the tConvertType component.
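If you would rather make the conversion explicit in the job instead, the same thing is a one-liner per column in a tJavaRow (or as a tMap output expression). The flow and column names below are hypothetical:

    // Hypothetical tJavaRow body: input flow row1, output flow row2,
    // column "price" is Float in the input schema and Double in the output.
    // Null-safe widening from Float to Double:
    row2.price = (row1.price == null) ? null : row1.price.doubleValue();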