Anonymous
Not applicable

Timestamp datatype issue in Talend-Redshift

Hello,

 

I have a Talend job that should load CSV data from my local drive into a target Redshift database. I created an empty table with a DDL query in the Redshift UI. But while loading data into the table via Talend Dynamic Schema, all rows go into the reject flow with the error "[Amazon][JDBC](10140) Error converting the value to Timestamp".

However, when I manually insert the same values in the Redshift UI via an INSERT query (DML), the data is loaded successfully. Also, when I set the action on the table to "Drop table if exists and create", the same data is loaded successfully, but every field then has the datatype "varchar". So can anyone please help me understand why Talend is complaining about the Timestamp datatype? (Screenshots attached.)
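For context, here is a minimal plain-Java sketch (not part of the Talend job; the class and variable names are made up) of why a value like 2017-03-24T23:20:26.000Z, the format given later in the thread, can trip a "converting the value to Timestamp" step: java.sql.Timestamp.valueOf() only accepts the plain yyyy-[m]m-[d]d hh:mm:ss[.f...] form, so an ISO-8601 string with the 'T' and trailing 'Z' has to be parsed with a matching pattern first.

import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampRejectDemo {
    public static void main(String[] args) throws Exception {
        String csvValue = "2017-03-24T23:20:26.000Z";

        // Rejected: Timestamp.valueOf expects "yyyy-[m]m-[d]d hh:mm:ss[.f...]"
        try {
            Timestamp.valueOf(csvValue);
        } catch (IllegalArgumentException e) {
            System.out.println("Rejected as-is: " + e);
        }

        // Accepted once parsed with a pattern that matches the ISO-8601 string
        // ('Z' is treated as a literal here, so the time zone is forced to UTC)
        SimpleDateFormat iso = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
        iso.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println(new Timestamp(iso.parse(csvValue).getTime()));
    }
}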

6 Replies
vapukov
Master II

Try changing the date pattern to the proper one for Redshift: "yyyy-MM-dd HH:mm:ss".

If your source has a different format, I am not sure whether Dynamic schema will work; it needs testing.

In that case you may need to convert the date manually with the dateParse / dateFormat functions (see the sketch below).
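A minimal sketch of that conversion in plain Java, assuming the source value is the ISO-8601 string given later in the thread (inside a Talend tMap the equivalent would be the TalendDate parse/format routines with the same two patterns; nothing here is taken from the actual job):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class RedshiftDatePattern {
    public static void main(String[] args) throws Exception {
        String source = "2017-03-24T23:20:26.000Z";   // value as it arrives from the CSV

        // Parse the incoming pattern ('Z' handled as a literal, so force UTC)
        SimpleDateFormat in = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
        in.setTimeZone(TimeZone.getTimeZone("UTC"));
        Date parsed = in.parse(source);

        // Re-format into the pattern a Redshift timestamp column accepts
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        out.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println(out.format(parsed));       // prints 2017-03-24 23:20:26
    }
}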

 

It is better to start from the beginning.

 

What is the current date/time format in the file?

Anonymous
Not applicable
Author

Hello Vapukov,

 

The datetime format in the CSV file is 2017-03-24T23:20:26.000Z. To cut it short, my question is: is it possible to load data from a CSV file into a Redshift table when the datatype in the schema is Dynamic? If yes, then how? Currently I am able to do the same thing with SQL Server without any such issues.

 

Anonymous
Not applicable
Author

Does anyone have any solution?

vapukov
Master II

The simplest solution is to not use a dynamic schema.

Anonymous
Not applicable
Author

Well, if I could do that, I would not have posted for help on this forum in the first place. But anyway, I have managed to load data from the CSV into the Redshift DB using a Dynamic schema for all datatypes except fields that have a timezone. It seems the Redshift JDBC driver doesn't support the timestamptz datatype anymore, so I don't know how to load it without driver support.
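One possible workaround, sketched below with plain JDBC and java.time rather than the actual Talend job (the connection URL, credentials, and table/column names are made up): parse the zoned ISO-8601 value, normalize it to UTC, and write it into an ordinary timestamp column so the driver never has to bind a timestamptz.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.time.OffsetDateTime;

public class LoadWithoutTimestamptz {
    public static void main(String[] args) throws Exception {
        String csvValue = "2017-03-24T23:20:26.000Z";

        // Parse the zoned value; the resulting Instant is an absolute (UTC-based) point in time
        OffsetDateTime odt = OffsetDateTime.parse(csvValue);
        Timestamp utc = Timestamp.from(odt.toInstant());

        // Hypothetical connection and table; event_time is a plain "timestamp" column
        try (Connection con = DriverManager.getConnection(
                 "jdbc:redshift://example-cluster:5439/dev", "user", "password");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO my_table (event_time) VALUES (?)")) {
            ps.setTimestamp(1, utc);   // bound as a plain timestamp, not timestamptz
            ps.executeUpdate();
        }
    }
}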

 

vapukov
Master II

It is always a pain, and not only with Redshift (which is more or less PostgreSQL).

Check the Hortonworks forums about problems with Avro and NiFi: it is all the same. Date errors come up constantly, and int and long values require manually editing the Avro schema, and so on.

The simple fact is that any software positioning itself as "Dynamic" works perfectly with String (varchar etc.) and more or less works with int/long/double.

Sometimes you spend much more time trying to resolve a simple problem, such as "" not being allowed for INT, than you would defining a fixed structure.