Hello,
I have a Talend job that should load CSV data from my local drive into a target Redshift database. I created an empty table through a DDL query in the Redshift UI, but when loading data into that table via Talend with a dynamic schema, all rows go to the reject flow with the error "[Amazon][JDBC](10140) Error converting the value to Timestamp".
However, when I manually insert the same values in the Redshift UI via an INSERT query (DML), the data loads successfully. Likewise, when I set the action on the table to "Drop table if exists and create", the same data loads successfully, but every field ends up with the datatype "varchar". Can anyone help me understand why Talend is complaining about the Timestamp datatype?
Try changing the date pattern to one Redshift accepts: "yyyy-MM-dd HH:mm:ss".
If your source uses a different format, I am not sure whether the dynamic schema will work; you would need to test it.
In that case you may need to convert the date manually with the dateParse / dateFormat functions.
Better to start from the beginning:
what is the current date-time format in the file?
Hello Vapukov,
The datetime format in the CSV file is 2017-03-24T23:20:26.000Z. To cut my question short: is it possible to load CSV data into a Redshift table when the datatypes in the schema are dynamic? If yes, how? I am currently able to do the same thing with SQL Server without any such issues.
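For what it's worth, here is a minimal Java sketch (Talend components run Java under the hood) of the conversion suggested above: parse the ISO-8601 value from the CSV and reformat it to the "yyyy-MM-dd HH:mm:ss" pattern that a Redshift TIMESTAMP column accepts. The class and method names are my own, purely for illustration; inside a job you would do the equivalent in a tJavaRow or a custom routine.

```java
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;

public class RedshiftTimestamp {

    // Convert an ISO-8601 UTC timestamp such as "2017-03-24T23:20:26.000Z"
    // into the "yyyy-MM-dd HH:mm:ss" pattern Redshift's TIMESTAMP accepts.
    static String toRedshiftTimestamp(String iso) {
        OffsetDateTime odt = OffsetDateTime.parse(iso); // handles the trailing 'Z'
        return odt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
    }

    public static void main(String[] args) {
        System.out.println(toRedshiftTimestamp("2017-03-24T23:20:26.000Z"));
        // prints 2017-03-24 23:20:26
    }
}
```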
Does anyone have any solution?
The simplest solution: do not use a dynamic schema.
Well, if I could avoid the dynamic schema, I would not have posted for help on this forum in the first place. Anyway, I have managed to load data from the CSV into the Redshift DB using a dynamic schema for all datatypes except fields that have a timezone. It seems the Redshift JDBC driver no longer supports the timestamptz datatype, so I don't know how to load those fields without driver support.
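One possible workaround for the missing timestamptz support (an assumption on my part, not something the driver documents): normalize the value to UTC in the job and load it into a plain TIMESTAMP column, accepting the loss of the original offset. Again, the names here are illustrative, not a Talend API.

```java
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class TimestamptzWorkaround {

    private static final DateTimeFormatter REDSHIFT_TS =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Shift any offset-bearing ISO-8601 value to UTC and drop the offset,
    // so the target column can be a plain TIMESTAMP instead of TIMESTAMPTZ.
    static String toUtcTimestamp(String isoWithOffset) {
        return OffsetDateTime.parse(isoWithOffset)
                .withOffsetSameInstant(ZoneOffset.UTC)
                .format(REDSHIFT_TS);
    }

    public static void main(String[] args) {
        System.out.println(toUtcTimestamp("2017-03-25T04:50:26.000+05:30"));
        // prints 2017-03-24 23:20:26
    }
}
```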
It's always a pain, and not only with Redshift (which is more or less PostgreSQL).
Check the Hortonworks forums for the problems with Avro and NiFi: it's the same story everywhere. Date errors come up constantly, int and long fields require manually editing the Avro schema, and so on.
The simple fact is that all software positioning itself as "dynamic" works perfectly with strings (varchar etc.) and more or less works with int/long/double.
Sometimes you spend much more time trying to resolve a simple problem, such as "" not being allowed for an INT, than you would spend defining a fixed structure.
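That empty-string-for-INT problem can be handled with a small defensive parse before the value reaches the numeric column. A sketch (the method name is mine, not a Talend routine):

```java
public class SafeIntParse {

    // Treat empty or blank CSV fields as SQL NULL instead of letting
    // Integer.valueOf("") fail the load with a NumberFormatException.
    static Integer parseIntOrNull(String field) {
        if (field == null || field.trim().isEmpty()) {
            return null;
        }
        return Integer.valueOf(field.trim());
    }

    public static void main(String[] args) {
        System.out.println(parseIntOrNull(""));   // prints null
        System.out.println(parseIntOrNull("42")); // prints 42
    }
}
```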