I'm working with Talend, using tBigQuery components as both my source and target. In my Product_Orders table the columns are ID (int), Product_name (string), Price (int), and Order_date (timestamp). However, when I retrieve the schema with tBigQueryInput, the ID datatype becomes Long and Order_date becomes String.

Given this, I'm struggling to perform an incremental load because of the datatype differences between the source and target schemas. How can I handle this mismatch when loading incrementally from a source (tBigQuery) where the datatypes are transformed (ID becomes Long and Order_date becomes a String such as 1700827138.567) into a target (also tBigQuery) with the expected datatypes (ID as int and Order_date as timestamp)? Any guidance or best practices for handling this discrepancy during the incremental load would be greatly appreciated.

I also encountered an issue while attempting to convert the Order_date column to a date within a tMap expression. The code I'm using works on the string value, but I'm having difficulty getting a date out of it.
This is the code:

// Split "1700827138.567" into epoch seconds and milliseconds, build a Date,
// shift it back 5 hours and 30 minutes, then format it.
// Note: the outer formatDate returns a String, not a Date; to feed a
// timestamp column, drop the outer formatDate and output the Date itself.
TalendDate.formatDate("yyyy-MM-dd HH:mm:ss.SSS",
    TalendDate.addDate(
        TalendDate.addDate(
            new java.util.Date(
                Long.parseLong(row1.Order_date.split("\\.")[0]) * 1000
                + Long.parseLong(row1.Order_date.split("\\.")[1])),
            -5, "HH"),
        -30, "mm"))
For your first issue, it sounds like you don't have the schema correctly defined within Talend.

For the second issue, I'd suggest splitting that large expression into a series of steps in the tMap variables box, so you can see where it's going wrong. You should also be careful with rows where the timestamp doesn't have a decimal element.
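To illustrate the "split it into steps" suggestion, here is a minimal plain-Java sketch of the same conversion, with a guard for timestamps that arrive without a decimal element. The method names (parseEpochString, shiftBack) are hypothetical, and SimpleDateFormat/Calendar stand in for Talend's TalendDate routines so the steps can be tested outside a job:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class OrderDateSketch {

    // Step 1: turn "1700827138.567" (epoch seconds, optional millis) into a Date.
    static Date parseEpochString(String raw) {
        String[] parts = raw.split("\\.");
        long seconds = Long.parseLong(parts[0]);
        // Guard: some rows may arrive without a decimal element ("1700827138").
        long millis = parts.length > 1 ? Long.parseLong(parts[1]) : 0L;
        return new Date(seconds * 1000L + millis);
    }

    // Step 2: shift back 5 hours and 30 minutes
    // (what the two nested TalendDate.addDate calls do).
    static Date shiftBack(Date d) {
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
        cal.setTime(d);
        cal.add(Calendar.HOUR_OF_DAY, -5);
        cal.add(Calendar.MINUTE, -30);
        return cal.getTime();
    }

    // Step 3: only format to a String at the very end, and only if a String
    // is needed; the Date from step 2 is what a timestamp column should get.
    static String format(Date d) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.format(d);
    }

    public static void main(String[] args) {
        Date d = shiftBack(parseEpochString("1700827138.567"));
        System.out.println(format(d)); // 2023-11-24 06:28:58.567 (UTC)
    }
}
```

In a tMap you would put each step in its own Var entry, which makes it obvious which stage produces an unexpected value. Note the guard in step 1 only handles a missing fraction; a fraction shorter than three digits (e.g. ".5") would still need padding before being treated as milliseconds.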