Hello All,
My requirement is to load data from Excel into a SQL Server table.
I'm supposed to validate my data in a tMap to check for null values.
Here are my sample validations:
1. Integer to integer (source and target both integer): (row1.XXAPID == null) ? null : row1.XXAPID
2. Date type: row1.INVOICE_DATE.equals("null") ? null : TalendDate.parseDate("MM-dd-yyyy HH:mm:ss", row1.INVOICE_DATE) (also checked with TalendDate.formatDate)
3. String to string: row1.LINE_TYPE_LOOKUP_CODE.equals("null") ? null : row1.LINE_TYPE_LOOKUP_CODE
4. Float to float: (row1.DIST_ITEM_AMOUNT == null) ? null : row1.DIST_ITEM_AMOUNT.floatValue()
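The checks above can be sketched as plain Java outside the tMap; `Row` here is a hypothetical stand-in for the tMap input row, and the `"null"`-string guard reflects how blank Excel cells often arrive as the literal text "null":

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class RowValidation {
    // Hypothetical stand-in for the tMap input row (row1)
    static class Row {
        Integer XXAPID;
        String INVOICE_DATE;         // dates arrive from Excel as strings
        String LINE_TYPE_LOOKUP_CODE;
        Double DIST_ITEM_AMOUNT;
    }

    // 1. Integer -> Integer: pass the value through, keeping nulls as nulls
    static Integer validateId(Row row1) {
        return (row1.XXAPID == null) ? null : row1.XXAPID;
    }

    // 2. String -> Date: guard against both a real null and the literal "null",
    //    and turn unparseable values into null instead of failing the row
    static Date validateInvoiceDate(Row row1) {
        if (row1.INVOICE_DATE == null || row1.INVOICE_DATE.equals("null")) {
            return null;
        }
        try {
            return new SimpleDateFormat("MM-dd-yyyy HH:mm:ss").parse(row1.INVOICE_DATE);
        } catch (ParseException e) {
            return null;
        }
    }

    // 3. String -> String: null-check BEFORE calling equals(), otherwise a
    //    genuinely null cell throws a NullPointerException
    static String validateLookupCode(Row row1) {
        return (row1.LINE_TYPE_LOOKUP_CODE == null || row1.LINE_TYPE_LOOKUP_CODE.equals("null"))
                ? null
                : row1.LINE_TYPE_LOOKUP_CODE;
    }

    // 4. Numeric -> Float: unbox only when non-null
    static Float validateAmount(Row row1) {
        return (row1.DIST_ITEM_AMOUNT == null) ? null : row1.DIST_ITEM_AMOUNT.floatValue();
    }
}
```

Note the extra `== null` check in front of each `equals("null")` call; without it, a truly null field crashes the expression, which is a common source of tMap errors.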
I'm facing the following error while running my job.
Can anyone please help me resolve this? It's very urgent.
I would like to use something like the code below:
(row1.GL_DATE != null && row1.GL_DATE.equals(TalendDate.formatDate("M-DD-yyyy")) && row1.GL_DATE.equals(TalendDate.formatDate("MM-dd-yyyy"))) ? TalendDate.parseDate("EEE MMM dd HH:mm:ss 'IST' yyyy", row1.GL_DATE) : null
Is this possible?
Using the format "EEE MMM dd HH:mm:ss 'IST' yyyy", the date issue got solved for values like "Mon Nov 04 00:00:00 IST 2013".
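A minimal check of that pattern, assuming the value really is the java.util.Date.toString()-style string shown above; Locale.ENGLISH is forced so "Mon" and "Nov" parse regardless of the JVM's default locale:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class IstDateParse {
    // Pattern matching Date.toString() output; 'IST' is quoted, so it is
    // treated as literal text, not interpreted as a time zone
    static Date parseIst(String value) {
        try {
            return new SimpleDateFormat("EEE MMM dd HH:mm:ss 'IST' yyyy", Locale.ENGLISH)
                    .parse(value);
        } catch (ParseException e) {
            return null; // value did not match the pattern
        }
    }
}
```

Because 'IST' is quoted, the time-zone text is skipped rather than applied; the result is interpreted in the JVM's default time zone.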
Then what are the possibilities for handling all the date formats that occur in the same column?
I would suggest using a routine to parse the dates in that case.
Take a look at the thread below on Stack Overflow:
Is this the only option?
Can't we validate by writing a condition?
I can't really think of anything else. Moreover, a routine here would be a safer and easier solution than anything else.
I think the pattern "M-d-yyyy" should do the job (lowercase "yyyy": uppercase "YYYY" is Java's week-year and gives wrong results around year boundaries).
Sorry, I read only part of your post.
No, you cannot, because one of the parses will fail with an exception, not with a condition you can test for.
You need a routine that checks the possible formats, or better, tries to work out which one to use (maybe through a regexp).
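A minimal sketch of such a routine, assuming just the two patterns seen in this thread (extend the array with whatever formats actually occur in the column); it returns null instead of throwing when nothing matches, so it can be called directly from a tMap expression:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class DateRoutine {
    // Candidate patterns, tried in order; adjust to the formats seen in the data
    private static final String[] PATTERNS = {
        "MM-dd-yyyy HH:mm:ss",
        "EEE MMM dd HH:mm:ss 'IST' yyyy"
    };

    public static Date parseAnyFormat(String value) {
        if (value == null || value.isEmpty() || value.equals("null")) {
            return null;
        }
        for (String pattern : PATTERNS) {
            SimpleDateFormat fmt = new SimpleDateFormat(pattern, Locale.ENGLISH);
            fmt.setLenient(false); // reject near-misses instead of silently adjusting
            try {
                return fmt.parse(value);
            } catch (ParseException ignored) {
                // fall through and try the next pattern
            }
        }
        return null; // no pattern matched
    }
}
```

Trying patterns in order is exactly the "parse fails with an exception" situation described above, but the routine contains the exception handling in one place instead of spreading try/catch logic across tMap expressions.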
If possible, I suggest normalizing the dates to a single format; if not, it will be a nightmare, with some performance impact if the data becomes huge.