
Handling BigDecimal
I found an issue with handling BigDecimal data conversion in the tMap.
The scenario: I am reading Hive data with tHiveInput, and some of the fields are defined as bigint in Hive, which Talend treats as BigDecimal by default. I am writing the data to Oracle with tDBPOutput. For the target, I am converting the BigDecimal fields to Integer in the tMap using the .intValue() method.
It works fine when the source fields have values. However, when they are blank (null), I get a NullPointerException in the tMap.
Any thoughts on the best way to handle this?
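For reference, the usual fix inside tMap (whose expressions are plain Java) is a null-safe ternary before calling .intValue(). This is a minimal sketch compiled outside Studio; the default value of 0 is an assumption, so substitute whatever your target column expects:

```java
import java.math.BigDecimal;

public class NullSafeIntValue {
    // Mirrors a tMap output expression: BigDecimal -> Integer with a null guard.
    // Without the guard, a null input makes .intValue() throw a NullPointerException.
    static Integer toInt(BigDecimal amount) {
        return amount == null ? 0 : amount.intValue();
    }

    public static void main(String[] args) {
        System.out.println(toInt(new BigDecimal("42"))); // prints 42
        System.out.println(toInt(null));                 // prints 0, no NPE
    }
}
```

In the tMap itself this would look like `row1.myCol == null ? 0 : row1.myCol.intValue()`, where `row1.myCol` is a hypothetical column name.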
Accepted Solutions

Hi Nikhil,
I do not want to change the datatype in the schema to String, as there could be several such columns.
I think tConvertType is working fine for the BigDecimal-to-Integer conversion.
I do see another issue with Date, though. I will research some more and post it separately, since this thread's BigDecimal question is solved via tConvertType.

Hi,
You can either handle the null in tMap using the Relational.ISNULL routine, or use the tConvertType component so that Talend does the data conversion for you.
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂
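To illustrate the first suggestion above: Relational.ISNULL is a routine that ships with Talend, so inside a job you call it directly in the tMap expression. The sketch below defines a local stand-in with the same behavior only so it compiles outside Studio; the column name in the comment is a hypothetical example:

```java
import java.math.BigDecimal;

public class RelationalIsNullSketch {
    // Local stand-in for Talend's Relational.ISNULL routine, included here only
    // so this sketch runs outside Studio; inside a job the routine already exists.
    static boolean ISNULL(Object value) {
        return value == null;
    }

    public static void main(String[] args) {
        BigDecimal fromHive = null; // simulates a blank bigint column from Hive
        // Equivalent tMap expression (hypothetical column name):
        //   Relational.ISNULL(row1.myCol) ? 0 : row1.myCol.intValue()
        Integer target = ISNULL(fromHive) ? 0 : fromHive.intValue();
        System.out.println(target); // prints 0
    }
}
```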

Hi Nikhil,
I already tried ISNULL with the BigDecimal and it threw an error. I had to explicitly handle the NULLs as 0s to be able to load the data.
The error when using ISNULL is:
Detail Message: The method ISNULL(Integer) is undefined for the type <job name>
Do you know if there is anything to check, since it's not working as expected?
Thanks
Siddartha

Hi,
When you read the data, could you please mark the column as String instead of BigDecimal in your schema? Then you can do the null check and conversion easily. Some screenshots would be appreciated to help understand the issue better.
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

I like the tConvertType solution. It is a cool feature. I have yet to validate the results, but at least the job runs successfully.

Perfect!
Could you please mark the topic as resolved after validation?
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

