Anonymous
Not applicable

Retrieving Data from Big Data - Hadoop (Hive) DB and Getting Illegal Conversion Error

Getting the error below:

Starting job hw_prod_individual at 17:32 21/07/2017.

[statistics] connecting to socket on port 3857
[statistics] connected
Exception in component tHiveInput_1
java.sql.SQLException: Illegal conversion
at org.apache.hive.jdbc.HiveBaseResultSet.getBigDecimal(HiveBaseResultSet.java:135)
at local_project.hw_prod_individual_0_1.hw_prod_individual.tHiveInput_1Process(hw_prod_individual.java:4099)
at local_project.hw_prod_individual_0_1.hw_prod_individual.tNetezzaInput_1Process(hw_prod_individual.java:2214)
at local_project.hw_prod_individual_0_1.hw_prod_individual.runJobInTOS(hw_prod_individual.java:4762)
at local_project.hw_prod_individual_0_1.hw_prod_individual.main(hw_prod_individual.java:4619)
[statistics] disconnected
Job hw_prod_individual ended at 17:32 21/07/2017. [exit code=1]

Please check the issue and help resolve it.

Thanks in advance.

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

Hi Team,
We changed the type mapping in the Metadata of TalendType so that BIGINT now defaults to id_Long instead of id_BigDecimal, as below:

<dbType type="BIGINT">
    <talendType type="id_BigDecimal"/>
    <talendType type="id_Byte"/>
    <talendType type="id_Integer"/>
    <talendType type="id_Long" default="true"/>
</dbType>
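
A note on where this mapping lives, for anyone searching later: in Talend Studio the type-mapping XML is usually reachable via Window > Preferences > Talend > Specific Settings > Metadata of TalendType (select the Hive mapping and edit it there). The exact menu path can differ between Studio versions, so treat this as a pointer rather than a guarantee.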


8 Replies
Anonymous
Not applicable
Author

This looks like the code is trying to convert a field in your Hive table to BigDecimal and throwing an error: double-check all your fields to make sure you aren't trying to convert a String to a numeric value.

Good luck!

David

Anonymous
Not applicable
Author

Hi,

Are you using the getBigDecimal method in Hive? Could you please try changing your data schema (using Double instead of BigDecimal) to see if it works?

Best regards

Sabrina

Anonymous
Not applicable
Author

Hi Sabrina,

We could change the schema data type from BigDecimal to Double or Long if we had only one or two tables. But our use case involves around 600 tables, so it is not practical to change every schema's data types manually while retrieving data from Hive.

Note: it is not only BIGINT; the TIMESTAMP data type also has no workable mapping in Talend. With one or two tables we could change the schema data types manually, but with 500 or more that is neither practical nor efficient.

Can you please let me know how to retrieve table data from Hive when the tables use the BIGINT and TIMESTAMP data types?

Thanks & Regards,
N.ThirupathiRayudu
Anonymous
Not applicable
Author

Thanks Sabrina and David for your replies.
Hi Team,
The issue with the BigDecimal data type is resolved: I changed the Metadata of TalendType mapping so that BIGINT defaults to "id_Long".

But while retrieving the TIMESTAMP data type from Hive, I get the same Illegal conversion error, as below:
[statistics] connecting to socket on port 4068
[statistics] connected
Exception in component tHiveInput_1
java.sql.SQLException: Illegal conversion
at org.apache.hive.jdbc.HiveBaseResultSet.getTimestamp(HiveBaseResultSet.java:577)
at routines.system.JDBCUtil.getDate(JDBCUtil.java:61)
at local_project.read_hive_table_hw_prod_individual_0_1.read_Hive_Table_hw_prod_individual.tHiveInput_1Process(read_Hive_Table_hw_prod_individual.java:1295)
at local_project.read_hive_table_hw_prod_individual_0_1.read_Hive_Table_hw_prod_individual.runJobInTOS(read_Hive_Table_hw_prod_individual.java:2002)
at local_project.read_hive_table_hw_prod_individual_0_1.read_Hive_Table_hw_prod_individual.main(read_Hive_Table_hw_prod_individual.java:1859)
[statistics] disconnected
Job read_Hive_Table_hw_prod_individual ended at 18:22 24/07/2017. [exit code=1]

Please check whether there is any way to change the data type other than editing the schema manually.
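
If the same mapping-file approach extends to timestamps, a hypothetical entry might look like the sketch below. The idea: making id_String the default for TIMESTAMP would make the generated job read the column with getString() instead of the getTimestamp() call that is failing. Both the TIMESTAMP dbType name and the choice of id_String are assumptions; check your own Hive mapping file for the actual entry before changing it.

<!-- A sketch, not a verified fix: assumes the Hive mapping file has a
     TIMESTAMP entry and that reading the column as a String is acceptable. -->
<dbType type="TIMESTAMP">
    <talendType type="id_Date"/>
    <talendType type="id_String" default="true"/>
</dbType>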

Thanks & Regards,
Ntrayudu
Anonymous
Not applicable
Author

If you're talking about changing the schema in the Hive tables, the best way is probably through a shell script and the Hive CLI: you would list the tables and store them in a text file, then use that file as input to a script that invokes the Hive CLI to modify the schema of each table in turn. To do it in Talend, you *might* be able to Iterate over the tables and update each schema that way, but I've never done this: it's just a thought.
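
A minimal sketch of that idea, assuming the hive CLI is on the PATH and using a placeholder database name (my_db); it lists the tables into a file, finds TIMESTAMP columns in each table, and retypes them to STRING. The target type, and whether retyping columns is safe for your data, are assumptions to verify before running anything like this:

#!/usr/bin/env bash
# Hypothetical sketch -- adjust the database, the matched type, and the target type.
DB="my_db"                                   # placeholder database name

# 1. List every table in the database into a text file.
hive -S -e "USE ${DB}; SHOW TABLES;" > tables.txt

# 2. For each table, find columns of the offending type and retype them.
while read -r TABLE; do
  # DESCRIBE prints: col_name  data_type  comment
  hive -S -e "USE ${DB}; DESCRIBE ${TABLE};" </dev/null |
    awk '$2 == "timestamp" {print $1}' |
    while read -r COL; do
      # CHANGE keeps the column name; only the type is altered.
      # </dev/null keeps hive from consuming the loop's stdin.
      hive -S -e "USE ${DB}; ALTER TABLE ${TABLE} CHANGE ${COL} ${COL} STRING;" </dev/null
    done
done < tables.txt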

David
Anonymous
Not applicable
Author

Thanks, David, for your reply and the suggestion to change the data types using a shell script and the Hive CLI.

Can you please give a sample shell script that changes a table's data type via the Hive CLI?

Thanks in advance.

TamilM
Contributor II

Can anyone please tell me where I need to change these settings (screenshots would be appreciated)?

Metadata of TalendType for BigDecimal, with "id_Long" as the default:
<dbType type="BIGINT">
    <talendType type="id_BigDecimal"/>
    <talendType type="id_Byte"/>
    <talendType type="id_Integer"/>
    <talendType type="id_Long" default="true"/>
</dbType>
I'm facing a similar issue while converting BIGINT and other data types from Hive through Talend to a SQL Server target.

Also, can anyone explain in detail the mapping tables between Hive types and Talend data types?