Hi
Using Talend Real-time Big Data Platform Version: 6.4.1 Build id: 20170720_1238-patch
PS: I have already created a support ticket with Talend, trying to raise the question to this community as there might be someone who faced a similar issue.
I am getting an "Exception in thread "main" java.lang.NoSuchMethodError: org.apache.avro.Schema$Field.addProp(Ljava/lang/String;Ljava/lang/ObjectV" error when trying to extract data from a Snowflake table. I have attached a snapshot of the workflow I built. Note that everything worked fine until I added the tHiveConnection component; when I deactivated the Hive-related components, the workflow started working again.
A post here, https://stackoverflow.com/questions/31687550/nosuchmethoderror-writing-avro-object-to-hdfs-using-bui... looks similar to the issue I am facing. My assumption is that there is a conflict between Avro jar versions when the workflow contains both Hive and Snowflake components. I need both of them in my job, so is there a way to fix this?
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.avro.Schema$Field.addProp(Ljava/lang/String;Ljava/lang/ObjectV
at org.talend.components.snowflake.runtime.SnowflakeAvroRegistry.sqlType2Avro(SnowflakeAvroRegistry.java:46)
at org.talend.components.common.avro.JDBCAvroRegistry.inferSchemaResultSet(JDBCAvroRegistry.java:114)
at org.talend.components.common.avro.JDBCAvroRegistry$1.apply(JDBCAvroRegistry.java:52)
at org.talend.components.common.avro.JDBCAvroRegistry$1.apply(JDBCAvroRegistry.java:44)
at org.talend.daikon.avro.AvroRegistry.inferSchema(AvroRegistry.java:154)
at org.talend.components.snowflake.runtime.SnowflakeSourceOrSink.getSchema(SnowflakeSourceOrSink.java:280)
at org.talend.components.snowflake.runtime.SnowflakeSourceOrSink.getEndpointSchema(SnowflakeSourceOrSink.java:265)
at org.talend.components.snowflake.runtime.SnowflakeReader.getSchema(SnowflakeReader.java:82)
at org.talend.components.snowflake.runtime.SnowflakeReader.<init>(SnowflakeReader.java:64)
at org.talend.components.snowflake.runtime.SnowflakeSource.createReader(SnowflakeSource.java:53)
at org.talend.components.snowflake.runtime.SnowflakeSource.createReader(SnowflakeSource.java:25)
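A NoSuchMethodError like the one above usually means an older Avro jar (e.g. one pulled in by the Hive/Hadoop libraries) is shadowing the newer one the Snowflake component expects. As a hedged diagnostic sketch (not from this thread — the class and method names come from the stack trace, everything else is an assumption for illustration), you could run something like this in a tJava component to see which jar actually supplies `org.apache.avro.Schema$Field` at runtime:

```java
import java.security.CodeSource;

// Diagnostic sketch: report which jar supplies a class at runtime, and
// whether the method from the NoSuchMethodError exists on it.
public class AvroJarCheck {

    static String describe(String className) {
        try {
            Class<?> cls = Class.forName(className);
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            String location = (src == null) ? "bootstrap classpath"
                                            : src.getLocation().toString();
            try {
                // addProp(String, Object) exists in newer Avro versions;
                // its absence means an older Avro jar won classpath resolution
                cls.getMethod("addProp", String.class, Object.class);
                return location + " (addProp(String,Object) present)";
            } catch (NoSuchMethodException e) {
                return location + " (addProp(String,Object) MISSING)";
            }
        } catch (ClassNotFoundException e) {
            return className + " not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe("org.apache.avro.Schema$Field"));
    }
}
```

If the printed location points at a Hive/Hadoop-bundled Avro jar with the method missing, that would confirm the jar-conflict theory.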
While Talend fixes the issue (it could take a while), you could split the job into two separate jobs.
Because your components are independent - they do not share the same data flow - you can:
- run them one by one (cron or a TAC execution plan) - this will work 100%
- call them as subjobs from a parent job - this needs testing, to see whether the error affects only a single job or parent/child jobs as well
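The "run them one by one" option above can be sketched as a small launcher, shown here in Java to stay in the job's own language. The launcher paths are placeholders for the *_run.sh scripts Talend generates when you build each job; this is an assumption-laden sketch, not the official way to chain jobs:

```java
import java.io.IOException;

// Sketch: launch the two split jobs sequentially and stop if the first fails.
public class RunJobsSequentially {

    static int run(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command)
                .inheritIO()   // forward the job's console output
                .start();
        return p.waitFor();    // exit code of the launched job
    }

    public static void main(String[] args) throws Exception {
        // Placeholder paths - substitute your built jobs' launchers
        int rc = run("/opt/talend/jobs/hive_lookup/hive_lookup_run.sh");
        if (rc != 0) {
            System.err.println("Hive job failed with exit code " + rc);
            System.exit(rc);   // do not start the Snowflake job
        }
        System.exit(run("/opt/talend/jobs/snowflake_extract/snowflake_extract_run.sh"));
    }
}
```

A cron entry or a TAC execution plan pointed at the two launchers achieves the same thing without extra code.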
@vapukov thanks for the suggestion. Your solution might not work for my use case: the workflow I showed you is not the finished version. In the complete job I would need to read the latest update date from the Hive tables and then go back to Snowflake to fetch the incremental data.
Of course, it would be good when Talend fixes the issue, but until then ... you need some bypass.
We had a similar issue (with different components), and in the meantime we used CSV files (on HDFS) as intermediate storage between subjobs. Ugly ... but it works 🙂
You can pass values between subjobs.
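The intermediate-storage handoff described above can be sketched like this, assuming the split from earlier: subjob 1 writes the watermark it read from Hive to a file, and subjob 2 reads it back to build the incremental Snowflake query. Plain local files stand in for the CSV on HDFS, and the table and column names (`my_table`, `updated_at`) are made up for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: hand a watermark value from one subjob to the next via a file.
public class WatermarkHandoff {

    // Subjob 2: read the watermark back and build the incremental query.
    // (A real job should bind the value as a query parameter, not concatenate.)
    static String buildQuery(Path handoff) throws IOException {
        String lastUpdate = new String(Files.readAllBytes(handoff), "UTF-8").trim();
        return "SELECT * FROM my_table WHERE updated_at > '" + lastUpdate + "'";
    }

    public static void main(String[] args) throws IOException {
        Path handoff = Files.createTempFile("last_update", ".csv");
        // Subjob 1: persist the latest update date fetched from Hive
        Files.write(handoff, "2017-07-20".getBytes("UTF-8"));
        System.out.println(buildQuery(handoff));
    }
}
```

Inside Talend the same handoff is usually done with context variables or globalMap entries rather than hand-written I/O; the file variant has the advantage of surviving across two fully separate job executions.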
Was this issue ever resolved?
I think when a tHMap is introduced with a Snowflake output, it stops working and gives the same exception.
[statistics] connecting to socket on port 3358
[statistics] connected
[statistics] disconnected
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.avro.Schema.getLogicalType()Lorg/apache/avro/LogicalType;
at org.talend.daikon.avro.LogicalTypeUtils.isLogicalDate(LogicalTypeUtils.java:74)
at org.talend.codegen.enforcer.IncomingSchemaEnforcer.put(IncomingSchemaEnforcer.java:446)
at org.talend.codegen.enforcer.IncomingSchemaEnforcer.put(IncomingSchemaEnforcer.java:379)
at reusabledataexchangetemplate.snowflake_to_sap_0_1.SNOWFLAKE_TO_SAP.tFixedFlowInput_1Process(SNOWFLAKE_TO_SAP.java:1924)
at reusabledataexchangetemplate.snowflake_to_sap_0_1.SNOWFLAKE_TO_SAP.tDBConnection_1Process(SNOWFLAKE_TO_SAP.java:689)
at reusabledataexchangetemplate.snowflake_to_sap_0_1.SNOWFLAKE_TO_SAP.runJobInTOS(SNOWFLAKE_TO_SAP.java:2740)
at reusabledataexchangetemplate.snowflake_to_sap_0_1.SNOWFLAKE_TO_SAP.main(SNOWFLAKE_TO_SAP.java:2432)
[statistics] disconnected
I'm not sure if it all falls into the same problem bucket.