I'm using the Kafka Connect HDFS Connector with Hive integration to write Avro topics that contain Decimal fields.

When reading the data through Hive, I get the following exception:

```
Error: java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.BytesWritable (state=,code=0)
```

I think the cause is that the connector creates these fields with the `binary` data type in Hive instead of `DECIMAL`.
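For context, this behavior is consistent with how Connect models decimals: the `Decimal` logical type is a `BYTES` schema tagged with a logical-type name, so any schema-to-Hive conversion that inspects only the primitive type would emit `binary`. A minimal sketch showing the underlying schema (the demo class name is mine, not part of the connector):

```java
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class DecimalSchemaDemo {
    public static void main(String[] args) {
        // Connect's Decimal logical type is a BYTES schema tagged with a
        // logical-type name; the decimal semantics live only in that name
        // and the scale parameter.
        Schema decimalSchema = Decimal.schema(2); // scale = 2

        System.out.println(decimalSchema.type()); // BYTES
        System.out.println(decimalSchema.name()); // org.apache.kafka.connect.data.Decimal

        // A Hive schema conversion that looks only at type() and ignores
        // name() would map this field to `binary` rather than DECIMAL,
        // which matches the ClassCastException above.
    }
}
```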