I am using the Kafka Connect Cassandra source connector, version 1.0. I have a decimal column (price) in a Cassandra table and I am writing it to a Kafka topic as JSON from the source connector. It writes the decimal value in a string format like "price":"AA==".
Now my Spark Streaming job throws a NumberFormatException while converting it to a float. Please suggest what may have gone wrong while writing the value to the Kafka topic.
Thanks in advance.
Answer 1:
It looks like a known bug with Kafka Connect and Decimal logical types. As proposed in the issue, you'll need to perform a "manual" conversion of the data from the base64-encoded string into a BigDecimal:
// Decode the base64 string into the unscaled bytes, then rebuild the BigDecimal
// using the scale of the Cassandra decimal column.
BigDecimal bigDecimal = new BigDecimal(
        new BigInteger(Base64.getDecoder().decode("BfXhAA==")), scale);
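For reference, a minimal self-contained sketch of that conversion might look like the following (the decodePrice helper name and the scale of 2 are just illustrative assumptions; use whatever scale your Cassandra decimal column actually has):

import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

public class DecimalDecoder {

    // Rebuilds a BigDecimal from the base64 string that Kafka Connect wrote for a Decimal field.
    // The decoded bytes are the unscaled value; the scale must match the Cassandra column's scale.
    static BigDecimal decodePrice(String base64Value, int scale) {
        byte[] unscaledBytes = Base64.getDecoder().decode(base64Value);
        return new BigDecimal(new BigInteger(unscaledBytes), scale);
    }

    public static void main(String[] args) {
        // "BfXhAA==" decodes to the unscaled integer 100000000; with scale 2 that is 1000000.00
        BigDecimal price = decodePrice("BfXhAA==", 2);
        System.out.println(price);               // 1000000.00
        System.out.println(price.floatValue());  // convert further to float if your Spark job needs it
    }
}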