I am using the Kafka Connect Cassandra source connector, version 1.0. I have a decimal column (price) in a Cassandra table, and the source connector writes it to a Kafka topic as JSON. The decimal value comes out as a string like "price":"AA==".
Now my Spark Streaming job throws a NumberFormatException when converting that value to a float. What could have gone wrong when writing the value to the Kafka topic?
Thanks in advance.
It looks like the known bug in Kafka Connect's handling of decimals. As proposed in the issue, you'll need to perform a "manual" conversion of the data from the base64-encoded string into a `BigDecimal`.
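A minimal sketch of that conversion in Java: Kafka Connect's JSON converter encodes a `Decimal` as the base64 string of the big-endian two's-complement bytes of its unscaled value, with the scale carried separately in the Connect schema. The class name and the scale value below are illustrative, not part of any connector API:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

public class DecimalDecoder {

    // Decode a Kafka Connect base64-encoded decimal. The bytes are the
    // big-endian two's-complement unscaled value; the scale must be known
    // out-of-band (e.g. from the Connect schema or the Cassandra column).
    static BigDecimal decode(String base64, int scale) {
        byte[] unscaled = Base64.getDecoder().decode(base64);
        return new BigDecimal(new BigInteger(unscaled), scale);
    }

    public static void main(String[] args) {
        // "AA==" is the single byte 0x00, i.e. an unscaled value of 0.
        System.out.println(decode("AA==", 2));
    }
}
```

With the decoded `BigDecimal` in hand, `floatValue()` (or `doubleValue()`) gives the numeric value your Spark job expects, instead of failing on the raw base64 string.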