I am upgrading the Kafka API in my Spark Scala app to v0.10. I used to have a custom method for deserializing the message, which arrives as a byte string.
I have realized there is a way to pass `StringDeserializer` or `ByteArrayDeserializer` as a parameter for either the key or the value.
However, I cannot find any information on how to create a custom Avro schema deserializer so that my kafkaStream can use it when I call `createDirectStream` and consume data from Kafka.
Is it possible?
It is possible. You need to implement the `Deserializer<T>` interface defined in `org.apache.kafka.common.serialization`, and point `key.deserializer` or `value.deserializer` to your custom class via the Kafka parameters you pass to the `ConsumerStrategy[K, V]`. For example:
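Here is a minimal sketch of such a deserializer, assuming the values are plain binary-encoded Avro `GenericRecord`s and the writer schema is available as a classpath resource; the class name and schema file path are illustrative, not part of the original answer:

```scala
import java.util

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory
import org.apache.kafka.common.serialization.Deserializer

// Hypothetical custom deserializer: turns Avro-encoded bytes into GenericRecords.
class AvroGenericDeserializer extends Deserializer[GenericRecord] {

  // Writer schema, loaded once from the classpath (the resource path is an assumption).
  private val schema: Schema =
    new Schema.Parser().parse(getClass.getResourceAsStream("/my-record.avsc"))

  private val reader = new GenericDatumReader[GenericRecord](schema)

  override def configure(configs: util.Map[String, _], isKey: Boolean): Unit = ()

  override def deserialize(topic: String, data: Array[Byte]): GenericRecord = {
    if (data == null) null
    else {
      val decoder = DecoderFactory.get().binaryDecoder(data, null)
      reader.read(null, decoder)
    }
  }

  override def close(): Unit = ()
}
```

And then you point `value.deserializer` at that class in the Kafka parameters map and hand it to `createDirectStream` through the consumer strategy. The broker address, group id, topic name, and the `streamingContext` value below are placeholders for whatever your app already uses:

```scala
import org.apache.avro.generic.GenericRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Kafka parameters; value.deserializer points to the custom Avro deserializer.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[AvroGenericDeserializer],
  "group.id" -> "my-consumer-group",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val topics = Array("my-topic")

// streamingContext is an existing StreamingContext in your application.
val stream = KafkaUtils.createDirectStream[String, GenericRecord](
  streamingContext,
  PreferConsistent,
  Subscribe[String, GenericRecord](topics, kafkaParams)
)

// Each record's value now arrives as an already-deserialized GenericRecord.
stream.map(record => (record.key, record.value))
```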