Error retrieving Avro schema for id 1, Subject not found


Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 1
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401

Confluent Version 4.1.0

I am consuming data from two topics (topic_1, topic_2) as KTables, joining them, and then pushing the result onto another topic (topic_out) as a KStream (via KTable.toStream()).

The data is in Avro format.
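
Simplified, the topology looks roughly like the following. This is only a sketch: the class name, the join logic, and in particular the serde wiring are approximations, not my exact code.

import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.Consumed;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Collections;
import java.util.Properties;

public class TopicJoinApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topic-join-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        // Avro serde for the record values, configured against the local Schema Registry
        GenericAvroSerde valueSerde = new GenericAvroSerde();
        valueSerde.configure(
                Collections.singletonMap("schema.registry.url", "http://localhost:8081"),
                false /* isKey */);

        StreamsBuilder builder = new StreamsBuilder();
        KTable<String, GenericRecord> table1 =
                builder.table("topic_1", Consumed.with(Serdes.String(), valueSerde));
        KTable<String, GenericRecord> table2 =
                builder.table("topic_2", Consumed.with(Serdes.String(), valueSerde));

        // Join the two tables on their key and write the result to topic_out
        table1.join(table2, (left, right) -> left /* real merge logic omitted */)
              .toStream()
              .to("topic_out", Produced.with(Serdes.String(), valueSerde));

        new KafkaStreams(builder.build(), props).start();
    }
}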

When I check the registered subjects using

curl -X GET http://localhost:8081/subjects/ 

I find

topic_1-value
topic_1-key
topic_2-value
topic_2-key
topic_out-value

but there is no topic_out-key subject. Why is it not created?

output from topic_out:

kafka-avro-console-consumer --bootstrap-server localhost:9092 --from-beginning --property print.key=true --topic topic_out

"code1  "   {"code":{"string":"code1  "},"personid":{"string":"=NA="},"agentoffice":{"string":"lic1        "},"status":{"string":"a"},"sourcesystem":{"string":"ILS"},"lastupdate":{"long":1527240990138}}

I can see that a key is being written, but there is no subject for it.

Why is a key subject required?
I am feeding this topic into another connector (hdfs-sink) to push the data to HDFS, but it fails with the error below:

Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 5
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401

When I look at the Schema Registry logs, I can see:

[2018-05-24 15:40:06,230] INFO 127.0.0.1 - - [24/May/2018:15:40:06 +0530] "POST /subjects/topic_out-key?deleted=true HTTP/1.1" 404 51  9 (io.confluent.rest-utils.requests:77)

Any idea why the topic_out-key subject is not being created?

1 Answer

Any idea why the subject topic_out-key is not being created?

Because the key of your Kafka Streams output is a plain String, not an Avro-encoded string.
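
To make that concrete, here is a small sketch of what the two serializers do (the topic name and key value are taken from the question; the value schema is made up). A plain StringSerializer just writes UTF-8 bytes and never contacts the Schema Registry, while the Confluent Avro serializer registers a subject the first time it serializes for a topic (with the default auto.register.schemas=true). That is presumably also why topic_1-key and topic_2-key exist: their producers used Avro keys.

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Collections;

public class SubjectRegistrationSketch {
    public static void main(String[] args) {
        // String key: plain UTF-8 bytes, no Schema Registry call,
        // so no topic_out-key subject is ever created.
        StringSerializer keySerializer = new StringSerializer();
        byte[] keyBytes = keySerializer.serialize("topic_out", "code1  ");

        // Avro value: the serializer registers the subject "topic_out-value"
        // (it would register "topic_out-key" if configured with isKey = true).
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Demo\","
              + "\"fields\":[{\"name\":\"status\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(schema);
        value.put("status", "a");

        KafkaAvroSerializer valueSerializer = new KafkaAvroSerializer();
        valueSerializer.configure(
                Collections.singletonMap("schema.registry.url", "http://localhost:8081"),
                false /* isKey */);
        byte[] valueBytes = valueSerializer.serialize("topic_out", value);
    }
}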

You can verify this by using kafka-console-consumer (not the Avro console consumer) and adding --property print.value=false: the keys print as plain text with no special characters, whereas when you do print the values you see the extra non-printable bytes that mark binary Avro data.
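
For example, with the same broker and topic as in the question:

kafka-console-consumer --bootstrap-server localhost:9092 --topic topic_out --from-beginning --property print.key=true --property print.value=false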

In Kafka Connect, you must therefore use Kafka's StringConverter class for the key.converter property rather than the Confluent Avro converter.
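
For the HDFS sink that would look something like the following (the connector name, HDFS URL, and flush size are placeholders):

name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=topic_out
hdfs.url=hdfs://localhost:8020
flush.size=3
# Plain string keys: no Schema Registry lookup for the key
key.converter=org.apache.kafka.connect.storage.StringConverter
# Avro values still go through the Schema Registry
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081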
