I have a use case where I receive a JSON payload, generate an Avro schema and record from it, and publish the record to Kafka. I have configured the Avro value serializer, and the subject's compatibility setting is BACKWARD.
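For reference, the producer setup looks roughly like this (a simplified sketch; the broker and Schema Registry addresses are placeholders, and the serializer classes are the standard Confluent ones I'm assuming here):

import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");          // placeholder broker address
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry URL

Producer<String, GenericRecord> producer = new KafkaProducer<>(props);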
First JSON
String json = "{\n" +
        " \"id\": 1,\n" +
        " \"name\": \"Headphones\",\n" +
        " \"price\": 1250.0,\n" +
        " \"tags\": [\"home\", \"green\"]\n" +
        "}\n";
Version 1 schema registered.
Received message in avro console consumer.
Second JSON.
String json = "{\n" +
        " \"id\": 1,\n" +
        " \"price\": 1250.0,\n" +
        " \"tags\": [\"home\", \"green\"]\n" +
        "}\n";
The schema was registered successfully and the message was sent.
Then I tried sending JSON 1 again, and it was also sent successfully.
Third JSON:
String json = "{\n" +
        " \"id\": 1,\n" +
        " \"name\": \"Headphones\",\n" +
        " \"tags\": [\"home\", \"green\"]\n" +
        "}\n";
I got an error for this case:
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
How is it that the schema generated from the second JSON was registered, but the third one was rejected, even though I didn't provide a default value for the deleted field? Does the Schema Registry always accept the first evolution (the 2nd schema over the 1st)?
Schemas in the Schema Registry
Version 1:
{ "fields": [
{ "doc": "Type inferred from '1'", "name": "id", "type": "int" }, { "doc": "Type inferred from '\"Headphones\"'", "name": "name", "type": "string" }, { "doc": "Type inferred from '1250.0'", "name": "price", "type": "double" }, { "doc": "Type inferred from '[\"home\",\"green\"]'", "name": "tags", "type": { "items": "string", "type": "array" } } ], "name": "myschema", "type": "record" }
Version 2:
{ "fields": [
{ "doc": "Type inferred from '1'", "name": "id", "type": "int" }, { "doc": "Type inferred from '1250.0'", "name": "price", "type": "double" }, { "doc": "Type inferred from '[\"home\",\"green\"]'", "name": "tags", "type": { "items": "string", "type": "array" } } ], "name": "myschema", "type": "record" }
Let's go over the backwards compatibility rules... https://docs.confluent.io/current/schema-registry/avro.html#compatibility-types
First, the default BACKWARD setting isn't transitive, so version 3 is only checked against version 2, not against version 1.
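You can also run that same check yourself before producing, by testing the generated schema against the latest registered version (a sketch; the subject name "mytopic-value" and registry URL are placeholders, thirdSchemaJson is a hypothetical variable holding the schema generated from your third JSON, and older client versions take an org.apache.avro.Schema here while newer ones take a ParsedSchema):

import org.apache.avro.Schema;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);

// Parse the schema generated from the third JSON
Schema incoming = new Schema.Parser().parse(thirdSchemaJson);

// With non-transitive BACKWARD, this only tests against the latest version of the subject
boolean compatible = client.testCompatibility("mytopic-value", incoming);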
The backward rule states that you can delete fields or add optional fields (fields with a default value). I assume your schema-generator tool doesn't know how to emit optional fields, so in practice you can only delete fields, not add them.
Between version 1 and 2, you've deleted the name field, which is valid.
Between version 2 and the incoming version 3, it sees you trying to register a schema that removes price (which is okay) but adds name back as a required field, which is not allowed.
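For the third schema to be accepted over version 2, the re-added name field would need a default, for example something like this (a sketch; modelling it as a nullable union with a null default is just one way to do it):

{
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "name", "type": ["null", "string"], "default": null },
    { "name": "tags", "type": { "items": "string", "type": "array" } }
  ],
  "name": "myschema",
  "type": "record"
}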