I am completely new to Kafka and Avro and am trying to use the Confluent package. We have existing POJOs we use for JPA, and I'd like to be able to simply produce an instance of my POJOs without having to manually reflect each value into a GenericRecord. I seem to be missing how this is done in the documentation.
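To make it concrete, our entities look roughly like this (Customer is a simplified, made-up example; the real classes are ordinary JPA entities with more fields):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Customer {
    @Id
    private Long id;

    @Column(name = "full_name")
    private String fullName;

    private String email;

    // getters, setters, equals/hashCode omitted
}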
The examples use a GenericRecord and set each value one by one, like so:
String key = "key1";
String userSchema = "{\"type\":\"record\"," +
        "\"name\":\"myrecord\"," +
        "\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}";
Schema.Parser parser = new Schema.Parser();
Schema schema = parser.parse(userSchema);

// Build the record by hand, one field at a time
GenericRecord avroRecord = new GenericData.Record(schema);
avroRecord.put("f1", "value1");

ProducerRecord<Object, Object> record = new ProducerRecord<>("topic1", key, avroRecord);
try {
    producer.send(record);
} catch (SerializationException e) {
    // may need to do something with it
}
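For reference, I'm assuming the producer in that snippet is configured with the Confluent Avro serializer, roughly like this (the broker and schema registry URLs are placeholders for my environment):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");            // placeholder broker
props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("schema.registry.url", "http://localhost:8081");   // placeholder registry
Producer<Object, Object> producer = new KafkaProducer<>(props);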
There are several examples of getting a schema from a class, and I found the annotations for altering that schema as necessary (I've put a sketch of what I mean after the snippet below). Now, how do I take an instance of a POJO, just send it to the serializer as-is, and have the library do the work of matching up the schema from the class and then copying the values into a GenericRecord? Am I going about this all wrong? What I want to end up doing is something like this:
String key = "key1";
Schema schema = ReflectData.get().getSchema(myObject.getClass());

// This is the call I'm hoping exists in some form:
// hand over my POJO and get a populated record back
GenericRecord avroRecord = ReflectData.get().getRecord(myObject, schema);

ProducerRecord<Object, Object> record = new ProducerRecord<>("topic1", key, avroRecord);
try {
    producer.send(record);
} catch (SerializationException e) {
    // may need to do something with it
}
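In case it helps, here is what I mean by the annotations: my understanding is that the org.apache.avro.reflect annotations would let me shape the generated schema on the same hypothetical Customer class from above (I haven't verified this end to end yet):

import org.apache.avro.Schema;
import org.apache.avro.reflect.AvroIgnore;
import org.apache.avro.reflect.AvroName;
import org.apache.avro.reflect.Nullable;
import org.apache.avro.reflect.ReflectData;

// Hypothetical stand-in for one of our JPA entities
public class Customer {
    private Long id;

    @AvroName("full_name")   // rename the field in the generated schema
    private String fullName;

    @Nullable                // make the field a union with null
    private String email;

    @AvroIgnore              // leave this field out of the schema entirely
    private String internalNotes;
}

// Getting the schema from the class looks straightforward:
Schema schema = ReflectData.get().getSchema(Customer.class);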
Thanks!