1, Why
We use Kafka heavily in our project, and producers and consumers may live in different modules/microservices. Producers and consumers on the same topic share an identical Kafka message payload structure. From an engineering point of view, we don't want to define the same payload structure in multiple modules, so we have a schema module that handles this. In the schema module we define Avro schema files and use Apache Avro tools to generate the corresponding Java/Kotlin classes. Other modules can then import the common library to use these generated classes where necessary.
2, How
After you check out our SaFi repository, please import SaFiMono/common/schema into your favorite IDE, then copy an existing file ending with '.avsc' and modify it accordingly. For tips on Avro schema grammar, please refer to https://avro.apache.org/docs/1.7.2/#schemas. The following steps walk you through the procedure to create a schema file, say TestAvro in directory test (test.TestAvro).
create file TestAvro-v1.avsc
{
  "type": "record",
  "name": "TestAvro",
  "namespace": "ph.safibank.avro.audit",
  "fields": [
    { "name": "customerId", "type": "string" },
    { "name": "userId", "type": ["null", "string"] },
    { "name": "ticketId", "type": ["null", "string"] },
    { "name": "source", "type": "string" },
    { "name": "communicationId", "type": "string" }
  ]
}
add an entry to topicSchemasDefinitions.json like this
{ "name": "${Your_Kafka_Topic_Name}", "schema": "test.TestAvro" }
Replace the placeholder ${Your_Kafka_Topic_Name} with your real Kafka topic name. We use this file later to register the schema definitions with Kafka.
run the following script to validate the newly added schema file
sh localAvroValidator.sh schemas/test/TestAvro-v1.avsc
execute the following script to publish the generated jar file to your local maven repository
sh localAvroGenerator.sh schemas/test/TestAvro-v1.avsc
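Once the jar is published locally, a consuming module can resolve it from the local Maven repository. A minimal sketch of the Gradle setup, assuming a Kotlin DSL build file; the group/artifact/version coordinates below are placeholders, so check the output of localAvroGenerator.sh for the real ones:

```kotlin
// build.gradle.kts of the consuming module
// (coordinates are assumptions; verify against what localAvroGenerator.sh publishes)
repositories {
    mavenLocal()    // resolve the locally published schema jar first
    mavenCentral()
}

dependencies {
    implementation("ph.safibank:schema:1.0.0-SNAPSHOT") // hypothetical coordinates
}
```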
finally you can use ph.safibank.avro.audit.TestAvro in your module after you add the common dependency entry to your Gradle/Maven file
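As an illustration of what consuming code looks like, the sketch below uses a hand-written data class that mirrors the shape of the generated TestAvro class; the real class is produced by the Avro code generation step above, so treat the field names and nullability here as read off the example schema, not as the generated API itself:

```kotlin
// Hand-written stand-in mirroring the shape of the generated TestAvro class;
// in a real module this class comes from the Avro code generation step.
data class TestAvro(
    val customerId: String,
    val userId: String?,          // ["null", "string"] union -> nullable
    val ticketId: String?,
    val source: String,
    val communicationId: String,
)

fun main() {
    // Build a message payload the way a producer module would.
    val payload = TestAvro(
        customerId = "cust-123",
        userId = null,
        ticketId = null,
        source = "onboarding-service",
        communicationId = "comm-456",
    )
    println(payload.customerId) // prints "cust-123"
}
```

Note that the two fields declared as `["null", "string"]` unions in the schema map to nullable types, while plain `"string"` fields are non-nullable.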