Our new consumer, based on the latest version of the schema, will be able to successfully read the messages:
```bash
mvn -pl consumer-2 exec:java -Dexec.args="--bootstrap-servers localhost:4511"
```

## Testing the application

After deploying and running the example, you can verify that the MSK and Glue Schema Registry integration is functioning correctly.
This section consolidates the end-to-end verification steps: producing and consuming messages, and validating schema compatibility.

### 1. Produce a message to the Kafka topic

Use Kafka's console producer (found in the `bin/` directory of any Apache Kafka distribution) to publish a test message that matches the initial Avro schema:

```bash
echo '{"name": "Alice", "age": 30}' | kafka-console-producer.sh \
  --bootstrap-server localhost:4511 \
  --topic my-topic
```

The console producer prints nothing on success; the message is now on the topic.

When a message is published through the tutorial's producer application, the payload is serialized with the Avro schema version registered in the Glue Schema Registry; the console producer above sends the raw JSON string and serves as a quick connectivity check.
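
For reference, a registry-aware producer is configured roughly as follows. This is a minimal sketch using the AWS Glue Schema Registry serde library (`software.amazon.glue:schema-registry-serde`); the region, registry name, and LocalStack endpoint are assumptions rather than values taken from this tutorial:

```java
import java.util.Properties;

import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistryKafkaSerializer;
import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import software.amazon.awssdk.services.glue.model.DataFormat;

public class GsrProducerConfig {

    static Properties producerProps() {
        Properties props = new Properties();
        // Kafka broker exposed by LocalStack's MSK cluster, as used throughout this tutorial.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:4511");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Value serializer that registers/looks up schema versions in the Glue Schema Registry.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                GlueSchemaRegistryKafkaSerializer.class.getName());
        props.put(AWSSchemaRegistryConstants.AWS_REGION, "us-east-1");                // assumed region
        props.put(AWSSchemaRegistryConstants.AWS_ENDPOINT, "http://localhost:4566"); // assumed LocalStack endpoint
        props.put(AWSSchemaRegistryConstants.REGISTRY_NAME, "my-registry");           // assumed registry name
        props.put(AWSSchemaRegistryConstants.SCHEMA_NAME, "my-schema");
        props.put(AWSSchemaRegistryConstants.DATA_FORMAT, DataFormat.AVRO.name());
        // Register new schema versions automatically on first use.
        props.put(AWSSchemaRegistryConstants.SCHEMA_AUTO_REGISTRATION_SETTING, true);
        return props;
    }
}
```

With this configuration, each `send()` serializes the record with the Avro schema and prefixes the payload with the schema version ID resolved from the registry.
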
### 2. Consume and verify the message

Consume from the same topic to verify that the message arrived:

```bash
kafka-console-consumer.sh \
  --bootstrap-server localhost:4511 \
  --topic my-topic \
  --from-beginning \
  --max-messages 1
```

Expected output:

```
{"name": "Alice", "age": 30}
```

This confirms that messages round-trip through the local broker; the tutorial's consumer applications additionally deserialize each payload using the schema version fetched from the registry.
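
The consumer side of the registry integration mirrors the producer configuration. Again a minimal sketch, with the same assumed names and endpoint:

```java
import java.util.Properties;

import com.amazonaws.services.schemaregistry.deserializers.GlueSchemaRegistryKafkaDeserializer;
import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
import com.amazonaws.services.schemaregistry.utils.AvroRecordType;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GsrConsumerConfig {

    static Properties consumerProps() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:4511");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-consumer-group"); // assumed group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Value deserializer that fetches the writer's schema version from the registry.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                GlueSchemaRegistryKafkaDeserializer.class.getName());
        props.put(AWSSchemaRegistryConstants.AWS_REGION, "us-east-1");                // assumed region
        props.put(AWSSchemaRegistryConstants.AWS_ENDPOINT, "http://localhost:4566"); // assumed LocalStack endpoint
        // Decode Avro payloads into GenericRecord instances.
        props.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE, AvroRecordType.GENERIC_RECORD.getName());
        return props;
    }
}
```
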
### 3. Test schema evolution and compatibility

Now modify your Avro schema to simulate an update (for example, adding a new optional field):

```json
{
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "age", "type": "int" },
    { "name": "email", "type": ["null", "string"], "default": null }
  ]
}
```

Save the updated definition to `updated_user_schema.avsc` and register it as a new schema version (the schema ID needs both the schema name and the registry name created earlier):

```bash
awslocal glue register-schema-version \
  --schema-id SchemaName=my-schema,RegistryName=my-registry \
  --schema-definition file://updated_user_schema.avsc
```

Expected output:

```json
{
    "SchemaVersionId": "abcd1234...",
    "VersionNumber": 2,
    "Status": "AVAILABLE"
}
```
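
You can perform the same verification from code: the AWS SDK for Java exposes the `GetSchemaVersion` API. A sketch, assuming the registry name and LocalStack endpoint used above and that the update landed as version 2:

```java
import java.net.URI;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.glue.GlueClient;
import software.amazon.awssdk.services.glue.model.GetSchemaVersionRequest;
import software.amazon.awssdk.services.glue.model.GetSchemaVersionResponse;
import software.amazon.awssdk.services.glue.model.SchemaId;
import software.amazon.awssdk.services.glue.model.SchemaVersionNumber;

public class VerifySchemaVersion {

    public static void main(String[] args) {
        GlueClient glue = GlueClient.builder()
                .region(Region.US_EAST_1)                              // assumed region
                .endpointOverride(URI.create("http://localhost:4566")) // assumed LocalStack endpoint
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("test", "test")))   // LocalStack accepts dummy credentials
                .build();

        GetSchemaVersionResponse version = glue.getSchemaVersion(GetSchemaVersionRequest.builder()
                .schemaId(SchemaId.builder()
                        .schemaName("my-schema")
                        .registryName("my-registry")                   // assumed registry name
                        .build())
                .schemaVersionNumber(SchemaVersionNumber.builder().versionNumber(2L).build())
                .build());

        System.out.println(version.status());           // AVAILABLE
        System.out.println(version.schemaDefinition()); // the updated Avro definition
    }
}
```
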
Glue evaluates compatibility at registration time: had the update violated the schema's configured compatibility mode (for example, `BACKWARD`), `register-schema-version` would have reported a `FAILURE` status instead. You can additionally validate the syntax of the new definition:

```bash
awslocal glue check-schema-version-validity \
  --data-format AVRO \
  --schema-definition file://updated_user_schema.avsc
```

Expected output:

```json
{
    "Valid": true
}
```

Because the new `email` field is nullable and declares a default value, the updated schema maintains backward compatibility with existing data.
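
The same compatibility check can be reproduced locally with Avro's own API, independently of Glue. A self-contained sketch; the two schema strings mirror the definitions used in this section:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class CompatibilityCheck {

    public static void main(String[] args) {
        Schema oldSchema = new Schema.Parser().parse("""
                {"type": "record", "name": "User", "fields": [
                  {"name": "name", "type": "string"},
                  {"name": "age", "type": "int"}
                ]}""");
        Schema newSchema = new Schema.Parser().parse("""
                {"type": "record", "name": "User", "fields": [
                  {"name": "name", "type": "string"},
                  {"name": "age", "type": "int"},
                  {"name": "email", "type": ["null", "string"], "default": null}
                ]}""");

        // Backward compatibility: can a reader using the NEW schema read data
        // that was written with the OLD schema?
        SchemaCompatibility.SchemaPairCompatibility result =
                SchemaCompatibility.checkReaderWriterCompatibility(newSchema, oldSchema);
        System.out.println(result.getType()); // COMPATIBLE
    }
}
```
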
### 4. Validate the end-to-end flow after the schema update

Produce a message that uses the new schema's `email` field:

```bash
echo '{"name": "Bob", "age": 25, "email": "bob@example.com"}' | kafka-console-producer.sh \
  --bootstrap-server localhost:4511 \
  --topic my-topic
```

Then consume again to verify that both messages can be read:

```bash
kafka-console-consumer.sh \
  --bootstrap-server localhost:4511 \
  --topic my-topic \
  --from-beginning \
  --max-messages 2
```

Expected output:

```
{"name": "Alice", "age": 30}
{"name": "Bob", "age": 25, "email": "bob@example.com"}
```

Both messages deserialize successfully, confirming that schema evolution and compatibility work as expected.
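
In a registry-aware consumer, records written with either schema version arrive as Avro `GenericRecord`s. A hypothetical handler, building on the consumer configuration sketched earlier:

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;

public class UserRecordHandler {

    // Called for every record from the consumer's poll loop
    // (key/value types follow the deserializer config sketched above).
    static void handle(ConsumerRecord<String, GenericRecord> record) {
        GenericRecord user = record.value();
        // Fields present in every schema version.
        System.out.printf("name=%s, age=%s%n", user.get("name"), user.get("age"));
        // Returns null for messages written before the field was added.
        System.out.println("email=" + user.get("email"));
    }
}
```
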
### 5. Summary

You’ve validated that:

* Messages published to the local Kafka topic are serialized and deserialized through the Glue Schema Registry.
* Schema evolution (adding optional fields) preserves backward compatibility.
* Producers and consumers continue to interoperate after a schema update.

## Conclusion

Apache Kafka is used as the core messaging system in complex environments, with independent producers and consumers.