How to Install and Use Kafka Schema Registry
Kafka Schema Registry is a separate service that helps ensure message format consistency in Apache Kafka. Let’s see how to install it and use it in real-world scenarios.
Installation Options
- Docker / Docker Compose
The fastest way to try it is using Confluent’s Docker images. Example `docker-compose.yml` snippet:
```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  schema-registry:
    image: confluentinc/cp-schema-registry:7.5.0
    depends_on:
      - kafka
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry  # required by the cp-schema-registry image
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092
      SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
    ports:
      - "8081:8081"
```
Run with:

```shell
docker-compose up -d
```
- Helm in Kubernetes: use Confluent’s Helm chart:

```shell
helm repo add confluentinc https://packages.confluent.io/helm
helm install schema-registry confluentinc/cp-schema-registry
```
- Binary Installation: download the Confluent Platform archive and run Schema Registry as a standalone Java process.
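Whichever installation method you choose, a quick way to confirm the service is up is to call its REST API (assuming the default port 8081):

```shell
# A running registry answers on its REST port (8081 by default)
curl http://localhost:8081/config
# Returns the global compatibility level, e.g. {"compatibilityLevel":"BACKWARD"}
```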
Using Schema Registry

Register a Schema
Example Avro schema for orders (`order.avsc`):
```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "customer_id", "type": "string"},
    {"name": "total", "type": "double"}
  ]
}
```
Register via REST API:
```shell
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"order_id\",\"type\":\"string\"},{\"name\":\"customer_id\",\"type\":\"string\"},{\"name\":\"total\",\"type\":\"double\"}]}"}' \
  http://localhost:8081/subjects/orders-value/versions
```
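Once registered, the schema gets a global ID and a version under the subject. You can inspect both through the registry's standard REST endpoints (output shapes abbreviated):

```shell
# List all registered subjects
curl http://localhost:8081/subjects

# Fetch the latest version registered under orders-value
curl http://localhost:8081/subjects/orders-value/versions/latest

# Fetch a schema directly by its global ID (IDs start at 1 on a fresh registry)
curl http://localhost:8081/schemas/ids/1
```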
Produce and Consume with Schema Registry
- Producers use Avro/JSON/Protobuf serializers that talk to Schema Registry.
- Consumers automatically fetch the schema by ID and deserialize messages.
Example in Java (simplified):
```java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", StringSerializer.class);
props.put("value.serializer", KafkaAvroSerializer.class);
props.put("schema.registry.url", "http://localhost:8081");

// The serializer registers/looks up the schema automatically and
// prefixes each message with the schema ID
KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props);
```
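On the consuming side, the Confluent Platform ships a ready-made CLI, `kafka-avro-console-consumer`, which fetches schemas from the registry and prints decoded records as JSON; handy for verifying the producer above:

```shell
# Consume Avro messages from the orders topic, resolving schemas
# via Schema Registry (topic name assumed for this example)
kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic orders \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081
```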
Why It Matters
- Enforces compatibility between producer and consumer schemas.
- Reduces malformed-message errors.
- Enables safe schema evolution.
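Schema evolution is governed by per-subject compatibility rules. A sketch of the workflow, using the registry's REST API (the added `currency` field is a hypothetical example):

```shell
# Pin the orders-value subject to BACKWARD compatibility:
# new schemas must be readable by consumers of the previous version
curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "BACKWARD"}' \
  http://localhost:8081/config/orders-value

# Dry-run a candidate schema (adding a "currency" field with a default)
# against the latest registered version before rolling it out;
# the response indicates whether it is compatible
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"order_id\",\"type\":\"string\"},{\"name\":\"customer_id\",\"type\":\"string\"},{\"name\":\"total\",\"type\":\"double\"},{\"name\":\"currency\",\"type\":\"string\",\"default\":\"USD\"}]}"}' \
  http://localhost:8081/compatibility/subjects/orders-value/versions/latest
```

Fields added with a default value keep the new schema backward compatible, so existing consumers keep working while producers roll forward.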
For DevOps, having Schema Registry as part of Kafka infrastructure ensures your data pipelines remain stable and future-proof.