Schema Registry expert for Avro, Protobuf, and JSON Schema management. Covers schema evolution strategies, compatibility modes, validation, and best practices for managing schemas in Confluent Cloud and self-hosted Schema Registry. Activates for schema registry, avro, protobuf, json schema, schema evolution, compatibility modes, schema validation.
This skill inherits all available tools. When active, it can use any tool Claude has access to.
Expert knowledge of Confluent Schema Registry for managing Avro, Protobuf, and JSON Schema schemas in Kafka ecosystems.
Avro (Most Popular): compact binary encoding with excellent schema evolution support; the default choice for Kafka data pipelines and ETL.
Protobuf (Google Protocol Buffers): compact binary encoding with support for 20+ languages; a natural fit for polyglot services and gRPC.
JSON Schema: human-readable text encoding; larger messages and weaker evolution guarantees, but easy to debug and a good match for REST-centric teams.
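For orientation, here is the same minimal User record expressed in each format (a sketch; field names, field numbers, and file names are illustrative, not from the original):

// Avro (.avsc)
{"type": "record", "name": "User", "fields": [{"name": "id", "type": "long"}, {"name": "name", "type": "string"}]}

// Protobuf (.proto)
syntax = "proto3";
message User {
  int64 id = 1;   // field numbers identify fields on the wire
  string name = 2;
}

// JSON Schema
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "User",
  "type": "object",
  "properties": {
    "id": {"type": "integer"},
    "name": {"type": "string"}
  },
  "required": ["id", "name"]
}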
| Mode | Allowed Schema Changes | Compatibility Guarantee | Use Case |
|---|---|---|---|
| BACKWARD | Remove fields, add optional fields | New schema can read old data | Most common; safe for consumers (upgrade consumers first) |
| FORWARD | Add fields, remove optional fields | Old schema can read new data | Safe for producers (upgrade producers first) |
| FULL | Add/remove optional fields only | Bi-directional with the previous version | Producers and consumers upgrade independently |
| NONE | Any change | None; upgrades must be coordinated | Development only, NOT production |
| BACKWARD_TRANSITIVE | BACKWARD against all previous versions | New schema can read any old data | Strictest backward compatibility |
| FORWARD_TRANSITIVE | FORWARD against all previous versions | Any old schema can read new data | Strictest forward compatibility |
| FULL_TRANSITIVE | FULL against all previous versions | Complete bi-directional across all versions | Strictest overall |
Default: BACKWARD (recommended for production)
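The mode can be inspected and changed per subject through the Schema Registry REST config endpoint; a minimal Node sketch (registry URL and users-value subject assumed, Node 18+ for built-in fetch):

// Read the compatibility mode configured for the users-value subject
const current = await fetch('http://localhost:8081/config/users-value');
console.log(await current.json()); // e.g. { "compatibilityLevel": "BACKWARD" }

// Set the subject-level compatibility mode explicitly
await fetch('http://localhost:8081/config/users-value', {
  method: 'PUT',
  headers: { 'Content-Type': 'application/vnd.schemaregistry.v1+json' },
  body: JSON.stringify({ compatibility: 'BACKWARD' })
});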
Adding Fields:
// V1
{
"type": "record",
"name": "User",
"fields": [
{"name": "id", "type": "long"},
{"name": "name", "type": "string"}
]
}
// V2 - BACKWARD compatible (added optional field with default)
{
"type": "record",
"name": "User",
"fields": [
{"name": "id", "type": "long"},
{"name": "name", "type": "string"},
{"name": "email", "type": ["null", "string"], "default": null}
]
}
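Before registering V2, you can ask the registry whether it is compatible with the latest registered version; a sketch against the REST compatibility endpoint (localhost URL and users-value subject assumed):

const userV2 = {
  type: 'record',
  name: 'User',
  fields: [
    { name: 'id', type: 'long' },
    { name: 'name', type: 'string' },
    { name: 'email', type: ['null', 'string'], default: null }
  ]
};

// Ask the registry whether V2 is compatible with the latest users-value schema
const res = await fetch(
  'http://localhost:8081/compatibility/subjects/users-value/versions/latest',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/vnd.schemaregistry.v1+json' },
    body: JSON.stringify({ schema: JSON.stringify(userV2) })
  }
);
console.log(await res.json()); // { "is_compatible": true }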
Removing Fields (BACKWARD compatible):
// V1
{"name": "address", "type": "string"}
// V2 - field removed from the schema; consumers using the new schema simply skip it when reading old data
Changing Field Types (Breaking Change!):
// ❌ BREAKING - Cannot change string to int
{"name": "age", "type": "string"} → {"name": "age", "type": "int"}
// ✅ SAFE - Use union types
{"name": "age", "type": ["string", "int"], "default": "unknown"}
Activate me when you need help with schema registry setup, Avro/Protobuf/JSON Schema design, schema evolution strategies, compatibility modes, or schema validation.
✅ DO: default to BACKWARD compatibility in production, add new fields with default values, validate compatibility before registering, and follow the <topic>-key / <topic>-value subject naming convention.
❌ DON'T: rename fields, change field types incompatibly, remove required fields without coordinating upgrades, or use NONE compatibility in production.
Hierarchical Namespaces:
com.company.domain.EntityName
com.acme.ecommerce.Order
com.acme.ecommerce.OrderLineItem
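In Avro the namespace is declared on the record itself, so the fully qualified name com.acme.ecommerce.Order comes from namespace plus name (the fields below are illustrative):

{
  "type": "record",
  "name": "Order",
  "namespace": "com.acme.ecommerce",
  "fields": [
    {"name": "orderId", "type": "long"},
    {"name": "totalCents", "type": "long"}
  ]
}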
Subject Naming (Kafka topics):
<topic-name>-value - for record values
<topic-name>-key - for record keys
Example: orders-value, orders-key
Producer (with Avro):
const { Kafka } = require('kafkajs');
const { SchemaRegistry, SchemaType } = require('@kafkajs/confluent-schema-registry');

// Kafka client and producer setup (clientId and broker list are illustrative)
const kafka = new Kafka({ clientId: 'user-service', brokers: ['broker:9092'] });
const producer = kafka.producer();
const registry = new SchemaRegistry({
host: 'https://schema-registry:8081',
auth: {
username: 'SR_API_KEY',
password: 'SR_API_SECRET'
}
});
// Register schema
const schema = `
{
"type": "record",
"name": "User",
"fields": [
{"name": "id", "type": "long"},
{"name": "name", "type": "string"}
]
}
`;
const { id } = await registry.register({
type: SchemaType.AVRO,
schema
});
// Encode message with schema
const payload = await registry.encode(id, {
id: 1,
name: 'John Doe'
});
await producer.connect();
await producer.send({
topic: 'users',
messages: [{ value: payload }]
});
Consumer (with Avro):
const consumer = kafka.consumer({ groupId: 'user-processor' });
await consumer.connect();
await consumer.subscribe({ topic: 'users' });
await consumer.run({
eachMessage: async ({ message }) => {
// Decode the message value (the schema ID travels in the value itself: magic byte + 4-byte schema ID)
const decodedMessage = await registry.decode(message.value);
console.log(decodedMessage); // { id: 1, name: 'John Doe' }
}
});
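Schemas are usually kept in .avsc files rather than inline strings; the same library exports readAVSCAsync for loading them (a sketch, the file path is hypothetical):

const { readAVSCAsync } = require('@kafkajs/confluent-schema-registry');

// Load the Avro schema definition from disk and register it
const userSchema = await readAVSCAsync('./schemas/user.avsc');
const { id } = await registry.register(userSchema);

// Encoding and decoding work exactly as above
const payload = await registry.encode(id, { id: 1, name: 'John Doe' });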
Before Registering: validate the new schema against the latest registered version, make sure newly added fields have defaults, and confirm the subject name follows the <topic>-key / <topic>-value convention.
CLI Validation:
# Check compatibility (before registering)
curl -X POST http://localhost:8081/compatibility/subjects/users-value/versions/latest \
-H "Content-Type: application/vnd.schemaregistry.v1+json" \
-d '{"schema": "{...}"}'
# Register schema
curl -X POST http://localhost:8081/subjects/users-value/versions \
-H "Content-Type: application/vnd.schemaregistry.v1+json" \
-d '{"schema": "{...}"}'
Error:
Schema being registered is incompatible with an earlier schema
Root Cause: The new schema violates the subject's compatibility mode (e.g., a required field was added without a default while the subject is in BACKWARD mode)
Solution: check the subject's configured compatibility mode, then test the new schema against the latest version before registering:
curl http://localhost:8081/config/users-value
curl -X POST http://localhost:8081/compatibility/subjects/users-value/versions/latest \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{"schema": "{...}"}'
Error:
Subject 'users-value' not found
Root Cause: Schema not registered yet OR wrong subject name
Solution:
curl http://localhost:8081/subjects
Verify the subject name follows the naming convention (<topic>-key or <topic>-value)
Error:
Unknown magic byte!
Root Cause: Message not encoded with Schema Registry (missing magic byte + schema ID)
Solution:
Encode and decode messages through Schema Registry, e.g. with the @kafkajs/confluent-schema-registry library, so every record carries the magic byte and schema ID
Need to change schema?
├─ Adding new field?
│ ├─ Required field? → Add with default value (BACKWARD)
│ └─ Optional field? → Add with default null (BACKWARD)
│
├─ Removing field?
│ ├─ Required field? → ❌ BREAKING CHANGE (coordinate upgrade)
│ └─ Optional field? → ✅ BACKWARD compatible
│
├─ Changing field type?
│ ├─ Compatible types (e.g., int → long)? → Use union types
│ └─ Incompatible types? → ❌ BREAKING CHANGE (add new field, deprecate old)
│
└─ Renaming field?
└─ ❌ BREAKING CHANGE → Add new field + mark old as deprecated
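For the rename branch, a common pattern is to introduce the new field alongside the old one and drop the old field only after all consumers have migrated; a sketch in Avro (fullName is an illustrative name):

// V2 - "name" is being replaced by "fullName"; both coexist during the migration
{
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "name", "type": "string", "doc": "DEPRECATED: use fullName"},
    {"name": "fullName", "type": ["null", "string"], "default": null}
  ]
}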
| Feature | Avro | Protobuf | JSON Schema |
|---|---|---|---|
| Encoding | Binary | Binary | Text (JSON) |
| Message Size | Small (~90% smaller than JSON) | Small (~80% smaller than JSON) | Large (baseline) |
| Human Readable | No | No | Yes |
| Schema Evolution | Excellent | Good | Fair |
| Language Support | Java, Python, C#, C++, and more | 20+ languages | Universal |
| Performance | Very fast | Very fast | Slower |
| Debugging | Harder | Harder | Easy |
| Best For | Data warehousing, ETL | Polyglot services, gRPC | REST APIs, development |
Recommendation: default to Avro for Kafka data pipelines and warehousing, choose Protobuf for polyglot or gRPC-heavy environments, and use JSON Schema when human readability and REST integration matter more than message size.
Invoke me when you need schema management, evolution strategies, or compatibility guidance!