Pattern: Replication - Kafka to Kafka
Difficulty: Intermediate
Components: kafka_franz, fallback, retry, file
Use Case: Replicate Kafka topics between clusters while preserving order, timestamps, and headers
Replicate data between Kafka clusters with full fidelity, preserving partitions, keys, timestamps, and headers. Includes retry logic and a dead letter queue (DLQ) for poison messages. Essential for cross-datacenter replication, disaster recovery, and data migration.
See kafka-replication.yaml for the complete configuration.
Preserve all source characteristics, and wrap the output in a fallback chain for error handling:

```yaml
fallback:
  - retry:
      max_retries: 3
      output:
        kafka_franz: {}
  - file: {} # DLQ
```
Writes are first attempted against the primary destination with retries; messages that still fail are routed to the file-based DLQ with full context for manual recovery.
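The "preserve all source characteristics" wiring can be sketched as follows. This is a minimal sketch, not the shipped configuration: it assumes the standard `kafka_franz` metadata fields (`kafka_key`, `kafka_topic`, `kafka_partition`, `kafka_timestamp_ms`) and output options (`partitioner: manual`, `metadata.include_patterns`); verify the exact field names against your Redpanda Connect version and kafka-replication.yaml.

```yaml
input:
  kafka_franz:
    seed_brokers: [ "${SOURCE_BROKER}" ]
    topics: [ "${SOURCE_TOPIC}" ]
    consumer_group: "${CONSUMER_GROUP}"

output:
  kafka_franz:
    seed_brokers: [ "${DEST_BROKER}" ]
    topic: '${DEST_TOPIC_PREFIX}${! metadata("kafka_topic") }'
    key: '${! metadata("kafka_key") }'
    partition: '${! metadata("kafka_partition") }'
    partitioner: manual                # honor the source partition assignment
    timestamp_ms: '${! metadata("kafka_timestamp_ms") }'
    metadata:
      include_patterns: [ ".*" ]       # copy all source headers through
```

Manual partitioning plus the copied key and timestamp is what preserves per-partition ordering and event time across clusters.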
```bash
# Set environment variables
export SOURCE_BROKER=source:9092
export DEST_BROKER=dest:9092
export SOURCE_TOPIC=events
export DEST_TOPIC_PREFIX=replicated_
export CONSUMER_GROUP=replication_cg
export DLQ_PATH=./dlq

# Run replication
rpk connect run kafka-replication.yaml
```
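Once the underlying failure is fixed, DLQ'd messages can be replayed with a small one-off pipeline. A sketch, assuming the DLQ was written as line-delimited files under `${DLQ_PATH}` (adjust the `file` input scanner to match how your DLQ output actually encodes messages, and note that headers are only recoverable if the DLQ captured them):

```yaml
input:
  file:
    paths: [ "${DLQ_PATH}/*" ]
    scanner:
      lines: {}

output:
  kafka_franz:
    seed_brokers: [ "${DEST_BROKER}" ]
    topic: "${DEST_TOPIC_PREFIX}${SOURCE_TOPIC}"
```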