Modern applications demand high scalability, real-time processing, and loose coupling between services. This is where Event-Driven Architecture (EDA) shines, especially when powered by Apache Kafka and Spring Boot.
In this post, you'll learn how to implement Event-Driven Architecture with Spring Kafka, covering core concepts like producers, consumers, topics, and message serialization. We'll use real-world examples in the package com.kscodes.springboot.advanced.

What is Event-Driven Architecture?
Event-Driven Architecture (EDA) is a software design pattern where components communicate via events instead of direct calls. In this model:
- Producers emit events
- Consumers react to those events
- Event brokers (e.g., Kafka) handle message distribution
Tools and Technologies
- Spring Boot 3.x
- Apache Kafka
- Spring for Apache Kafka
- Docker (for Kafka setup)
- Java 21
Project Structure
```text
com.kscodes.springboot.advanced
├── config
│   ├── KafkaProducerConfig.java
│   └── KafkaConsumerConfig.java
├── model
│   └── OrderEvent.java
├── service
│   ├── OrderEventProducer.java
│   └── OrderEventListener.java
└── controller
    └── OrderController.java
```
Maven Dependencies
```xml
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
</dependency>
```
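One note on the snippet above: it assumes the standard Spring Boot parent (or BOM) manages the versions. Because step 7 exposes a REST endpoint, the web starter is also required if it is not already in your build:

```xml
<!-- Needed for the @RestController in step 7 -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
```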
1. Kafka Docker Setup (for local development)
```yaml
# docker-compose.yml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```
Run with:
```bash
docker-compose up -d
```
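The topic used throughout this post is order-events. The Confluent image will typically auto-create it on first use, but if you prefer to declare it explicitly, Spring Boot's auto-configured KafkaAdmin creates any NewTopic beans at startup. A minimal sketch (the class name KafkaTopicConfig is just an example, and it assumes the broker is reachable at the Spring Boot default localhost:9092):

```java
package com.kscodes.springboot.advanced.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    // KafkaAdmin (auto-configured by Spring Boot) creates this topic on startup if it is missing.
    @Bean
    public NewTopic orderEventsTopic() {
        return TopicBuilder.name("order-events")
                .partitions(3)
                .replicas(1) // single broker in the local Docker setup
                .build();
    }
}
```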
2. Model: OrderEvent.java
```java
package com.kscodes.springboot.advanced.model;

public class OrderEvent {

    private String orderId;
    private String product;
    private int quantity;

    // Constructors, Getters, Setters
}
```
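Since the stack listed above targets Java 21, OrderEvent could also be written as a record, giving an immutable value type without constructor/getter/setter boilerplate; Jackson serializes and deserializes records out of the box, so it works with the JSON serializers configured below. Note that the accessors become orderId(), product(), and quantity() rather than getters, so the listener in step 6 would need a small adjustment. A sketch:

```java
package com.kscodes.springboot.advanced.model;

// Immutable alternative to the class above; Jackson handles records directly.
public record OrderEvent(String orderId, String product, int quantity) { }
```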
3. Kafka Producer Configuration
```java
package com.kscodes.springboot.advanced.config;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.*;
import org.springframework.kafka.core.*;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.*;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, OrderEvent> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, OrderEvent> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```
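By default, JsonSerializer also writes type-information headers into each record. They are not required here because the consumer below deserializes into a fixed OrderEvent type, so you can optionally drop them by configuring the serializer instance yourself. An optional variant of the producerFactory() bean above (same imports; use it instead of, not alongside, the version shown):

```java
@Bean
public ProducerFactory<String, OrderEvent> producerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    // Pass pre-built serializer instances instead of classes; noTypeInfo() skips the type headers.
    JsonSerializer<OrderEvent> valueSerializer = new JsonSerializer<OrderEvent>().noTypeInfo();
    return new DefaultKafkaProducerFactory<>(config, new StringSerializer(), valueSerializer);
}
```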
4. Kafka Consumer Configuration
```java
package com.kscodes.springboot.advanced.config;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.*;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.*;
import org.springframework.kafka.core.*;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import java.util.*;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, OrderEvent> consumerFactory() {
        JsonDeserializer<OrderEvent> deserializer = new JsonDeserializer<>(OrderEvent.class);
        deserializer.addTrustedPackages("*");

        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events");

        // The deserializer instances passed to the factory below take precedence,
        // so the *_DESERIALIZER_CLASS_CONFIG properties are not needed here.
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), deserializer);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, OrderEvent> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, OrderEvent> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
```
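One knob worth knowing on the container factory: setConcurrency() runs several listener threads in the same consumer group, which is how the "add more consumers" scaling story plays out in practice (it only helps if the topic has multiple partitions). A sketch of the same bean with concurrency enabled, as an alternative to the version above:

```java
@Bean
public ConcurrentKafkaListenerContainerFactory<String, OrderEvent> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, OrderEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    // Up to 3 listener threads; effective parallelism is capped by the topic's partition count.
    factory.setConcurrency(3);
    return factory;
}
```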
5. Producer: OrderEventProducer.java
```java
package com.kscodes.springboot.advanced.service;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderEventProducer {

    private final KafkaTemplate<String, OrderEvent> kafkaTemplate;

    public OrderEventProducer(KafkaTemplate<String, OrderEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendOrderEvent(OrderEvent event) {
        kafkaTemplate.send("order-events", event);
    }
}
```
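kafkaTemplate.send() is asynchronous. In Spring Kafka 3.x it returns a CompletableFuture of SendResult, so you can key the record by orderId (keeping all events for the same order on one partition, in order) and log the delivery outcome. A sketch of an alternative sendOrderEvent():

```java
public void sendOrderEvent(OrderEvent event) {
    // Keying by orderId preserves per-order ordering within a partition.
    kafkaTemplate.send("order-events", event.getOrderId(), event)
            .whenComplete((result, ex) -> {
                if (ex != null) {
                    System.err.println("Failed to send order event: " + ex.getMessage());
                } else {
                    System.out.println("Sent " + event.getOrderId() + " to partition "
                            + result.getRecordMetadata().partition()
                            + " at offset " + result.getRecordMetadata().offset());
                }
            });
}
```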
6. Consumer: OrderEventListener.java
```java
package com.kscodes.springboot.advanced.service;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderEventListener {

    @KafkaListener(topics = "order-events", groupId = "order-events")
    public void handleOrderEvent(OrderEvent event) {
        System.out.println("Received Order Event: " + event.getOrderId());
        // process order
    }
}
```
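If you also need the record metadata (key, partition, offset), the listener method can take the full ConsumerRecord instead of just the payload; the configured JsonDeserializer still produces the OrderEvent value. A sketch of a replacement for handleOrderEvent above (add import org.apache.kafka.clients.consumer.ConsumerRecord):

```java
@KafkaListener(topics = "order-events", groupId = "order-events")
public void handleOrderEvent(ConsumerRecord<String, OrderEvent> record) {
    // record.value() is the deserialized OrderEvent; metadata comes along for free.
    System.out.println("Received " + record.value().getOrderId()
            + " from partition " + record.partition() + " at offset " + record.offset());
    // process order
}
```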
7. REST Controller: OrderController.java
```java
package com.kscodes.springboot.advanced.controller;

import com.kscodes.springboot.advanced.model.OrderEvent;
import com.kscodes.springboot.advanced.service.OrderEventProducer;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/orders")
public class OrderController {

    private final OrderEventProducer producer;

    public OrderController(OrderEventProducer producer) {
        this.producer = producer;
    }

    @PostMapping
    public String createOrder(@RequestBody OrderEvent orderEvent) {
        producer.sendOrderEvent(orderEvent);
        return "Order event sent!";
    }
}
```
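Because the controller only hands the event to Kafka and does not wait for processing, returning HTTP 202 Accepted communicates the asynchronous semantics more precisely than a plain string. An optional variant of createOrder() (add import org.springframework.http.ResponseEntity):

```java
@PostMapping
public ResponseEntity<String> createOrder(@RequestBody OrderEvent orderEvent) {
    producer.sendOrderEvent(orderEvent);
    // 202 Accepted: the request was taken on for asynchronous processing.
    return ResponseEntity.accepted().body("Order event accepted: " + orderEvent.getOrderId());
}
```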
Testing the Flow
- Run Kafka using Docker.
- Start the Spring Boot app.
- POST an order event:
```bash
curl -X POST http://localhost:8080/orders \
  -H "Content-Type: application/json" \
  -d '{"orderId":"ORD-001","product":"iPhone","quantity":2}'
```
Check the application console for the listener output (Received Order Event: ORD-001).
Advantages of Event-Driven Architecture
| Benefit | Description |
|---|---|
| Decoupling | Services don't need to know about each other |
| Scalability | Easily add more consumers |
| Real-time Processing | React to changes as they happen |
| Fault Tolerance | Retry and reprocess events if needed |
Best Practices
- Always version your events
- Make consumers idempotent, since a message may be delivered more than once
- Log events for audit trails
- Use dead-letter topics for failures
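For the last point, Spring Kafka ships the building blocks: a DeadLetterPublishingRecoverer publishes a failed record to <topic>.DLT (here order-events.DLT) once a DefaultErrorHandler has exhausted its retries. A minimal sketch, assuming the KafkaTemplate and consumer config from this post and that the failure happens in the listener (i.e. the record was deserialized successfully); register it with factory.setCommonErrorHandler(errorHandler) in KafkaConsumerConfig:

```java
// Additional bean for KafkaConsumerConfig: retry twice with a 1-second back-off,
// then publish the failed record to the dead-letter topic "order-events.DLT".
@Bean
public DefaultErrorHandler errorHandler(KafkaTemplate<String, OrderEvent> kafkaTemplate) {
    DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaTemplate);
    return new DefaultErrorHandler(recoverer, new FixedBackOff(1000L, 2));
}
```

(Imports: org.springframework.kafka.listener.DefaultErrorHandler, org.springframework.kafka.listener.DeadLetterPublishingRecoverer, org.springframework.util.backoff.FixedBackOff.)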
Conclusion
Adopting Event-Driven Architecture with Spring Kafka enables your microservices to communicate asynchronously, react to events in real time, and remain loosely coupled. Whether you're building order systems, inventory updates, or analytics pipelines, Kafka + Spring is a modern choice for robust messaging.
Start with a single event, grow your event model, and build systems that scale with confidence.