Building Event-Driven Architecture with Spring Kafka: Complete Guide

Modern applications demand high scalability, real-time processing, and loose coupling between services. This is where Event-Driven Architecture (EDA) shines, especially when powered by Apache Kafka and Spring Boot.

In this post, you’ll learn how to implement Event-Driven Architecture with Spring Kafka, covering core concepts like producers, consumers, topics, and message serialization. We’ll use real-world examples in the package com.kscodes.springboot.advanced.


🚀 What is Event-Driven Architecture?

Event-Driven Architecture (EDA) is a software design pattern where components communicate via events instead of direct calls. In this model:

  • Producers emit events
  • Consumers react to those events
  • Event brokers (e.g., Kafka) handle message distribution
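Before bringing in Kafka, the pattern itself can be sketched in plain Java. The following broker-free sketch (all names are illustrative, not part of the project we build below) uses an in-memory queue to show the key property: the producer and consumer only know the broker, never each other.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal in-memory illustration of EDA: the producer only knows the
// queue (standing in for the broker), never the consumer, and vice versa.
public class EdaSketch {
    record OrderEvent(String orderId) {}

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<OrderEvent> broker = new ArrayBlockingQueue<>(10);

        // Producer emits an event without knowing who will consume it
        broker.put(new OrderEvent("ORD-001"));

        // Consumer reacts to the event without knowing who produced it
        OrderEvent event = broker.take();
        System.out.println("Received: " + event.orderId());
    }
}
```

Kafka plays the role of the queue here, but durable, partitioned, and shared across processes.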

🔧 Tools and Technologies

  • Spring Boot 3.x
  • Apache Kafka
  • Spring for Apache Kafka
  • Docker (for Kafka setup)
  • Java 21

๐Ÿ—๏ธ Project Structure


com.kscodes.springboot.advanced
│
├── config
│   ├── KafkaProducerConfig.java
│   └── KafkaConsumerConfig.java
│
├── model
│   └── OrderEvent.java
│
├── service
│   ├── OrderEventProducer.java
│   └── OrderEventListener.java
│
└── controller
    └── OrderController.java

📦 Maven Dependencies



<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
</dependency>


🧱 1. Kafka Docker Setup (for local)


# docker-compose.yml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

Run with:


docker-compose up -d

🧩 2. Model: OrderEvent.java


package com.kscodes.springboot.advanced.model;

public class OrderEvent {
    private String orderId;
    private String product;
    private int quantity;

    // A no-argument constructor is required so JsonDeserializer can instantiate the class
    public OrderEvent() {}

    // All-args constructor, getters, and setters omitted for brevity
}

โš™๏ธ 3. Kafka Producer Configuration


package com.kscodes.springboot.advanced.config;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.*;
import org.springframework.kafka.core.*;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.*;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, OrderEvent> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, OrderEvent> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

🎯 4. Kafka Consumer Configuration


package com.kscodes.springboot.advanced.config;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.*;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.*;
import org.springframework.kafka.core.*;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import java.util.*;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, OrderEvent> consumerFactory() {
        JsonDeserializer<OrderEvent> deserializer = new JsonDeserializer<>(OrderEvent.class);
        deserializer.addTrustedPackages("*");

        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events");
        // The deserializer instances are passed directly to the factory below,
        // so they do not need to be set as class properties here
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), deserializer);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, OrderEvent> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, OrderEvent> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
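If you prefer Spring Boot's auto-configuration over explicit @Bean definitions, roughly the same producer and consumer setup can be expressed declaratively. The sketch below uses Spring Boot's standard spring.kafka.* property namespace; treat it as a starting point to adapt rather than a drop-in replacement for the configuration classes above.

```properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.consumer.group-id=order-events
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
```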

📤 5. Producer: OrderEventProducer.java


package com.kscodes.springboot.advanced.service;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderEventProducer {

    private final KafkaTemplate<String, OrderEvent> kafkaTemplate;

    public OrderEventProducer(KafkaTemplate<String, OrderEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendOrderEvent(OrderEvent event) {
        kafkaTemplate.send("order-events", event);
    }
}

📥 6. Consumer: OrderEventListener.java


package com.kscodes.springboot.advanced.service;

import com.kscodes.springboot.advanced.model.OrderEvent;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderEventListener {

    @KafkaListener(topics = "order-events", groupId = "order-events")
    public void handleOrderEvent(OrderEvent event) {
        System.out.println("Received Order Event: " + event.getOrderId());
        // process order
    }
}

๐ŸŒ 7. REST Controller: OrderController.java


package com.kscodes.springboot.advanced.controller;

import com.kscodes.springboot.advanced.model.OrderEvent;
import com.kscodes.springboot.advanced.service.OrderEventProducer;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/orders")
public class OrderController {

    private final OrderEventProducer producer;

    public OrderController(OrderEventProducer producer) {
        this.producer = producer;
    }

    @PostMapping
    public String createOrder(@RequestBody OrderEvent orderEvent) {
        producer.sendOrderEvent(orderEvent);
        return "Order event sent!";
    }
}

🧪 Testing the Flow

  1. Run Kafka using Docker.
  2. Start the Spring Boot app.
  3. POST an order event:

curl -X POST http://localhost:8080/orders \
  -H "Content-Type: application/json" \
  -d '{"orderId":"ORD-001","product":"iPhone","quantity":2}'

Check logs for the received event.

โš–๏ธ Advantages of Event-Driven Architecture

  • Decoupling: Services don’t know about each other
  • Scalability: Easily add more consumers
  • Real-time processing: React to changes as they happen
  • Fault tolerance: Retry and reprocess events if needed

๐Ÿ›ก๏ธ Best Practices

  • Always version your events
  • Handle message idempotency
  • Log events for audit trails
  • Use dead-letter topics for failures
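Idempotency in particular deserves a concrete sketch: Kafka delivers messages at least once by default, so a consumer may see the same event twice. One simple in-memory guard (illustrative only; production systems usually persist processed IDs in a database) is to track which event IDs have already been handled:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative idempotent handler: duplicate deliveries of the same
// orderId are detected and skipped instead of being processed twice.
public class IdempotentHandler {
    private final Set<String> processedIds = ConcurrentHashMap.newKeySet();

    /** Returns true if the event was processed, false if it was a duplicate. */
    public boolean handle(String orderId) {
        // Set.add() returns false if the ID was already present
        if (!processedIds.add(orderId)) {
            return false; // duplicate delivery, skip
        }
        // ... real order processing would happen here ...
        return true;
    }
}
```

Inside the @KafkaListener method above, such a check would run before any side effects, so a redelivered event becomes a harmless no-op.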

🔚 Conclusion

Adopting Event-Driven Architecture with Spring Kafka enables your microservices to communicate asynchronously, react to events in real time, and remain loosely coupled. Whether you’re building order systems, inventory updates, or analytics pipelines, Kafka + Spring is a modern choice for robust messaging.

Start with a single event, grow your event model, and build systems that scale with confidence.

🔗 External References