In a microservices architecture, logging becomes more complex because logs are scattered across multiple services, which makes debugging and monitoring challenging.
Centralized logging with the ELK Stack solves this problem by aggregating logs from all services into a central platform, enabling real-time search and visualization.
In this guide, you'll learn how to integrate Spring Boot microservices with the ELK Stack (Elasticsearch, Logstash, and Kibana) for powerful centralized logging.

Stack Overview
- Elasticsearch: Stores structured log data
- Logstash: Parses and transports log data
- Kibana: Visualizes log data with dashboards
Step 1: ELK Stack Setup Using Docker
Create a `docker-compose.yml`:

```yaml
version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.10.0
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - 9200:9200

  logstash:
    image: docker.elastic.co/logstash/logstash:8.10.0
    container_name: logstash
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - 5000:5000

  kibana:
    image: docker.elastic.co/kibana/kibana:8.10.0
    container_name: kibana
    ports:
      - 5601:5601
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
```
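Optionally, once the containers are up you can sanity-check that Elasticsearch is reachable before wiring up any services. The snippet below is a minimal sketch using Java's built-in `HttpClient` (Java 11+); the class name is an assumption, and it assumes the stack is running locally with the port mappings from the compose file above.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch: verify Elasticsearch is reachable on the port mapped in docker-compose.yml.
// Assumes the ELK containers are already running locally.
public class ElasticsearchHealthCheck {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // GET / returns cluster metadata (name, version) as JSON when Elasticsearch is up.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200"))
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
        System.out.println(response.body());
    }
}
```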
Step 2: Logstash Configuration (`logstash.conf`)

```
input {
  tcp {
    port => 5000
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "springboot-logs-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
```
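Before touching the Spring Boot side, you can smoke-test the TCP input by sending a single JSON line to port 5000 and checking that it shows up in Logstash's stdout output and in the `springboot-logs-*` index. The class below is a hypothetical sketch (the class name and hard-coded host are assumptions), not part of the application itself.

```java
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Hypothetical smoke test: sends one JSON line to the Logstash tcp/json_lines input on port 5000.
// If the pipeline works, the event appears in Logstash's stdout (rubydebug) and in Elasticsearch.
public class LogstashSmokeTest {

    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 5000);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true, StandardCharsets.UTF_8)) {

            // json_lines expects one JSON document per line.
            out.println("{\"level\":\"INFO\",\"logger\":\"smoke-test\",\"message\":\"hello from smoke test\"}");
        }
    }
}
```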
Step 3: Spring Boot Logback Configuration
You'll use Logback with the Logstash encoder in your Spring Boot apps.
Add the dependency to `pom.xml`:

```xml
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>
```
`logback-spring.xml` Configuration:

```xml
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml" />

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5000</destination>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <fieldName>timestamp</fieldName>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                          "level": "%level",
                          "logger": "%logger",
                          "message": "%message",
                          "thread": "%thread"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH" />
    </root>
</configuration>
```
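The pattern provider above emits a fixed set of JSON fields. If you also want per-statement key/value fields, logstash-logback-encoder supports structured arguments. The sketch below assumes you add an `<arguments/>` provider to the `<providers>` section shown above so the extra fields land in the JSON event; the class and field names are illustrative, not part of the configuration above.

```java
import static net.logstash.logback.argument.StructuredArguments.kv;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Sketch: structured key/value arguments with logstash-logback-encoder.
// Assumes an <arguments/> provider has been added to the composite encoder above,
// so "orderId" and "status" become searchable JSON fields rather than plain text.
public class StructuredLoggingExample {

    private static final Logger logger = LoggerFactory.getLogger(StructuredLoggingExample.class);

    public void confirmOrder(String orderId) {
        // The {} placeholders still render in the message; kv(...) additionally emits JSON fields.
        logger.info("Order confirmed {} {}", kv("orderId", orderId), kv("status", "CONFIRMED"));
    }
}
```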
๐งโ๐ป Example Microservice Logging
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 |
package com.kscodes.springboot.microservice.orderservice; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.web.bind.annotation.*; @RestController @RequestMapping("/orders") public class OrderController { private static final Logger logger = LoggerFactory.getLogger(OrderController.class); @GetMapping("/{id}") public String getOrder(@PathVariable String id) { logger.info("Fetching order with ID: {}", id); return "Order details for " + id; } } |
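To trace a request across several microservices, a common approach is to put a correlation ID into the SLF4J MDC so it is attached to every log line for that unit of work. The class below is a hypothetical sketch (the class name and `correlationId` key are assumptions, not part of the example above); in a real service you would set and clear the MDC in a servlet filter or interceptor, and you would add an `<mdc/>` provider to the Step 3 encoder so the field is shipped to Logstash.

```java
import java.util.UUID;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

// Sketch: tag every log line for one unit of work with a correlation ID via MDC.
// The try/finally shows the MDC lifecycle; an <mdc/> provider in the Step 3 encoder
// is needed for the field to appear in the JSON sent to Logstash.
public class CorrelatedOrderProcessor {

    private static final Logger logger = LoggerFactory.getLogger(CorrelatedOrderProcessor.class);

    public void processOrder(String orderId) {
        MDC.put("correlationId", UUID.randomUUID().toString());
        try {
            logger.info("Processing order {}", orderId);
            // ... business logic ...
            logger.info("Finished order {}", orderId);
        } finally {
            MDC.remove("correlationId"); // always clean up so the ID doesn't leak to other requests
        }
    }
}
```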
๐งช Step 4: Run Everything
- Run
docker-compose up
- Run your Spring Boot microservice
- Visit Kibana UI at http://localhost:5601
- Create an index pattern:
springboot-logs-*
Now you can search and visualize logs.
Best Practices
- Use structured JSON logs
- Mask sensitive data before logging (see the sketch after this list)
- Set log levels wisely (`ERROR`, `INFO`, `DEBUG`)
- Separate environments by index (e.g., `logs-dev-*`, `logs-prod-*`)
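For the masking bullet above, here is one minimal way to redact an obviously sensitive value before it reaches the log pipeline. The `maskEmail` helper and its regex are illustrative assumptions, not a library API; real projects often centralize masking in a Logback converter or a dedicated library.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative sketch: mask an email address before it is logged, so the raw value
// never reaches Logstash or Elasticsearch.
public class MaskingExample {

    private static final Logger logger = LoggerFactory.getLogger(MaskingExample.class);

    static String maskEmail(String email) {
        // Keep the first character and the domain, hide the rest of the local part.
        return email.replaceAll("(^.)[^@]*(@.*$)", "$1***$2");
    }

    public void registerCustomer(String email) {
        logger.info("Registering customer {}", maskEmail(email)); // e.g. j***@example.com
    }
}
```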
Benefits of Centralized Logging with ELK Stack
- Full-text search of logs with Kibana
- Easy debugging and traceability across microservices
- Seamless integration with Spring Boot via Logback
- Real-time dashboards and alerts
Conclusion
Implementing centralized logging unifies log collection and analysis across all your microservices. With structured logging, searchable indexes, and visual dashboards, the ELK Stack gives you visibility and observability into your applications.