In a Spring Boot microservice architecture, log aggregation is an important concern. Logs from multiple Spring Boot services can be collected by an aggregation system such as the ELK stack, where they can be stored and monitored in one place.
In this article, we will learn how to configure the ELK stack and monitor Spring Boot application logs. ELK stands for Elasticsearch, Logstash, and Kibana.
We will set up the ELK servers and then aggregate the log output generated by a Spring Boot application.
Following are the version details:
- Java: 1.8
- Spring Boot: 2.2.5.RELEASE
- Elasticsearch: 7.6.1
- Logstash: 7.6.1
- Kibana: 7.6.1
Let’s begin! 🙂
Table of Contents
- Configure Elasticsearch
- Configure Kibana
- Configure Logstash
- Create a Spring Boot application
- Testing the application
- Conclusion
Configure Elasticsearch
Download the Elasticsearch ZIP file from the official website to set up the Elasticsearch server on a Windows system. Click the download link on the website and the ZIP file will be downloaded.

Unzip the file, navigate to the /bin folder, and start the Elasticsearch server by opening a command prompt and running the elasticsearch.bat file.

The server starts on the default port 9200 and can be accessed at http://localhost:9200.
If the server starts without any issues, we should see a screen like the one below.

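As a quick check, requesting the root endpoint in a browser or with curl should also return a JSON document describing the node. The sketch below shows the typical shape of the response for version 7.6.1; the node name, cluster UUID, and additional build fields (omitted here) will differ on each installation.

{
  "name" : "MY-WINDOWS-PC",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "some-generated-uuid",
  "version" : {
    "number" : "7.6.1"
  },
  "tagline" : "You Know, for Search"
}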
Configure Kibana
Download the Kibana server from the official website. Similar to the Elasticsearch download, we can download the Zip file of the Kibana server.
Unzip the file and navigate to the /bin folder. Open a command prompt and execute kibana.bat, as shown below.

The Kibana server starts on port 5601, and we can access it at http://localhost:5601.
We should be able to see the following screen.

Configure Logstash
Download the latest Logstash distribution from the official website. As with Elasticsearch and Kibana, download the ZIP file and unzip it.
We need to configure the Logstash server to forward the Spring Boot application logs to Elasticsearch.
Adding Logstash configuration
Navigate to the {Logstash}/config folder, create a new configuration file named logstash-custom-config.conf, and add the following content.
input {
  file {
    type => "java"
    path => "D:/elk_logging.log"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}

filter {
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
The Logstash configuration file may contain the following configuration parts.
input: Defines the Logstash inputs. Multiple input plugins can be configured. In this example, we read the Spring Boot application log from a log file; the file location is specified in the path setting, and the multiline codec groups stack-trace lines with the log line they belong to.
filter: Transforms the incoming log events. Here, the date filter attempts to parse a timestamp field and use it as the event's timestamp.
output: Defines where Logstash sends the processed events; here they are printed to the console (stdout) and pushed to the Elasticsearch server. Since no index name is specified, Logstash writes to its default indices whose names start with logstash-, which is why the logstash* index pattern works when we configure Kibana later.
Create a Spring Boot application
The ELK stack configuration is now ready. Next, we create a Spring Boot application and log some messages to verify end-to-end ELK logging.
Create a simple Spring Boot application with the spring-boot-starter-web dependency.
Modify the Spring Boot main application class to add a REST HTTP GET endpoint, and log a message so that it can be passed to the ELK stack.
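For reference, if the project is built with Maven, the starter can be declared in pom.xml as shown below (a Gradle build works just as well; this is only a sketch of the dependency, with its version managed by the Spring Boot parent).

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>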
package com.asbnotebook;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class SpringBootElkExampleApplication {

    private static Log log = LogFactory.getLog(SpringBootElkExampleApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(SpringBootElkExampleApplication.class, args);
    }

    @GetMapping("/test")
    public String testELK() {
        log.info("Inside test ELK!!");
        return "Hello!!";
    }
}
Add the following property to the application.properties file. We specify the same log file path that was used in the Logstash configuration file earlier.
logging.file.name=D:/elk_logging.log
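With this property in place, Spring Boot writes its log output to D:/elk_logging.log using its default log pattern, whose lines start with an ISO-8601-style timestamp. That is what the multiline codec pattern ^%{TIMESTAMP_ISO8601} in the Logstash configuration matches on. A typical line looks roughly like the one below (the timestamp, process id, and thread name are illustrative):

2020-03-20 10:15:30.123  INFO 12345 --- [nio-8080-exec-1] c.a.SpringBootElkExampleApplication      : Inside test ELK!!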
Testing the application
It’s time to test the application. 🙂
Start the Spring Boot application
Start the Spring Boot application and call the GET HTTP endpoint /test by accessing http://localhost:8080/test; it should return Hello!!.

We should be able to observe the log message written to the log file.

Start the Logstash server
Start the Logstash server by executing the following command from the {Logstash}/bin folder.
logstash.bat -f ..\config\logstash-custom-config.conf

The Logstash server starts its monitoring API on port 9600, which can be accessed at http://localhost:9600/.
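Requesting http://localhost:9600/ returns the Logstash node information as JSON. The sketch below shows roughly what to expect for version 7.6.1; the host name and other field values are illustrative, and the exact set of fields may vary slightly by version.

{
  "host" : "MY-WINDOWS-PC",
  "version" : "7.6.1",
  "http_address" : "127.0.0.1:9600",
  "status" : "green",
  "snapshot" : false
}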
Configuring the Kibana dashboard
Open Kibana at http://localhost:5601 and click on the Discover icon. Enter the text logstash* in the index pattern input.

Click on the Next Step button.
On the next screen, select the @timestamp option as the time filter field and click on the Create Index Pattern button.

We will see the created index pattern page.

Click on the Discover icon again to view the application log details.

We can observe the application log details on the Kibana server dashboard. 🙂
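To narrow the Discover view down to the line logged by our /test endpoint, a query can be typed into the Kibana search bar. A minimal sketch, assuming the log line lands in the default message field and using KQL syntax:

message : "Inside test ELK"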
Conclusion
In this post, we learned the basics of configuring the ELK stack to aggregate Spring Boot application logs. We also walked through the Kibana configuration steps to create a log search index pattern.
The sample code is available on the GitHub repository.