Code Examples: Integrating ELK with Spring Boot for Data Storage and Log Management
Abstract
This article walks through integrating Elasticsearch, Logstash, and Kibana with Spring Boot 2.7.18. It briefly demonstrates the Elasticsearch create/read/update/delete API, tests log collection through Logstash, and builds a Kibana data view to visualize and analyze the logs.
Example steps
1) Start the ELK services in Docker
In my setup, Logstash listens at 192.168.233.129:4560.
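If you don't already have an ELK stack running, the docker-compose sketch below is one way to bring up a minimal single-node stack for local testing. The image versions, ports, and the disabled-security setting are assumptions rather than the exact setup used in this article; adjust them (and add authentication) to match your environment.
# docker-compose.yml (sketch): minimal single-node ELK for local testing
version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.14.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false   # for local testing only; enable security and set a password otherwise
      - ES_JAVA_OPTS=-Xms512m -Xmx512m
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.14.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf   # pipeline sketch shown in step 7
    ports:
      - "4560:4560"   # TCP input that the logback appender in step 7 pushes to
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:8.14.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch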

2) Add the dependencies
<dependencies>
    <!-- I'm on Spring Boot 2.7.18 here; against an Elasticsearch 9.0.3 server the save call reports an error, but the operation still goes through -->
    <!-- Spring Data Elasticsearch -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
    </dependency>
    <!-- Elasticsearch Rest High Level Client -->
    <dependency>
        <groupId>org.elasticsearch.client</groupId>
        <artifactId>elasticsearch-rest-high-level-client</artifactId>
    </dependency>
    <!-- Logstash integration: these versions matter, otherwise you will hit errors -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>1.2.11</version>
    </dependency>
    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>6.6</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
    </dependency>
</dependencies>
3) Configure the Elasticsearch connection (the RestHighLevelClient approach is reportedly deprecated and is shown for reference only; a sketch of the newer client follows the application.yml below)
package org.coffeebeans.config;

import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate;

@Configuration
public class ElasticsearchConfig {

    @Value("${spring.data.elasticsearch.rest.uris}")
    private String elasticsearchUrl;

    @Value("${spring.data.elasticsearch.rest.username}")
    private String username;

    @Value("${spring.data.elasticsearch.rest.password}")
    private String password;

    @Bean
    public RestHighLevelClient restHighLevelClient() {
        // basic-auth credentials taken from application.yml
        final CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
        credentialsProvider.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials(username, password));
        // build the client from the URI configured in application.yml (e.g. http://192.168.233.129:9200)
        return new RestHighLevelClient(
                RestClient.builder(HttpHost.create(elasticsearchUrl))
                        .setHttpClientConfigCallback(httpClientBuilder ->
                                httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider)));
    }

    @Bean
    public ElasticsearchOperations elasticsearchTemplate(RestHighLevelClient client) {
        return new ElasticsearchRestTemplate(client);
    }
}
application.yml
spring:
  data:
    elasticsearch:
      rest:
        uris: http://192.168.233.129:9200   # Elasticsearch address
        username: elastic
        password: your_password
        connection-timeout: 1000   # connection timeout
        read-timeout: 3000         # request timeout
      repositories:
        enabled: true   # enable Spring Data Elasticsearch repositories
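Since the RestHighLevelClient used above is deprecated in newer Elasticsearch releases, the replacement is the official Java API client (co.elastic.clients:elasticsearch-java). The sketch below shows roughly what that setup looks like; note that the Spring Data Elasticsearch version bundled with Boot 2.7 still builds on the RestHighLevelClient, so this path mainly applies if you call ES directly or move to Spring Boot 3 / Spring Data Elasticsearch 5. The address is the same assumption as above.
// Sketch only: the newer official client, as an alternative to RestHighLevelClient
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.transport.ElasticsearchTransport;
import co.elastic.clients.transport.rest_client.RestClientTransport;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;

public class NewEsClientConfigSketch {

    public ElasticsearchClient elasticsearchClient() {
        // low-level REST client pointing at the same ES address
        RestClient restClient = RestClient.builder(HttpHost.create("http://192.168.233.129:9200")).build();
        // transport with a Jackson mapper, then the typed client on top
        ElasticsearchTransport transport = new RestClientTransport(restClient, new JacksonJsonpMapper());
        return new ElasticsearchClient(transport);
    }
}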
4) Create an entity class mapped to the ES index
package org.coffeebeans.entity;

import com.fasterxml.jackson.annotation.JsonFormat;
import lombok.Data;
import org.elasticsearch.common.geo.GeoPoint;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.*;

import java.math.BigDecimal;
import java.util.Date;

/**
 * <li>ClassName: Book </li>
 * <li>Author: OakWang </li>
 * Maps this entity to an Elasticsearch index.
 */
@Data
@Document(indexName = "book") // indexName sets the index name; by default the index is created when the repository bootstraps
public class Book {

    // Marks the primary-key field, which maps to the ES _id
    @Id
    private Integer id;

    // type sets the field type; analyzer picks the analyzer/normalizer - here the IK analyzer (the ES plugin must be installed)
    // Common FieldType values: Text, Keyword, Integer, Long, Double, Date, Boolean, Object, Nested, Ip
    @Field(type = FieldType.Text, analyzer = "ik_max_word")
    private String title;

    @Field(type = FieldType.Keyword)
    private String author;

    @Field(type = FieldType.Text, analyzer = "ik_max_word")
    private String content;

    @Field(type = FieldType.Double)
    private BigDecimal price;

    // Marks a geo-point field (latitude/longitude)
    @GeoPointField
    private GeoPoint location;

    @CreatedDate
    @Field(type = FieldType.Date, format = DateFormat.date_hour_minute_second) // format picks a built-in date format
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8") // make sure this matches the date format in the index
    private Date createTime;

    /*
    Equivalent RESTful request using the IK analyzer (the IK plugin must already be installed in ES):
    PUT /book
    {
      "settings": {
        "number_of_shards": 2,
        "number_of_replicas": 1,
        "analysis": {
          "analyzer": {
            "ik_max_word": {
              "type": "custom",
              "tokenizer": "ik_max_word"
            }
          }
        }
      },
      "mappings": {
        "properties": {
          "title": {
            "type": "text",
            "analyzer": "ik_max_word"
          },
          "author": {
            "type": "keyword"
          },
          "content": {
            "type": "text",
            "analyzer": "ik_max_word"
          },
          "price": {
            "type": "double"
          },
          "location": {
            "type": "geo_point"
          },
          "createTime": {
            "type": "date",
            "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss||strict_date_optional_time"
          }
        }
      }
    }
    */

    /*
    Equivalent request using the built-in analyzers:
    PUT /book
    {
      "settings": {
        "number_of_shards": 2,
        "number_of_replicas": 1
      },
      "mappings": {
        "properties": {
          "title": { "type": "text", "analyzer": "standard" },
          "author": { "type": "keyword" },
          "content": { "type": "text", "analyzer": "standard" },
          "price": { "type": "double" },
          "location": { "type": "geo_point" },
          "createTime": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss||strict_date_optional_time" }
        }
      }
    }
    */
}
5) Define the Repository
package org.coffeebeans.mapper;

import org.coffeebeans.entity.Book;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.stereotype.Repository;

/**
 * <li>ClassName: BookRepository </li>
 * <li>Author: OakWang </li>
 */
@Repository
public interface BookRepository extends ElasticsearchRepository<Book, String> {
    // Custom query methods can be declared here (queries are derived from the method names) - I haven't tried this step; see the sketch below
    // Note: Book#id is an Integer while the repository ID type is String; Spring Data converts between them, but keeping the two aligned would be cleaner
}
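For the derived-query feature mentioned in that comment (untested in this article), the sketch below shows what such methods could look like if added to BookRepository. The method names and fields are illustrative; Spring Data Elasticsearch builds the queries from the method names alone, so no implementation is needed.
package org.coffeebeans.mapper;

import java.math.BigDecimal;
import java.util.List;

import org.coffeebeans.entity.Book;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

// Illustrative derived-query methods (sketch, not part of the original article)
public interface BookRepository extends ElasticsearchRepository<Book, String> {
    List<Book> findByAuthor(String author);                        // term query on the keyword field "author"
    List<Book> findByTitleContaining(String keyword);              // match query on the analyzed "title" field
    List<Book> findByPriceBetween(BigDecimal min, BigDecimal max); // range query on "price"
}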
6) The business-logic Service class
package org.coffeebeans.service;

import lombok.extern.slf4j.Slf4j;
import org.coffeebeans.entity.Book;
import org.coffeebeans.mapper.BookRepository;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightField;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/**
 * <li>ClassName: BookService </li>
 * <li>Author: OakWang </li>
 */
@Slf4j
@Service
public class BookService {

    @Autowired
    private RestHighLevelClient restHighLevelClient;

    @Autowired
    private BookRepository bookRepository;

    // Save
    public Book save(Book book) {
        return bookRepository.save(book);
    }

    // Find by id
    public Book findById(String id) {
        return bookRepository.findById(id).orElse(null);
    }

    // Delete by id
    public void deleteById(String id) {
        bookRepository.deleteById(id);
    }

    // Search with highlighting
    public List<Map<String, Object>> searchWithHighlight(String keyword) {
        // 1. Build the query
        SearchSourceBuilder sourceBuilder = new SearchSourceBuilder();
        sourceBuilder.query(QueryBuilders.matchQuery("title", keyword));
        // 2. Configure the highlighted field and its markup
        HighlightBuilder highlightBuilder = new HighlightBuilder();
        highlightBuilder
                .field("title")                        // the field to highlight
                .preTags("<span style= 'color:red'>")  // tag inserted before each matched term
                .postTags("</span>");                  // tag inserted after each matched term
        sourceBuilder.highlighter(highlightBuilder);
        // 3. Build the search request
        SearchRequest searchRequest = new SearchRequest("book");
        searchRequest.source(sourceBuilder);
        List<Map<String, Object>> results = new ArrayList<>();
        try {
            // 4. Execute the search
            SearchResponse searchResponse = restHighLevelClient.search(searchRequest, RequestOptions.DEFAULT);
            // 5. Process the highlight results
            for (SearchHit hit : searchResponse.getHits()) {
                Map<String, Object> sourceAsMap = hit.getSourceAsMap();                 // the original document
                Map<String, HighlightField> highlightFields = hit.getHighlightFields(); // the highlighted fragments
                // Replace the original value with the highlighted fragment
                if (highlightFields.containsKey("title")) {
                    HighlightField highlightField = highlightFields.get("title");
                    if (highlightField.getFragments() != null && highlightField.getFragments().length > 0) {
                        sourceAsMap.put("title", highlightField.getFragments()[0].string());
                    }
                }
                results.add(sourceAsMap);
            }
        } catch (IOException e) {
            log.error("Highlight search failed", e);
        }
        return results;
    }
}
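As an aside, the ElasticsearchOperations bean defined in step 3 is never used by this service; the calls go through the repository and the low-level client. The sketch below is a hedged illustration of that third access path (class and method names are made up here), running the same match query without highlighting.
package org.coffeebeans.service;

import java.util.List;
import java.util.stream.Collectors;

import org.coffeebeans.entity.Book;
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHit;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.query.NativeSearchQuery;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.stereotype.Service;

// Illustrative sketch: the same match query via ElasticsearchOperations instead of the low-level client
@Service
public class BookSearchExample {

    private final ElasticsearchOperations operations;

    public BookSearchExample(ElasticsearchOperations operations) {
        this.operations = operations;
    }

    public List<Book> searchByTitle(String keyword) {
        NativeSearchQuery query = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.matchQuery("title", keyword))
                .build();
        SearchHits<Book> hits = operations.search(query, Book.class);
        return hits.getSearchHits().stream()
                .map(SearchHit::getContent)
                .collect(Collectors.toList());
    }
}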
7) Wire up Logstash
Define the logging configuration that ships logs to Logstash in logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <property name="log.path" value="../logs/elasticsearch-use"/>
    <property name="console.log.pattern"
              value="%red(%d{yyyy-MM-dd HH:mm:ss.SSS}) %green([%thread]) %highlight(%-5level) %boldMagenta(%logger{36}%n) - %msg%n"/>
    <property name="log.pattern" value="%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"/>
    <!-- Push logs to Logstash -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>192.168.233.129:4560</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" charset="UTF-8"/>
    </appender>
    <!-- Console output -->
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <pattern>${console.log.pattern}</pattern>
            <charset>utf-8</charset>
        </encoder>
    </appender>
    <!-- Application logging -->
    <root level="info">
        <appender-ref ref="LOGSTASH"/>
        <appender-ref ref="console"/>
    </root>
</configuration>
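For reference, the Logstash side needs a pipeline that listens on the TCP port used as the destination above and forwards the events to Elasticsearch. The sketch below is an assumption about that pipeline rather than the article's actual configuration; the index name, ES address, and any credentials must match your environment. In Kibana (step 9), the matching index pattern would then be something like springboot-logs-*.
# logstash.conf (sketch): receive JSON log events from logback over TCP and index them into Elasticsearch
input {
  tcp {
    port  => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["http://192.168.233.129:9200"]
    index => "springboot-logs-%{+YYYY.MM.dd}"
  }
}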
8) Run the test class and write some logs
package org.coffeebeans;

import lombok.extern.slf4j.Slf4j;
import org.coffeebeans.entity.Book;
import org.coffeebeans.service.BookService;
import org.elasticsearch.common.geo.GeoPoint;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

import java.math.BigDecimal;
import java.util.Date;

/**
 * <li>ClassName: org.coffeebeans.ESTest </li>
 * <li>Author: OakWang </li>
 */
@Slf4j
@SpringBootTest
public class ESTest {

    @Autowired
    private BookService bookService;

    @Test
    void test1() {
        Book book = new Book();
        book.setId(1);
        book.setAuthor("作者");
        book.setPrice(new BigDecimal("30.00"));
        book.setLocation(new GeoPoint(40.12, -71.34));
        book.setTitle("你是一個牛馬");
        book.setContent("賺錢生活學(xué)習(xí)");
        book.setCreateTime(new Date());
        bookService.save(book); // With mismatched ES server and client versions this reports "Unable to parse response body for Response", but the document is still indexed
        /*
        Equivalent request:
        PUT /book/_doc/1
        {
          "id": 1,
          "author": "作者",
          "price": 30.00,
          "location": {
            "lat": 40.12,
            "lon": -71.34
          },
          "title": "你是一個牛馬",
          "content": "賺錢生活學(xué)習(xí)",
          "createTime": "2025-07-20 12:30:39"
        }
        */
    }

    @Test
    void test2() {
        log.info("Result: " + bookService.findById("1"));
        /*
        Result: Book(id=1, title=你是一個牛馬, author=作者, content=賺錢生活學(xué)習(xí), price=30.0, location=40.12, -71.34, createTime=Mon Jul 20 12:30:39 CST 2025)
        Equivalent request: GET /book/_doc/1
        Returns: {"id":1,"title":"你是一個牛馬","author":"作者","content":"賺錢生活學(xué)習(xí)","price":30.0,"location":{"lat":40.12,"lon":-71.34,"geohash":"drjk0xegcw06","fragment":true},"createTime":"2025-07-20 12:30:39"}
        */
    }

    @Test
    void test3() {
        log.info("Result: " + bookService.searchWithHighlight("你是一個牛馬"));
        /*
        Result: [{createTime=2025-07-28T14:09:39, author=作者, price=30.0, location={lon=-71.34, lat=40.12}, _class=org.coffeebeans.entity.Book, id=1, title=<span style= 'color:red'>你</span><span style= 'color:red'>是</span><span style= 'color:red'>一個</span><span style= 'color:red'>牛馬</span>, content=賺錢生活學(xué)習(xí)}]
        Equivalent request:
        GET /book/_search
        {
          "query": {
            "match": {
              "title": {
                "query": "你是一個牛馬",
                "analyzer": "ik_max_word"
              }
            }
          },
          "highlight": {
            "pre_tags": ["<span style='color:red'>"],
            "post_tags": ["</span>"],
            "fields": {
              "title": {}
            }
          }
        }
        Returns: [{"createTime":"2025-07-20T12:30:39","author":"作者","price":30.0,"location":{"lon":-71.34,"lat":40.12},"_class":"org.coffeebeans.entity.Book","id":1,"title":"<span style= 'color:red'>你</span><span style= 'color:red'>是</span><span style= 'color:red'>一個</span><span style= 'color:red'>牛馬</span>","content":"賺錢生活學(xué)習(xí)"}]
        */
    }

    @Test
    void test4() {
        bookService.deleteById("1"); // With mismatched ES server and client versions this reports "Unable to parse response body for Response", but the document is still deleted
        /*
        Equivalent request:
        DELETE /book/_doc/1
        */
    }

    @Test
    void test5() {
        log.info("Result: a test log message from Spring Boot");
    }
}
9) Create a data view in Kibana to manage the logs
Make sure the logs have actually been written first; only then will an index pattern match a data source so you can build and manage the data view.


Open Discover to filter and explore the logs.

Summary
We have now seen how to integrate ELK with Spring Boot 2.7.18: calling the Elasticsearch API, collecting logs with Logstash, storing them in Elasticsearch, and visualizing and analyzing them in Kibana. The common pitfalls are version compatibility, how the log-shipping integration is written, port connectivity, and the Elasticsearch configuration.