Background: when an online service misbehaves, the first instinct is to check the server logs. In a traditional setup, however, logs are written to log files on each server, which makes them awkward to read. To get a better view of production logs I previously set up an ELK cluster; Kibana's Discover does let you browse them, but the output is cluttered and it is hard to quickly pin down a single request. So I replaced ELK with plumelog: it also writes logs into Elasticsearch, but its UI is cleaner and queries are much easier.
See the plumelog official site; following the official documentation is enough to get it running. Below are my integration files, based on logback with configuration pulled from the config center.
plumelog-server:
1.pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<groupId>com.plumelog</groupId>
<artifactId>plumelog</artifactId>
<version>3.5</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>plumelog-server</artifactId>
<name>plumelog-server</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<redis.version>3.1.0</redis.version>
<kafka.version>2.5.0</kafka.version>
<springboot.version>2.6.3</springboot.version>
<elasticsearch.version>7.7.0</elasticsearch.version>
<spring-cloud.version>2021.0.1</spring-cloud.version>
<spring-cloud-alibaba.version>2021.0.1.0</spring-cloud-alibaba.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>2.6.3</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>${spring-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>com.alibaba.cloud</groupId>
<artifactId>spring-cloud-alibaba-dependencies</artifactId>
<version>${spring-cloud-alibaba.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.alibaba.cloud</groupId>
<artifactId>spring-cloud-starter-alibaba-nacos-config</artifactId>
</dependency>
<dependency>
<groupId>com.alibaba.cloud</groupId>
<artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-bootstrap</artifactId>
</dependency>
<dependency>
<groupId>com.taobao.top</groupId>
<artifactId>lippi-oapi-encrpt</artifactId>
<version>dingtalk-SNAPSHOT</version>
<scope>system</scope>
<systemPath>${project.basedir}/lib/taobao-sdk-java-auto_1479188381469-20200701.jar</systemPath>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-client</artifactId>
<version>7.7.0</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jetty</artifactId>
</dependency>
<dependency>
<groupId>net.sourceforge.nekohtml</groupId>
<artifactId>nekohtml</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.62</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.plumelog</groupId>
<artifactId>plumelog-logback</artifactId>
<version>${version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queryparser</artifactId>
<version>7.7.3</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-highlighter</artifactId>
<version>7.7.3</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-grouping</artifactId>
<version>7.7.3</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>1.3.2</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers-smartcn</artifactId>
<version>7.7.3</version>
</dependency>
<dependency>
<groupId>de.codecentric</groupId>
<artifactId>spring-boot-admin-starter-server</artifactId>
<exclusions>
<exclusion>
<groupId>io.projectreactor.netty</groupId>
<artifactId>reactor-netty</artifactId>
</exclusion>
</exclusions>
<version>2.6.3</version>
</dependency>
<dependency>
<groupId>io.projectreactor.netty</groupId>
<artifactId>reactor-netty</artifactId>
<version>0.9.10.RELEASE</version>
</dependency>
<dependency>
<groupId>com.plumelog</groupId>
<artifactId>plumelog-lucene</artifactId>
<version>3.5</version>
<scope>compile</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<version>2.8.2</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.6.3</version>
<configuration>
<fork>true</fork>
<jvmArguments>-Dfile.encoding=UTF-8</jvmArguments>
<includeSystemScope>true</includeSystemScope>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Maven resource copy plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.7</version>
<executions>
<execution>
<id>copy-config</id>
<phase>package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/</outputDirectory>
<resources>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>**/*.xml</include>
<include>**/*.conf</include>
<include>**/*.properties</include>
<include>**/*.sh</include>
<include>**/*.bat</include>
</includes>
<filtering>true</filtering>
</resource>
</resources>
<encoding>UTF-8</encoding>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<finalName>plumelog-server</finalName>
<excludes>
<exclude>*.properties</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</project>
2.bootstrap.yaml:
admin:
  log:
    keepDays: 30
  trace:
    keepDays: 30
  # Admin password, required when manually deleting logs
  password: 123456
  # Login settings; once configured, a login page is shown
  login:
    password: admin
    username: admin
plumelog:
  # Elasticsearch settings. esHosts may include a protocol (http/https); separate cluster nodes with commas. In lite mode this whole block can be commented out.
  es:
    esHosts: https://192.168.0.28:29200
    indexType:
      model: day
    refresh:
      interval: 30s
    replicas: 1
    shards: 5
    # Whether to trust self-signed certificates
    trustSelfSigned: true
    userName: elastic
    passWord: elastic
    # Whether to verify the hostname
    #hostnameVerification: false
  # Pull interval (ms); not effective when using kafka
  interval: 100
  # Number of log entries per pull
  maxSendSize: 500
  model: redis
  queue:
    redis:
      redisDb: 0
      redisHost: 192.168.0.143:30360
      redisPassWord: wanfu!@#
  ui:
    url: http://demo.plumelog.com
server:
  port: 8892
spring:
  application:
    name: plumelog-server
  boot:
    admin:
      context-path: admin
  mvc:
    static-path-pattern: /plumelog/**
    view:
      prefix: classpath:/templates/
      suffix: .html
  profiles:
    active: fat
  thymeleaf:
    mode: LEGACYHTML5
  cloud:
    nacos:
      server-addr: ${NACOS_ADDRESS:192.168.0.143:8848}
      username: ${NACOS_USERNAME:nacos}
      password: ${NACOS_PASSWORD:wanfu!@#}
      config:
        namespace: ${spring.profiles.active}
        group: WF_GROUP
        file-extension: yaml
      discovery:
        namespace: ${spring.profiles.active}
        group: WF_GROUP
3. logback-spring.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
    <!-- Log file storage path; do not use relative paths in the Logback configuration -->
    <property name="LOG_HOME" value="/log" />
    <!-- Console output -->
    <!-- Colored logs -->
    <!-- Converter classes required for colored logs -->
    <conversionRule conversionWord="clr" converterClass="org.springframework.boot.logging.logback.ColorConverter" />
    <conversionRule conversionWord="wex" converterClass="org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter" />
    <conversionRule conversionWord="wEx" converterClass="org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter" />
    <!-- Colored log pattern -->
    <property name="CONSOLE_LOG_PATTERN" value="${CONSOLE_LOG_PATTERN:-%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
    <!-- Output to the console -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <Pattern>${CONSOLE_LOG_PATTERN}</Pattern>
            <!-- Character set -->
            <charset>UTF-8</charset>
        </encoder>
    </appender>
    <!-- Roll over to a new log file every day -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <filter class="com.plumelog.logback.util.FilterSyncLogger">
            <level>info</level>
            <filterPackage>com.plumelog.trace.aspect.AbstractAspect</filterPackage>
        </filter>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <FileNamePattern>logs/plumelog-server.log.%d{yyyy-MM-dd}.log</FileNamePattern>
            <MaxHistory>3</MaxHistory>
        </rollingPolicy>
        <encoder>
            <Pattern>${CONSOLE_LOG_PATTERN}</Pattern>
            <!-- Character set -->
            <charset>UTF-8</charset>
        </encoder>
    </appender>
    <!-- Log output level per profile -->
    <springProfile name="dev">
        <root level="INFO">
            <appender-ref ref="CONSOLE" />
            <appender-ref ref="FILE" />
        </root>
    </springProfile>
    <springProfile name="fat">
        <root level="INFO">
            <appender-ref ref="CONSOLE" />
            <appender-ref ref="FILE" />
        </root>
    </springProfile>
    <springProfile name="pro">
        <root level="INFO">
            <appender-ref ref="CONSOLE" />
            <appender-ref ref="FILE" />
        </root>
    </springProfile>
</configuration>
4. Deploy plumelog-server with docker-compose
4.1 Dockerfile:
FROM openjdk:11.0.12-jre-buster
ADD application.jar /application.jar
ADD bootstrap.yaml /bootstrap.yaml
ADD logback-spring.xml /logback-spring.xml
CMD ["java", "-jar", "-Dfile.encoding=UTF-8", "/application.jar"]
4.2 docker-compose.yaml:
version: "3"
services:
plume-server-fat:
image: fat/com.wf/plume-service
container_name: fat-plume-server
ports:
- 28891:8891
restart: always
Build the image: docker build -t fat/com.wf/plume-service .
Start the service: docker-compose up -d
Open the UI: http://192.168.0.28:28891
5. Configure the client:
5.1 Add the required dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-aop</artifactId>
    <version>2.6.3</version>
</dependency>
<!-- The admin client version must match the server side -->
<dependency>
    <groupId>de.codecentric</groupId>
    <artifactId>spring-boot-admin-starter-client</artifactId>
    <version>2.6.3</version>
</dependency>
<dependency>
    <groupId>com.plumelog</groupId>
    <artifactId>plumelog-logback</artifactId>
    <version>3.5.3</version>
</dependency>
<!-- Provides the traceId so that all logs of one request can be queried together -->
<dependency>
    <groupId>com.plumelog</groupId>
    <artifactId>plumelog-trace</artifactId>
    <version>3.5.3</version>
</dependency>
// Add the package scan path on the startup class: @ComponentScan({"com.plumelog","com.wf.admin.**"})
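For reference, a minimal sketch of what the startup class could look like with that scan in place (the class name and package here are placeholders, not the author's actual code):

package com.wf.admin;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

// Scan plumelog's packages (so its appender/trace beans are picked up)
// alongside the application's own packages.
@SpringBootApplication
@ComponentScan({"com.plumelog", "com.wf.admin"})
public class AdminApplication {
    public static void main(String[] args) {
        SpringApplication.run(AdminApplication.class, args);
    }
}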
5.2 Add the redis binding configuration to the application config:
plumelog:
  redis:
    appName: wf-us168-admin-service
    redisHost: 192.168.0.143:30360
    redisAuth: wanfu!@#
    redisDb: 0
5.3 logback-spring.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="60 seconds" debug="false">
    <!-- Based on Spring Boot's default logback configuration, with an additional error log file -->
    <!-- org/springframework/boot/logging/logback/base.xml -->
    <!-- <include resource="org/springframework/boot/logging/logback/defaults.xml"/>-->
    <conversionRule conversionWord="clr" converterClass="org.springframework.boot.logging.logback.ColorConverter"/>
    <conversionRule conversionWord="wex" converterClass="org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter"/>
    <conversionRule conversionWord="wEx" converterClass="org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter"/>
    <springProperty scope="context" name="service_name" source="spring.application.name"/>
    <property name="LOG_PATH" value="${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}}"/>
    <property name="CONSOLE_LOG_PATTERN" value="${CONSOLE_LOG_PATTERN:-%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
    <property name="FILE_LOG_PATTERN" value="${FILE_LOG_PATTERN:-%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}} ${LOG_LEVEL_PATTERN:-%5p} ${PID:- } --- [%t] %-40.40logger{39} : %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
    <!-- Console log -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_PATH}/${service_name}.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_PATH}/${service_name}-%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>${LOG_FILE_MAX_HISTORY:-0}</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>
    <!-- Environment-specific properties bound from the Spring configuration -->
    <springProperty scope="context" name="plumelog.appName" source="plumelog.redis.appName"/>
    <springProperty scope="context" name="plumelog.redisHost" source="plumelog.redis.redisHost"/>
    <springProperty scope="context" name="plumelog.redisAuth" source="plumelog.redis.redisAuth"/>
    <springProperty scope="context" name="plumelog.redisDb" source="plumelog.redis.redisDb"/>
    <springProperty scope="context" name="plumelog.env" source="spring.profiles.active"/>
    <!-- plumelog output (Redis appender) -->
    <appender name="plumelog" class="com.plumelog.logback.appender.RedisAppender">
        <appName>${plumelog.appName}</appName>
        <redisHost>${plumelog.redisHost}</redisHost>
        <redisAuth>${plumelog.redisAuth}</redisAuth>
        <redisDb>${plumelog.redisDb}</redisDb>
        <env>${plumelog.env}</env>
    </appender>
    <!-- Log output configuration: INFO level only; keep console and plumelog output -->
    <!-- Typically: local development logs only to the console, the test environment only to plumelog, production to local files plus plumelog; with plumelog in place the local files only need to be kept for about 3 days -->
    <!-- Different appender-refs can be loaded depending on the active profile -->
    <!-- Output to the console -->
    <springProfile name="dev">
        <root level="INFO">
            <appender-ref ref="CONSOLE"/>
            <!-- Output to file -->
            <appender-ref ref="FILE"/>
            <!-- Output to plumelog -->
            <appender-ref ref="plumelog"/>
        </root>
    </springProfile>
    <springProfile name="fat">
        <root level="INFO">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="plumelog"/>
            <appender-ref ref="FILE"/>
        </root>
    </springProfile>
    <springProfile name="test">
        <root level="INFO">
            <appender-ref ref="FILE" />
        </root>
    </springProfile>
    <springProfile name="prod">
        <root level="INFO">
            <appender-ref ref="FILE" />
            <appender-ref ref="plumelog"/>
        </root>
    </springProfile>
</configuration>
5.4 Configure AOP (I set a global traceId here; to set a traceId for a specific request you can use the @Trace annotation).
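The AOP code itself is not included in the post; below is a minimal sketch of one way to do it, following the shape shown in plumelog's documentation. The class names and the com.wf.admin pointcut package are placeholders, and the plumelog APIs used here (com.plumelog.trace.aspect.AbstractAspect, com.plumelog.core.TraceId.logTraceID) should be verified against the plumelog version you actually run:

import com.plumelog.core.TraceId;
import com.plumelog.trace.aspect.AbstractAspect;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.util.UUID;

// Call-chain tracing: hand matched method calls to plumelog's trace aspect.
// "com.wf.admin" is a placeholder for your own base package.
@Aspect
@Component
class PlumelogTraceAspect extends AbstractAspect {
    @Around("within(com.wf.admin..*)")
    public Object around(JoinPoint joinPoint) throws Throwable {
        return aroundExecute(joinPoint);
    }
}

// Global traceId: generate a short id for every incoming request so that all
// log lines written while handling it can be grouped in the plumelog UI.
@Component
class TraceIdInterceptor implements HandlerInterceptor {
    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        String uuid = UUID.randomUUID().toString().replace("-", "");
        // TraceId.logTraceID is plumelog's thread-bound traceId holder (assumption: verify for your version)
        TraceId.logTraceID.set(uuid.substring(uuid.length() - 7));
        return true;
    }
}

@Configuration
class TraceIdWebConfig implements WebMvcConfigurer {
    private final TraceIdInterceptor traceIdInterceptor;

    TraceIdWebConfig(TraceIdInterceptor traceIdInterceptor) {
        this.traceIdInterceptor = traceIdInterceptor;
    }

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        // Register for all paths so every request gets its own traceId
        registry.addInterceptor(traceIdInterceptor).addPathPatterns("/**");
    }
}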
Call-chain tracing only works some of the time; I still need to look into this.
Start the project, send a request, and check the plumelog UI.
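For a quick end-to-end check, a throwaway endpoint like the hypothetical one below is enough: call it once and the log lines should show up in the plumelog UI, grouped under one traceId (the class name, path, and messages are made up for illustration):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Writes a few log lines so the plumelog pipeline can be verified from the UI.
@RestController
public class LogSmokeTestController {
    private static final Logger log = LoggerFactory.getLogger(LogSmokeTestController.class);

    @GetMapping("/log-test")
    public String logTest() {
        log.info("plumelog smoke test: info line");
        log.warn("plumelog smoke test: warn line");
        return "ok";
    }
}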