
Integrating Log4j2 with Kafka

Applications routinely write logs, and with Log4j2's KafkaAppender those logs can be shipped directly to Kafka.

POM dependencies

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.11.1</version>
</dependency>

Log4j2 configuration

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info">
    <Appenders>
        <!-- Console output -->
        <Console name="STDOUT" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %4level %t - %m%n"/>
        </Console>
        <!-- Write to Kafka -->
        <Kafka name="kafkaLog" topic="topic_request_log" ignoreExceptions="false">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %4level %t - %m%n"/>
            <Property name="bootstrap.servers">localhost:9092</Property>
        </Kafka>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="STDOUT"/>
            <AppenderRef ref="kafkaLog"/>
        </Root>
        <!-- Keep Kafka's own client logging at INFO or above so it does not log recursively through the Kafka appender -->
        <Logger name="org.apache.kafka" level="INFO"/>
    </Loggers>
</Configuration>
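The nested `<Property>` elements are passed through to the underlying KafkaProducer, so standard producer settings can be tuned in the same place; the appender also accepts a `syncSend` attribute (Log4j 2.8+) to send asynchronously. A hedged sketch of a tuned appender (the extra property values here are illustrative assumptions, not requirements):

```xml
<Kafka name="kafkaLog" topic="topic_request_log" ignoreExceptions="false" syncSend="false">
    <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %4level %t - %m%n"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
    <!-- Illustrative producer tuning; adjust for your environment -->
    <Property name="acks">1</Property>
    <Property name="retries">3</Property>
    <Property name="max.block.ms">2000</Property>
</Kafka>
```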

Program example

A simple example:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class Log2Kafka {
    private static final Logger LOGGER = LogManager.getLogger("APP");

    public static void main(String[] args) {
        // Goes to both the console and the topic_request_log topic
        LOGGER.info("测试日志");
    }
}

The stored result in Kafka

The DumpLogSegments tool can be used to inspect the stored message contents:

>kafka-run-class kafka.tools.DumpLogSegments --files /usr/local/var/lib/kafka-logs/topic_request_log-0/00000000000000000000.log --print-data-log
Dumping /usr/local/var/lib/kafka-logs/topic_request_log-0/00000000000000000000.log
Starting offset: 0
baseOffset: 0 lastOffset: 0 count: 1 baseSequence: -1 lastSequence: -1 producerId: -1 producerEpoch: -1 partitionLeaderEpoch: 0 isTransactional: false isControl: false position: 0 CreateTime: 1603250280109 size: 117 magic: 2 compresscodec: NONE crc: 2500714151 isvalid: true
| offset: 0 CreateTime: 1603250280109 keysize: -1 valuesize: 49 sequence: -1 headerKeys: [] payload: 2020-10-21 11:17:59.640 INFO main - 测试日志
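In the dump above, the `payload:` field on the data line is the log line exactly as the PatternLayout rendered it. As a small illustration of that line format (this helper is my own, not part of Kafka's tooling), the payload can be pulled out of a `--print-data-log` data line like so:

```java
public class DumpLineParser {
    // Extract the rendered log message from a DumpLogSegments "--print-data-log" data line.
    // Everything after the "payload: " marker is the message as the layout formatted it.
    public static String extractPayload(String line) {
        int i = line.indexOf("payload: ");
        return i < 0 ? null : line.substring(i + "payload: ".length());
    }

    public static void main(String[] args) {
        String line = "| offset: 0 CreateTime: 1603250280109 keysize: -1 valuesize: 49 "
                + "sequence: -1 headerKeys: [] payload: 2020-10-21 11:17:59.640 INFO main - hello";
        System.out.println(extractPayload(line));
        // prints: 2020-10-21 11:17:59.640 INFO main - hello
    }
}
```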