
elasticsearch - Merging log lines between dates/times with Filebeat


I am trying to push logs to Elasticsearch using Filebeat (no Logstash).

I want to send the following log message, but it gets split up, with each line becoming a separate event:

20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed. History: [requestHost=123-some.org.com, time=Fri Oct 14 17:49:05 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
[requestHost=123-some.org.com, time=Fri Oct 14 17:49:09 GMT-07:00 2016, exception=java.net.SocketTimeoutException]
Tried 3 times
at java.lang.Thread.run(Thread.java:745)
20161014 17:49:09.169 [ERROR] [Thread-3022]

I want to merge all the lines between the two dates (the first line and the last line).

Here is my filebeat.yml snippet:
paths:
  - /test.log
multiline.pattern: '^\[0-9]{8}'
multiline.negate: true
multiline.match: after

I need to know the correct regex.
I am trying to solve this without using Logstash.

Best Answer

Using the following Filebeat configuration with the provided log sample produces two events, where each message begins with the date.

I tested this by running ./filebeat -c filebeat.yml -e -v -d "*" with the config below. I also tested the pattern on the Go Playground.
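
Since the pattern was checked on the Go Playground, here is a minimal sketch of that kind of check using Go's standard regexp package; the program and its trimmed sample lines are only illustrative and are not part of the original answer.

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// An event starts with an eight-digit date prefix, e.g. "20161014".
	startOfEvent := regexp.MustCompile(`^[0-9]{8}`)

	// Sample lines trimmed from the log in the question.
	lines := []string{
		"20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed.",
		"[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]",
		"Tried 3 times",
		"at java.lang.Thread.run(Thread.java:745)",
		"20161014 17:49:09.169 [ERROR] [Thread-3022]",
	}

	// With negate: true and match: after, every line that does NOT match the
	// pattern is appended to the previous matching line, so only the two
	// date-prefixed lines should print "true" here.
	for _, line := range lines {
		fmt.Printf("%v\t%s\n", startOfEvent.MatchString(line), line)
	}
}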

filebeat.yml:

filebeat:
  prospectors:
    - paths: ["input.txt"]
      multiline:
        pattern: '^[0-9]{8}'
        negate: true
        match: after

output:
  console:
    pretty: false

Output:
{
  "@timestamp": "2016-10-17T14:13:31.292Z",
  "beat": {
    "hostname": "host.example.com",
    "name": "host.example.com"
  },
  "input_type": "log",
  "message": "20161014 17:49:09.169 [ERROR] [Thread-2974] some.java.class.:70 - some.java.Exception: write failed. History: [requestHost=123-some.org.com, time=Fri Oct 14 17:49:05 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n[requestHost=123-some.org.com, time=Fri Oct 14 17:49:07 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n[requestHost=123-some.org.com, time=Fri Oct 14 17:49:09 GMT-07:00 2016, exception=java.net.SocketTimeoutException]\n Tried 3 times\n at java.lang.Thread.run(Thread.java:745)",
  "offset": 519,
  "source": "input.txt",
  "type": "log"
}
{
  "@timestamp": "2016-10-17T14:17:21.686Z",
  "beat": {
    "hostname": "host.example.com",
    "name": "host.example.com"
  },
  "input_type": "log",
  "message": "20161014 17:49:09.169 [ERROR] [Thread-3022]",
  "offset": 563,
  "source": "input.txt",
  "type": "log"
}
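
As a side note, the prospectors layout above corresponds to older Filebeat releases. On Filebeat 6.x and later the same multiline settings live under filebeat.inputs; a minimal sketch of an equivalent configuration (shown only for orientation, not part of the original answer):

filebeat.inputs:
  - type: log
    paths: ["input.txt"]
    multiline.pattern: '^[0-9]{8}'
    multiline.negate: true
    multiline.match: after

output.console:
  pretty: false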

Regarding elasticsearch - merging log lines between dates/times with Filebeat, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40055396/
