
elasticsearch - Multiple patterns to parse in Logstash

Reposted · Author: 行者123 · Updated: 2023-12-02 23:26:07

My log file contains lines in several different formats, including JSON-formatted log lines. I want to match multiple patterns with the grok filter, but it does not seem to work.

filter {
  grok {
    break_on_match => false
    match => [
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}",
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{IP:Clicnet} - - %{GREEDYDATA:Line}"
    ]
  }
  json { source => "Line" }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}

Even lines whose JSON string is parsed successfully end up with the _grokparsefailure tag.

2017-01-27 11:54:48 INFO PropertiesReader:33 - {"timestamp":1485518878968,"h":"297268184dde","l":"INFO","cN":"org.com.logstash.demo","mN":"loadProperties","m":"load property file from /var/tmp/conf"}
{
"message" => "2017-01-27 11:54:48 INFO PropertiesReader:33 - {\"timestamp\":1485518878968,\"h\":\"297268184dde\", \"l\":\"INFO\", \"cN\":\"org.com.logstash.demo\", \"mN\":\"loadProperties\", \"m\":\"load property file from /var/tmp/conf\"}",
"@version" => "1",
"@timestamp" => "2017-03-20T17:19:16.316Z",
"type" => "stdin",
"host" => "ef3b82",
"LogDate" => "2017-01-27 11:54:48",
"loglevel" => "INFO",
"threadName" => "PropertiesReader",
"tags" => [
[0] "_grokparsefailure"
],
"timestamp" => 1485518878968,
"h" => "297268184dde",
"l" => "INFO",
"cN" => "org.com.logstash.demo",
"mN" => "loadProperties",
"m" => "load property file from /var/tmp/conf"
}
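The tag on this line is consistent with `break_on_match => false`: every pattern in the list is attempted, and the second (IP-based) pattern cannot match a line whose payload after the thread ID is a JSON object. A minimal Python sketch, using plain regexes that only roughly approximate the grok expressions (the simplified patterns and variable names here are illustrative, not grok's actual definitions), shows which of the two patterns can match the JSON line:

```python
import re

# Rough plain-regex stand-ins for the two grok expressions in the question.
TS = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"          # ~ %{TIMESTAMP_ISO8601}
generic = re.compile(rf"{TS} \w+ [^:]+:\d+ - .*")     # pattern without %{IP}
with_ip = re.compile(rf"{TS} \w+ [^:]+:\d+ - \d+\.\d+\.\d+\.\d+ - - .*")

json_line = ('2017-01-27 11:54:48 INFO PropertiesReader:33 - '
             '{"timestamp":1485518878968,"m":"load property file"}')

print(bool(generic.match(json_line)))  # True  - the generic pattern matches
print(bool(with_ip.match(json_line)))  # False - an IP cannot follow the dash here
```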

The second line, which contains no JSON, fails completely:

2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] "OPTIONS //127.0.0.0:8080/ HTTP/1.1" 404 237 1
Error parsing json {:source=>"Line", :raw=>["10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237  1", "[20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237  1"], :exception=>java.lang.ClassCastException: org.jruby.RubyArray cannot be cast to org.jruby.RubyIO, :level=>:warn}
{
"message" => "2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237 1",
"@version" => "1",
"@timestamp" => "2017-03-20T17:19:51.175Z",
"type" => "stdin",
"host" => "ef3b82",
"LogDate" => [
[0] "2017-01-20 15:46:16",
[1] "2017-01-20 15:46:16"
],
"loglevel" => [
[0] "INFO",
[1] "INFO"
],
"threadName" => [
[0] " RequestLog",
[1] " RequestLog"
],
"Clicnet" => "10.252.134.34",
"tags" => [
[0] "_jsonparsefailure"
]
}

Best Answer

After spending five hours on it, I managed to find a solution. The configuration below parses both log lines successfully: the more specific pattern (the one containing %{IP:Client}) comes first, and break_on_match is left at its default, so the first matching pattern wins.

/opt/logstash/bin/logstash -e '
filter {
  grok {
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadName} - %{IP:Client} - - %{GREEDYDATA:LogMessage}",
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"
      ]
    }
  }
  json { source => "Line" }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}'
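The first-match-wins behavior this config relies on can be sketched in Python. This is a simplified illustration, not grok itself: the regexes only approximate the grok expressions, and the `parse` helper and field names are hypothetical. It tries the more specific IP pattern first and falls back to the generic one, then emulates the json filter by decoding a `Line` field that looks like JSON:

```python
import json
import re

TS = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"  # ~ %{TIMESTAMP_ISO8601}

# Most specific pattern first, as in the accepted answer.
patterns = [
    re.compile(rf"(?P<LogDate>{TS}) (?P<loglevel>\w+) (?P<threadName>[^:]+):"
               rf"(?P<ThreadID>\d+) - (?P<Client>\d+\.\d+\.\d+\.\d+) - - "
               rf"(?P<LogMessage>.*)"),
    re.compile(rf"(?P<LogDate>{TS}) (?P<loglevel>\w+) (?P<threadName>[^:]+):"
               rf"(?P<ThreadID>\d+) - (?P<Line>.*)"),
]

def parse(line):
    for pat in patterns:          # first match wins, like grok's default break_on_match
        m = pat.match(line)
        if m:
            fields = m.groupdict()
            # Emulate the json filter: decode Line only when it holds a JSON object.
            line_field = fields.get("Line")
            if line_field and line_field.startswith("{"):
                fields.update(json.loads(line_field))
                fields.pop("Line")
            return fields
    return None

json_line = ('2017-01-27 11:54:48 INFO PropertiesReader:33 - '
             '{"timestamp":1485518878968,"h":"297268184dde",'
             '"m":"load property file from /var/tmp/conf"}')
access_line = ('2017-01-20 15:46:16 INFO RequestLog:60 - 10.252.134.34 - - '
               '[20/Jan/2017:15:46:16 +0000] "OPTIONS //127.0.0.0:8080/ HTTP/1.1" '
               '404 237 1')

print(parse(json_line)["m"])         # load property file from /var/tmp/conf
print(parse(access_line)["Client"])  # 10.252.134.34
```

Because the IP pattern is tried first, the access-log line never reaches the generic pattern, so no `Line` field is produced for it and the JSON step is skipped; the JSON line fails the IP pattern, falls through to the generic one, and its `Line` payload is decoded.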

Regarding elasticsearch - Multiple patterns to parse in Logstash, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42909708/
