
elasticsearch - Logstash reports [0] _grokparsefailure when parsing logs


I have logs in the format below, and I have assigned logstash variables to the pattern as shown. I believe I have mapped each of these elements correctly using the predefined Grok tags that ship with it. However, when I run logstash it reports [0] "_grokparsefailure", indicating that it could not parse the request. I can't see what is wrong with my conf. Does anyone know what might be causing this? I'm very new to logstash. Thanks in advance.

1383834858 0 71.172.136.12 20097903 198.2.20.171 80 TCP_HIT/200 252 HEAD http://podcasts.someserver.com/80830A/podcasts.someserver.com/nyv/voice-film-club/2013/11/the-sexy-god-thor.mp3 - 0 355 "-" "Podcasts/2.0" 33546 "-"

%{BASE10NUM:timestamp} = 1383834858
%{BASE10NUM:time_taken} = 0
%{IP:clientip} = 71.172.136.12
%{BASE10NUM:filesize} = 20097903
%{IP:serverip} = 198.2.20.171
%{BASE10NUM:port} = 80
%{WORD:status_code} = TCP_HIT/200
%{BASE10NUM:sc_bytes} = 252
%{WORD:method} = HEAD
%{URI:cs_uri} = http://podcasts.someserver.com/80830A/podcasts.someserver.com/nyv/voice-film-club/2013/11/the-sexy-god-thor.mp3
%{NOTSPACE:ignore2} = -
%{BASE10NUM:rs_duration} = 0
%{BASE10NUM:rs_bytes} = 355
%{QS:c_referrer} = "-"
%{QS:user_agent} = "Podcasts/2.0"
%{BASE10NUM:customerid} = 33546
%{QS:ignore} = "-"

My logstash.conf file looks like this:

input {
  # wpa_media logs from the CDN (see puppet module)
  redis {
    type => "wpc_media"
    host => "devredis1.somedomain.com"
    # these settings should match the output of the agent
    data_type => "list"
    key => "wpc_media"
    codec => json
    debug => true
  }
}


filter {
  grok {
    type => "wpc_media"
    pattern => [ "%{BASE10NUM:timestamp} %{BASE10NUM:time_taken} %{IP:clientip} %{BASE10NUM:filesize} %{IP:serverip} %{BASE10NUM:port} %{WORD:status_code} %{BASE10NUM:sc_bytes} %{WORD:method} %{URI:cs_uri} %{NOTSPACE:ignore2} %{BASE10NUM:rs_duration} %{BASE10NUM:rs_bytes} %{QS:c_referrer} %{QS:user_agent} %{BASE10NUM:customerid} %{QS:ignore} " ]
  }

  mutate {
    # just something to cover up the error, not really fixing it
    #remove_tag => [ "_grokparsefailure" ]
    remove => [ "customer_id", "ignore", "c_referrer", "time_taken" ]
  }
}

output {
  stdout { debug => true debug_format => "ruby" }
}

Best Answer

For your own reference, the GrokDebugger site is very handy for this sort of problem.

For the specific log event you provided, %{WORD} does not match TCP_HIT/200.
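
For reference, these are the relevant definitions from the grok-patterns file that ships with logstash (the explanatory comments here are mine):

WORD \b\w+\b              # word characters only, so the match stops before the "/" in TCP_HIT/200
DATA .*?                  # lazy "match anything", bounded by whatever comes next in your pattern
INT (?:[+-]?(?:[0-9]+))   # an optional sign followed by digits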

A quick fix is to match %{DATA:status_code} instead (you can see the built-in patterns on GitHub). You could of course build a more targeted match, but that is hard to do without seeing the range of possible inputs.
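
As a minimal sketch, here is your grok filter with only that one token swapped; everything else is kept from the conf above, except that the trailing space after the last %{QS:ignore} is dropped, since the sample line does not end with a space:

filter {
  grok {
    type => "wpc_media"
    # only change: %{WORD:status_code} -> %{DATA:status_code}
    pattern => [ "%{BASE10NUM:timestamp} %{BASE10NUM:time_taken} %{IP:clientip} %{BASE10NUM:filesize} %{IP:serverip} %{BASE10NUM:port} %{DATA:status_code} %{BASE10NUM:sc_bytes} %{WORD:method} %{URI:cs_uri} %{NOTSPACE:ignore2} %{BASE10NUM:rs_duration} %{BASE10NUM:rs_bytes} %{QS:c_referrer} %{QS:user_agent} %{BASE10NUM:customerid} %{QS:ignore}" ]
  }
}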

If you always expect word/number, something like (?<status_code>%{WORD}/%{INT}) would work.
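
Dropped into the pattern above, that named capture simply takes the place of the status_code token:

%{BASE10NUM:port} (?<status_code>%{WORD}/%{INT}) %{BASE10NUM:sc_bytes}

If you would rather have the cache result and the HTTP status as separate fields, two captures such as (?<cache_result>%{WORD})/(?<http_status>%{INT}) work the same way; those field names are only illustrative.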

Regarding "elasticsearch - Logstash reports [0] _grokparsefailure when parsing logs", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/19848257/
