
elasticsearch - CSV filter in Logstash throws "_csvparsefailure" error

Reposted. Author: 行者123. Updated: 2023-12-02 22:50:22

I asked another question that may be related to this one:
JSON parser in logstash ignoring data?
I think it is related because in the previous question, Kibana was not showing any results from the JSON parser whose "PROGRAM" field was "mfd_status". Now I have changed my approach and removed the JSON parser, in case it was interfering with something, but I still do not see any logs with "mfd_status".

csv {
    columns => ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM", "PROGRAM", "PID", "MSG_FULL"]
    source => "message"
    separator => "	" # a literal tab character
}

In the previous question's filter I used two grok filters; I have now replaced them with this csv filter. I also have two date filters and a fingerprint filter, but I don't think they are relevant to this problem.

Sample log message:

"1452564798.76\t1452496397.00\t1\t4\tkernel\t\t[ 6252.000246] sonar: sonar_write(): waiting..."
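As a sanity check outside Logstash, a minimal Python sketch (hypothetical, not part of the original pipeline) shows that splitting this line on tabs yields exactly the seven declared columns, with an empty PID field:

```python
# Illustrative only: mimic the csv filter's tab splitting for the first line.
line = ("1452564798.76\t1452496397.00\t1\t4\tkernel\t\t"
        "[ 6252.000246] sonar: sonar_write(): waiting...")
columns = ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM",
           "PROGRAM", "PID", "MSG_FULL"]

fields = line.split("\t")
record = dict(zip(columns, fields))
print(len(fields))        # 7 fields, one per declared column
print(record["PROGRAM"])  # kernel
print(record["PID"])      # empty string (Logstash reports it as nil)
```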



Output:

{
    "unixTime" => "1452564798.76",
    "unixTime2" => "1452496397.00",
    "FACILITY_NUM" => "1",
    "LEVEL_NUM" => "4",
    "PROGRAM" => "kernel",
    "PID" => nil,
    "MSG_FULL" => "[ 6252.000246] sonar: sonar_write(): waiting...",
    "TIMESTAMP" => "2016-01-12T02:13:18.760Z",
    "TIMESTAMP_second" => "2016-01-11T07:13:17.000Z"
}

"1452564804.57\t1452496403.00\t1\t7\tmfd_status\t\t00800F08CFB0\textra\t{\"date\":1452543203,\"host\":\"ABCD1234\",\"inet\":[\"169.254.42.207/16\",\"10.8.207.176/32\",\"172.22.42.207/16\"],\"fb0\":[\"U:1280x800p-60\",32]}"



Output:

"tags" => [
    [0] "_csvparsefailure"
]

After kernel / mfd_status appears in the log, there should be no further separators, and everything after it should end up in the MSG_FULL field.

So, to summarize: why does one of my log messages parse correctly while the other does not? Also, even when it fails to parse, shouldn't it still be sent to Elasticsearch, just with blank fields? Why doesn't that happen either?

Best Answer

You were almost there. You need to override two more parameters in your csv filter, and then both log lines will parse correctly.

The first is skip_empty_columns => true, because the second log line contains an empty field that needs to be ignored.

The second is quote_char => "'" (or any character other than the double quote "), because your JSON payload contains double quotes.

csv {
    columns => ["unixTime", "unixTime2", "FACILITY_NUM", "LEVEL_NUM", "PROGRAM", "PID", "MSG_FULL"]
    source => "message"
    separator => "	" # a literal tab character
    skip_empty_columns => true
    quote_char => "'"
}
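The effect of quote_char can be sketched outside Logstash with Python's csv module (an analogy only, since the filter itself is backed by Ruby's CSV library): with a quote character that never occurs in the data, the embedded double quotes in the JSON payload are read as ordinary characters, and the line splits into nine fields, the last two of which Logstash would name column8 and column9.

```python
import csv
import io

# The failing mfd_status line from the question (tab-separated).
line = ("1452564804.57\t1452496403.00\t1\t7\tmfd_status\t\t00800F08CFB0\textra\t"
        '{"date":1452543203,"host":"ABCD1234",'
        '"inet":["169.254.42.207/16","10.8.207.176/32","172.22.42.207/16"],'
        '"fb0":["U:1280x800p-60",32]}')

# quotechar="'" mirrors quote_char => "'" in the Logstash filter: the double
# quotes inside the JSON payload no longer count as CSV quoting characters.
row = next(csv.reader(io.StringIO(line), delimiter="\t", quotechar="'"))
print(len(row))  # 9: the 7 declared columns plus two extras
print(row[4])    # mfd_status
```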

With this configuration, your first log line parses as:
{
    "message" => "1452564798.76\\t1452496397.00\\t1\\t4\\tkernel\\t\\t[ 6252.000246] sonar: sonar_write(): waiting...",
    "@version" => "1",
    "@timestamp" => "2016-01-12T04:21:34.051Z",
    "host" => "iMac.local",
    "unixTime" => "1452564798.76",
    "unixTime2" => "1452496397.00",
    "FACILITY_NUM" => "1",
    "LEVEL_NUM" => "4",
    "PROGRAM" => "kernel",
    "MSG_FULL" => "[ 6252.000246] sonar: sonar_write(): waiting..."
}

And the second log line parses as:
{
    "message" => "1452564804.57\\t1452496403.00\\t1\\t7\\tmfd_status\\t\\t00800F08CFB0\\textra\\t{\\\"date\\\":1452543203,\\\"host\\\":\\\"ABCD1234\\\",\\\"inet\\\":[\\\"169.254.42.207/16\\\",\\\"10.8.207.176/32\\\",\\\"172.22.42.207/16\\\"],\\\"fb0\\\":[\\\"U:1280x800p-60\\\",32]}",
    "@version" => "1",
    "@timestamp" => "2016-01-12T04:21:07.974Z",
    "host" => "iMac.local",
    "unixTime" => "1452564804.57",
    "unixTime2" => "1452496403.00",
    "FACILITY_NUM" => "1",
    "LEVEL_NUM" => "7",
    "PROGRAM" => "mfd_status",
    "MSG_FULL" => "00800F08CFB0",
    "column8" => "extra",
    "column9" => "{\\\"date\\\":1452543203,\\\"host\\\":\\\"ABCD1234\\\",\\\"inet\\\":[\\\"169.254.42.207/16\\\",\\\"10.8.207.176/32\\\",\\\"172.22.42.207/16\\\"],\\\"fb0\\\":[\\\"U:1280x800p-60\\\",32]}"
}

Regarding this "CSV filter in Logstash throws \"_csvparsefailure\" error" question, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/34735059/
