
csv - Importing data from a CSV file into Logstash

Reposted · Author: 行者123 · Updated: 2023-11-29 02:54:40

I have a CSV file with the following header:

"PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"

I want to index the data into Elasticsearch using Logstash, but I can't work out how to write the filter for it.

filter {
  grok {
    match => { "message" => "%{IP:SourceIP}" }
  }
}

The filter above extracts the SourceIP field just fine, but how do I write a grok pattern that extracts all of the fields?
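For reference, the literal question can be answered by chaining one standard grok pattern per comma-separated field. This is only a sketch (the pattern names INT, MAC, DATE_EU, and IP come from grok's default pattern library; the csv filter shown in the answer below is usually the simpler choice for a file like this):

```
filter {
  grok {
    # One capture per CSV column, joined by literal commas.
    # DATE_EU matches dd/MM/yyyy dates such as 13/09/2015;
    # MAC matches hyphen-separated addresses such as 00-14-22-01-23-45.
    match => { "message" => "%{INT:PacketId},%{MAC:MACAddress},%{DATE_EU:Date},%{INT:PacketLength},%{IP:SourceIP},%{INT:SourcePort},%{IP:DestIP},%{INT:DestPort}" }
  }
}
```

A pattern like this is brittle if the column order ever changes, which is one reason the csv filter is preferred for delimited files.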

Best Answer

Given the following CSV file:

1,00-14-22-01-23-45,13/09/2015,32,128.248.1.43,9980,128.248.23.13,9880
1,01-74-02-84-13-98,14/09/2015,64,128.248.1.94,9280,128.248.13.84,9380

You can set up your Logstash configuration like this:

input {
  file {
    path => "/path/of/your/csv/test.csv"
    sincedb_path => "/path/of/your/csv/test.idx"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"]
  }
}
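Note that the csv filter leaves every column as a string. If you want fields like PacketLength and the ports to be indexed as numbers, the filter's convert option can cast them. A sketch, using the same column names as above:

```
filter {
  csv {
    separator => ","
    columns => ["PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"]
    # Cast numeric columns so Elasticsearch can do range queries on them.
    convert => {
      "PacketLength" => "integer"
      "SourcePort"   => "integer"
      "DestPort"     => "integer"
    }
  }
}
```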

output {
  stdout {
    codec => rubydebug
  }
}

You will get output like this:

{
    "message" => [
        [0] "1,00-14-22-01-23-45,13/09/2015,32,128.248.1.43,9980,128.248.23.13,9880"
    ],
    "@version" => "1",
    "@timestamp" => "2015-09-14T20:11:28.976Z",
    "host" => "MyHost.local",
    "path" => "/path/of/your/csv/test.csv",
    "PacketId" => "1",
    "MACAddress" => "00-14-22-01-23-45",
    "Date" => "13/09/2015",
    "PacketLength" => "32",
    "SourceIP" => "128.248.1.43",
    "SourcePort" => "9980",
    "DestIP" => "128.248.23.13",
    "DestPort" => "9880"
}
{
    "message" => [
        [0] "1,01-74-02-84-13-98,14/09/2015,64,128.248.1.94,9280,128.248.13.84,9380"
    ],
    "@version" => "1",
    "@timestamp" => "2015-09-14T20:11:28.978Z",
    "host" => "MyHost.local",
    "path" => "/path/of/your/csv/test.csv",
    "PacketId" => "1",
    "MACAddress" => "01-74-02-84-13-98",
    "Date" => "14/09/2015",
    "PacketLength" => "64",
    "SourceIP" => "128.248.1.94",
    "SourcePort" => "9280",
    "DestIP" => "128.248.13.84",
    "DestPort" => "9380"
}

Regards, Alain

Regarding "csv - Importing data from a CSV file into Logstash", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/32551318/
