
elasticsearch - geo_point in Elastic


I am trying to map longitude and latitude to a geo_point in Elastic.

This is an entry from my log file:

13-01-2017 ORDER COMPLETE: £22.00 Glasgow, 55.856299, -4.258845
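
For reference, the goal is for each indexed document to end up with a location field shaped the way geo_point expects: an object with lat and lon keys. A sketch of the intended document, using the values from the log line above and the field names from the config below:

{
  "order_amount": 22.00,
  "order_location": "Glasgow",
  "location": { "lat": 55.856299, "lon": -4.258845 }
}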

This is my conf file:
input {
  file {
    path => "/opt/logs/orders.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z ]+), (?<order_lat>[0-9.]+), (?<order_long>[-0-9.]+)" }
  }

  mutate {
    convert => { "order_amount" => "float" }
    convert => { "order_lat" => "float" }
    convert => { "order_long" => "float" }

    rename => {
      "order_long" => "[location][lon]"
      "order_lat" => "[location][lat]"
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "sales"
    document_type => "order"
  }
  stdout {}
}

I start Logstash with /bin/logstash -f orders.conf, which outputs:
"@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true,
"properties"=>{"ip"=>{"type"=>"ip"},
"location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"},
"longitude"=>{"type"=>"half_float"}}}}}}}}

See? It treats location as a geo_point. But GET sales/_mapping returns the following:
"location": {
"properties": {
"lat": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"lon": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
},

UPDATE

Each time I re-index, I stop Logstash and delete the .sincedb from /opt/logstash/data/plugins/inputs/file.... I have also been creating a brand-new log file and incrementing the index each time (I am currently on sales7).

The conf file:
input {
  file {
    path => "/opt/ag-created/logs/orders2.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z ]+), (?<order_lat>[0-9.]+), (?<order_long>[-0-9.]+)( - (?<order_failure_reason>[A-Za-z :]+))?" }
  }

  mutate {
    convert => { "order_amount" => "float" }
  }

  mutate {
    convert => { "order_lat" => "float" }
  }

  mutate {
    convert => { "order_long" => "float" }
  }

  mutate {
    rename => { "order_long" => "[location][lon]" }
  }

  mutate {
    rename => { "order_lat" => "[location][lat]" }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "sales7"
    document_type => "order"
    template_name => "myindex"
    template => "/tmp/templates/custom-orders2.json"
    template_overwrite => true
  }

  stdout {}
}

The JSON file:
{
  "template": "sales7",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "sales": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  },
  "aliases": {}
}
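
To confirm that Logstash actually installed this template, it can be fetched back under the template_name from the output section (a quick check against the standard index-template API; myindex is the name used in the config above):

GET _template/myindex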

Interestingly, when the geo_point mapping does not work (i.e. lat and long are both floats), my data is indexed (30 rows). But when location is correctly created as a geo_point, none of my rows are indexed.

Best Answer

There are two ways to do this. The first is to create a template for your mapping, so that the correct mapping is applied when your data is indexed. Elasticsearch does not know what your data types are, so you have to tell it, as shown below.

First, create a template.json file for your mapping structure:

{
  "template": "sales*",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "sales": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  },
  "aliases": {}
}
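
If you want to test the template without involving Logstash, you can also install it by hand through the index-template API and read it back (a minimal sketch, trimmed to the geo_point part; myindex matches the template_name used in the config below):

PUT _template/myindex
{
  "template": "sales*",
  "mappings": {
    "sales": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}

GET _template/myindex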

After that, change your Logstash configuration to apply this template to the index:
input {
  file {
    path => "/opt/logs/orders.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z ]+), (?<order_lat>[0-9.]+), (?<order_long>[-0-9.]+)" }
  }

  mutate {
    convert => { "order_amount" => "float" }
    convert => { "order_lat" => "float" }
    convert => { "order_long" => "float" }

    rename => {
      "order_long" => "[location][lon]"
      "order_lat" => "[location][lat]"
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "sales"
    document_type => "order"
    template_name => "myindex"
    template => "/etc/logstash/conf.d/template.json"
    template_overwrite => true
  }
  stdout {}
}
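
Once Logstash has created the index, the same call as in the question confirms the template took effect; location should now come back as a single geo_point rather than a text/keyword pair:

GET sales/_mapping

"location": {
  "type": "geo_point"
}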

The second option is the ingest node feature. I will update this answer for that option, but for now you can check my dockerized repository. In that example I used the ingest node feature instead of a template to parse the location data.
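
For a rough idea of how that works: an ingest pipeline performs the parsing inside Elasticsearch, so the client only ships the raw message field. Below is a minimal sketch, not the repository's exact pipeline; the pipeline name parse-orders is illustrative, the grok captures use the ingest processor's built-in float conversion, the dotted capture names are assumed to expand into a location object per ingest dot notation, and the index still needs location mapped as geo_point via a template as above:

PUT _ingest/pipeline/parse-orders
{
  "description": "Parse order log lines into fields, building a lat/lon location object",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DATE:date} ORDER %{WORD:order_status}: £%{NUMBER:order_amount:float} %{DATA:order_location}, %{NUMBER:location.lat:float}, %{NUMBER:location.lon:float}"
        ]
      }
    }
  ]
}

Documents are then indexed through the pipeline:

PUT sales/order/1?pipeline=parse-orders
{
  "message": "13-01-2017 ORDER COMPLETE: £22.00 Glasgow, 55.856299, -4.258845"
}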

Regarding elasticsearch - geo_point in Elastic, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45264235/
