
elasticsearch - "translation missing" error in Logstash logs

Reposted · Author: 行者123 · Updated: 2023-12-02 22:57:26

I am trying to use Logstash to load data from a CSV file into Elasticsearch. My Logstash config file looks like this:

input {
  file {
    path => "C:\Users\shreya\Data\RetailData.csv"
    start_position => "beginning"
    #sincedb_path => "C:\Users\shreya\null"
  }
}
filter {
  csv {
    separator => ","
    id => "Store_ID"
    columns => ["Store", "Date", "Temperature", "Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
  }
  mutate { convert => ["Store", "integer"] }
  mutate { convert => ["Date", "date"] }
  mutate { convert => ["Temperature", "float"] }
  mutate { convert => ["Fuel_Price", "float"] }
  mutate { convert => ["CPI", "float"] }
  mutate { convert => ["Unemployment", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "store"
    document_type => "store_retail"
  }
  stdout {}
  #stdout {
  #  codec => rubydebug
  #}
}

But I am getting an error and cannot figure out how to resolve it. I am new to Logstash. My error log looks like this:
[2017-12-02T15:56:38,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T15:56:38,165][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T15:56:38,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T15:56:39,117][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-02T15:56:42,965][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch action=>"index", hosts=>["localhost:9200"], index=>"store", document_type=>"store_retail", id=>"91a4406a13e9377abb312acf5f6be8e609a685f9c84a5906af957e956119798c">}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-02T15:56:43,854][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-02T15:56:43,932][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-02T15:56:43,933][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-02T15:56:43,964][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-02T15:56:44,011][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x3e4985f1 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: in value:0, @logger=#<LogStash::Logging::Logger:0x48eebcf8 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x113b0d16>>, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: duration_in_millis value:0, @id=\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x7c8acc8 @metric=#<LogStash::Instrument::Metric:0x3afcd9b5 @collector=#<LogStash::Instrument::Collector:0x73e63041 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x60e51f03 @store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x2209413b>, @fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=86 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, :events]>, @filter=<LogStash::Filters::Mutate convert=>{\"Date\"=>\"date\"}, id=>\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", 
:thread=>"#<Thread:0x3cc2461b@C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,042][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:186:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:388:in `register_plugin'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `register_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:801:in `maybe_setup_out_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:409:in `start_workers'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:333:in `run'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:293:in `block in start'"], :thread=>"#<Thread:0x3cc2461b@C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,058][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

Best answer

The problem comes from the conversion target in one of your mutate filters. From the documentation:

Valid conversion targets are: integer, float, string, and boolean.
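
For instance, applied to the columns in the config above (IsHoliday as a boolean is an assumption about that column's contents), these targets are accepted, while "date" is not on the list:

  mutate { convert => ["Store", "integer"] }      # valid target
  mutate { convert => ["IsHoliday", "boolean"] }  # valid target
  mutate { convert => ["Date", "date"] }          # invalid: "date" is not a conversion target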



So this part causes the crash:
mutate {convert => ["Date", "date"]}

If you want to convert a string into a date, you have to use the date filter instead.
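
As a minimal sketch, assuming the Date column holds values like 2010-02-05 (the yyyy-MM-dd pattern is an assumption; use whatever pattern matches the dates in RetailData.csv), the failing mutate line could be replaced with:

  filter {
    # ... csv and the other mutate filters stay as they are ...
    # removed: mutate { convert => ["Date", "date"] }
    date {
      match  => ["Date", "yyyy-MM-dd"]  # assumed pattern; adjust to the real format in the CSV
      target => "Date"                  # write the parsed date back into the Date field
    }
  }

If target is omitted, the date filter writes the parsed timestamp to @timestamp instead of back into the Date field.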

Regarding "elasticsearch - translation missing error in Logstash logs", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47606581/
