
elasticsearch - Logstash index error: [logstash-*] IndexNotFoundException[no such index]

Reposted · Author: 行者123 · Updated: 2023-12-03 22:40:51

I am new to ELK. I am using: elasticsearch-2.1.0, logstash-2.1.1, kibana-4.3.0-windows. I am trying to configure ELK to monitor my application logs. I have followed several tutorials and tried different Logstash configurations, but when I open Kibana and it sends its request to Elasticsearch, I get this error:

[logstash-*] IndexNotFoundException[no such index]

Here is my Logstash configuration:

input {
  file {
    path => "/var/logs/*.log"
    type => "syslog"
  }
}
filter {
  grok { match => [ "message", "%{COMBINEDAPACHELOG}" ] }
}
output {
  elasticsearch { hosts => localhost }
  stdout { codec => rubydebug }
}
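As an aside, `%{COMBINEDAPACHELOG}` expects Apache combined access-log lines; if the files under `/var/logs` actually contain syslog-format entries, grok will tag each event with `_grokparsefailure` rather than parse it. A rough sketch of the shape the pattern expects (the regex below is a simplified illustrative approximation, not grok's actual pattern definition):

```shell
# Simplified approximation of what grok's %{COMBINEDAPACHELOG} pattern expects.
APACHE_RE='^[0-9.]+ [^ ]+ [^ ]+ \[[^]]+\] "[A-Z]+ [^ ]+ [^"]+" [0-9]{3} [0-9-]+ "[^"]*" "[^"]*"$'

# An Apache combined access-log line matches:
echo '127.0.0.1 - - [29/Dec/2015:16:33:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "-" "Mozilla/5.0"' \
  | grep -Eq "$APACHE_RE" && echo "apache line: match"

# A syslog-style line does not, so grok would tag it with _grokparsefailure:
echo 'Dec 29 16:33:00 myhost sshd[123]: Accepted password for user' \
  | grep -Eq "$APACHE_RE" || echo "syslog line: no match"
```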

I tried deleting all the folders, reinstalling, and following this tutorial step by step: https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html

But no index of any kind gets created, and I get the index error from Kibana/Elasticsearch again.

Any help?

Regards.

Debug log:

C:\Users\xxx\Desktop\LOGS\logstash-2.1.1\bin>logstash -f first-pipeline.conf --debug
io/console not supported; tty will not be manipulated
Reading config file {:config_file=>"C:/Users/xxx/Desktop/LOGS/logstash-2.1.1/bin/first-pipeline.conf", :level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby
/1.9/gems/logstash-core-2.1.1-java/lib/logstash/agent.rb", :line=>"325", :method=>"local_config"}
Compiled pipeline code:
@inputs = []
@filters = []
@outputs = []
@periodic_flushers = []
@shutdown_flushers = []
@input_file_1 = plugin("input", "file", LogStash::Util.hash_merge_many({ "path" => ("/var/logs/logstash-tutorial-dataset") }, { "start_position" => ("beginning") }))
@inputs << @input_file_1
@filter_grok_2 = plugin("filter", "grok", LogStash::Util.hash_merge_many({ "match" => {("message") => ("%{COMBINEDAPACHELOG}")} }))
@filters << @filter_grok_2
@filter_grok_2_flush = lambda do |options, &block|
@logger.debug? && @logger.debug("Flushing", :plugin => @filter_grok_2)
events = @filter_grok_2.flush(options)
return if events.nil? || events.empty?
@logger.debug? && @logger.debug("Flushing", :plugin => @filter_grok_2, :events => events)
events = @filter_geoip_3.multi_filter(events)
events.each{|e| block.call(e)}
end
if @filter_grok_2.respond_to?(:flush)
@periodic_flushers << @filter_grok_2_flush if @filter_grok_2.periodic_flush
@shutdown_flushers << @filter_grok_2_flush
end
@filter_geoip_3 = plugin("filter", "geoip", LogStash::Util.hash_merge_many({ "source" => ("clientip") }))
@filters << @filter_geoip_3
@filter_geoip_3_flush = lambda do |options, &block|
@logger.debug? && @logger.debug("Flushing", :plugin => @filter_geoip_3)
events = @filter_geoip_3.flush(options)
return if events.nil? || events.empty?
@logger.debug? && @logger.debug("Flushing", :plugin => @filter_geoip_3, :events => events)
events.each{|e| block.call(e)}
end
if @filter_geoip_3.respond_to?(:flush)
@periodic_flushers << @filter_geoip_3_flush if @filter_geoip_3.periodic_flush
@shutdown_flushers << @filter_geoip_3_flush
end
@output_elasticsearch_4 = plugin("output", "elasticsearch", LogStash::Util.hash_merge_many({ "hosts" => [("localhost")] }))
@outputs << @output_elasticsearch_4
def filter_func(event)
events = [event]
@logger.debug? && @logger.debug("filter received", :event => event.to_hash)
events = @filter_grok_2.multi_filter(events)
events = @filter_geoip_3.multi_filter(events)
events
end
def output_func(event)
@logger.debug? && @logger.debug("output received", :event => event.to_hash)
@output_elasticsearch_4.handle(event)
end {:level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb", :line=>"38", :method=>"initialize"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/plugin.rb", :line=>"76", :method=>"lookup"}
[...]
Logstash startup completed
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x75375e77@stopping=#<Concurrent::AtomicBoolean:0x61b12c0>, @last_flush=2015-12-29 15:45:27 +0000, @flush_thread=#<Thread:0x7008acbf run>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x4985690f>, @submit_proc=#<Proc:0x3c9b0727@C:/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @flush_interval=1, @logger=#<Cabin::Channel:0x65f2b086 @subscriber_lock=#<Mutex:0x202361b4>, @data={}, @metrics=#<Cabin::Metrics:0x72e380e7 @channel=#<Cabin::Channel:0x65f2b086 ...>, @metrics={}, @metrics_lock=#<Mutex:0x3623f89e>>, @subscribers={12592=>#<Cabin::Outputs::IO:0x316290ee @lock=#<Mutex:0x3e191296>, @io=#<IO:fd 1>>}, @level=:debug>, @buffer=[], @operations_mutex=#<Mutex:0x601355b3>>", :interval=>1, :level=>:info, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsear
ch/buffer.rb", :line=>"90", :method=>"interval_flush"}
_globbed_files: /var/logs/logstash-tutorial-dataset: glob is: ["/var/logs/logstash-tutorial-dataset"] {:level=>:debug, :file=>"/Users/xxx/Desktop/LOGS/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/filewatch-0.6.7/lib/filewatch/watch.rb", :line=>"190", :method=>"_globbed_files"}

Elasticsearch log:

[2015-12-29 15:15:01,702][WARN ][bootstrap                ] unable to install syscall filter: syscall filtering not supported for OS: 'Windows 8.1'
[2015-12-29 15:15:01,879][INFO ][node ] [Blue Marvel] version[2.1.1], pid[10152], build[40e2c53/2015-12-15T13:05:55Z]
[2015-12-29 15:15:01,880][INFO ][node ] [Blue Marvel] initializing ...
[2015-12-29 15:15:01,923][INFO ][plugins ] [Blue Marvel] loaded [], sites []
[2015-12-29 15:15:01,941][INFO ][env ] [Blue Marvel] using [1] data paths, mounts [[OS (C:)]], net usable_space [242.8gb], net total_space [458.4gb], spins? [unknown], types [NTFS]
[2015-12-29 15:15:03,135][INFO ][node ] [Blue Marvel] initialized
[2015-12-29 15:15:03,135][INFO ][node ] [Blue Marvel] starting ...
[2015-12-29 15:15:03,249][INFO ][transport ] [Blue Marvel] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300}
[2015-12-29 15:15:03,255][INFO ][discovery ] [Blue Marvel] elasticsearch/3DpYKTroSke4ruP21QefmA
[2015-12-29 15:15:07,287][INFO ][cluster.service ] [Blue Marvel] new_master {Blue Marvel}{3DpYKTroSke4ruP21QefmA}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2015-12-29 15:15:07,377][INFO ][http ] [Blue Marvel] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200}
[2015-12-29 15:15:07,382][INFO ][node ] [Blue Marvel] started
[2015-12-29 15:15:07,399][INFO ][gateway ] [Blue Marvel] recovered [1] indices into cluster_state
[2015-12-29 16:33:00,715][INFO ][rest.suppressed ] /logstash-$DATE/_search Params: {index=logstash-$DATE, q=response=200}
[logstash-$DATE] IndexNotFoundException[no such index]
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:566)

Best Answer

From what I can see, you have not specified a port number in the elasticsearch output of your Logstash configuration. Elasticsearch listens on port 9200 by default (as most tutorials indicate). Try changing the output section of your Logstash configuration as follows and let me know if it works:

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Regarding "elasticsearch - Logstash index error: [logstash-*] IndexNotFoundException[no such index]", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34513842/
