
logstash - Filebeat cannot connect to Logstash


I am using two servers in the cloud. On one server (A) I have installed Filebeat, and on the second server (B) I have installed Logstash, Elasticsearch, and Kibana. I am having a problem sending logs from server A to Logstash on server B.

My Filebeat configuration is:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/vinit/demo/*.log
  fields:
    log_type: apache
  fields_under_root: true

#output.elasticsearch:
  #hosts: ["localhost:9200"]
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

output.logstash:
  hosts: ["XXX.XX.X.XXX:5044"]
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
  #ssl.certificate: "/etc/pki/client/cert.pem"
  #ssl.key: "/etc/pki/client/cert.key"
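Filebeat can also check this configuration and the Logstash output on its own; a minimal sketch, assuming a standard package install on server A, is:

# Validate filebeat.yml and then try to reach the configured output
sudo filebeat test config -c /etc/filebeat/filebeat.yml
sudo filebeat test output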

In Logstash I have enabled the modules system, filebeat, and logstash.
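For reference, these modules are usually enabled from the Filebeat side; a possible sketch, assuming the standard filebeat CLI, is:

# Enable the system and logstash modules and confirm what is active
sudo filebeat modules enable system logstash
sudo filebeat modules list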

The Logstash configuration is:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "^%{IP:CLIENT_IP} (?:-|%{USER:IDEN}) (?:-|%{USER:AUTH}) \[%{HTTPDATE:CREATED_ON}\] \"(?:%{WORD:REQUEST_METHOD} (?:/|%{NOTSPACE:REQUEST})(?: HTT$
    add_field => {
      "LOG_TYPES" => "apache-log"
    }
    overwrite => [ "message" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "apache-info-log"
  }
  stdout { codec => rubydebug }
}
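On server B it is worth confirming that this pipeline loads and that the beats input is actually listening; a minimal check, assuming a standard package install, might be:

# Test the pipeline configuration without starting the service
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit

# Confirm that Logstash is listening on the beats port
sudo ss -tlnp | grep 5044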

In Elasticsearch I have set:

network.host: localhost
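With network.host set to localhost, Elasticsearch only accepts connections from server B itself, which is enough for the Logstash output above. A quick local check is:

# Should return the cluster name and version as JSON
curl -XGET 'http://localhost:9200/?pretty'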

The error I am getting is as follows:

2019-01-18T15:05:47.738Z INFO crawler/crawler.go:72 Loading Inputs: 1
2019-01-18T15:05:47.739Z INFO log/input.go:138 Configured paths: [/home/vinit/demo/*.log]
2019-01-18T15:05:47.739Z INFO input/input.go:114 Starting input of type: log; ID: 10340820847180584185
2019-01-18T15:05:47.740Z INFO log/input.go:138 Configured paths: [/var/log/logstash/logstash-plain*.log]
2019-01-18T15:05:47.740Z INFO log/input.go:138 Configured paths: [/var/log/logstash/logstash-slowlog-plain*.log]
2019-01-18T15:05:47.742Z INFO log/harvester.go:254 Harvester started for file: /home/vinit/demo/info-log.log
2019-01-18T15:05:47.749Z INFO log/input.go:138 Configured paths: [/var/log/auth.log* /var/log/secure*]
2019-01-18T15:05:47.763Z INFO log/input.go:138 Configured paths: [/var/log/messages* /var/log/syslog*]
2019-01-18T15:05:47.763Z INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-01-18T15:05:47.763Z INFO cfgfile/reload.go:150 Config reloader started
2019-01-18T15:05:47.777Z INFO log/input.go:138 Configured paths: [/var/log/auth.log* /var/log/secure*]
2019-01-18T15:05:47.790Z INFO log/input.go:138 Configured paths: [/var/log/messages* /var/log/syslog*]
2019-01-18T15:05:47.790Z INFO input/input.go:114 Starting input of type: log; ID: 15514736912311113705
2019-01-18T15:05:47.790Z INFO input/input.go:114 Starting input of type: log; ID: 4004097261679848995
2019-01-18T15:05:47.791Z INFO log/input.go:138 Configured paths: [/var/log/logstash/logstash-plain*.log]
2019-01-18T15:05:47.791Z INFO log/input.go:138 Configured paths: [/var/log/logstash/logstash-slowlog-plain*.log]
2019-01-18T15:05:47.791Z INFO input/input.go:114 Starting input of type: log; ID: 2251543969305657601
2019-01-18T15:05:47.791Z INFO input/input.go:114 Starting input of type: log; ID: 9013300092125558684
2019-01-18T15:05:47.791Z INFO cfgfile/reload.go:205 Loading of config files completed.
2019-01-18T15:05:47.792Z INFO log/harvester.go:254 Harvester started for file: /var/log/secure-20181223
2019-01-18T15:05:47.794Z INFO log/harvester.go:254 Harvester started for file: /var/log/messages-20181223
2019-01-18T15:05:47.797Z INFO log/harvester.go:254 Harvester started for file: /var/log/secure-20181230
2019-01-18T15:05:47.800Z INFO log/harvester.go:254 Harvester started for file: /var/log/messages-20181230
2019-01-18T15:05:47.804Z INFO log/harvester.go:254 Harvester started for file: /var/log/secure-20190106
2019-01-18T15:05:47.804Z INFO log/harvester.go:254 Harvester started for file: /var/log/secure
2019-01-18T15:05:47.804Z INFO log/harvester.go:254 Harvester started for file: /var/log/secure-20190113
2019-01-18T15:05:47.816Z INFO log/harvester.go:254 Harvester started for file: /var/log/messages-20190106
2019-01-18T15:05:47.817Z INFO log/harvester.go:254 Harvester started for file: /var/log/messages
2019-01-18T15:05:47.818Z INFO log/harvester.go:254 Harvester started for file: /var/log/messages-20190113
2019-01-18T15:05:47.855Z INFO pipeline/output.go:95 Connecting to backoff(async(tcp://XXX.XX.X.XXX:5044))

2019-01-18T15:06:18.855Z ERROR pipeline/output.go:100 Failed to connect to backoff(async(tcp://XXX.XX.X.XXX:5044)): dial tcp XXX.XX.X.XXX:5044: i/o timeout
2019-01-18T15:06:18.855Z INFO pipeline/output.go:93 Attempting to reconnect to backoff(async(tcp://XXX.XX.X.XXX:5044)) with 1 reconnect attempt(s)

Does anyone know how to fix this and get it working properly?

Best Answer

A related question is Failed to connect to backoff(async(tcp://ip:5044)): dial tcp ip:5044: i/o timeout. The answer there suggests allowing outgoing TCP connections on port 5044 directly in your cloud provider's settings page, since they may be blocked by default.
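If the servers run on AWS EC2, that usually means opening the port in a security group; a hedged sketch with the AWS CLI (the security group ID and source CIDR below are placeholders, not values from the question) is:

# Allow inbound TCP 5044 only from the Filebeat host (sg-xxxxxxxx is hypothetical)
aws ec2 authorize-security-group-ingress \
    --group-id sg-xxxxxxxx \
    --protocol tcp \
    --port 5044 \
    --cidr <IP_address_of_machine_A>/32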

In addition to @Vinit Jordan's comment, where he whitelisted port 5044 on EC2 with these steps, I am proposing a possible solution for the general case.

Please check the default firewall on your Logstash server. You may have the ufw simple firewall that was pre-configured during an initial Nginx setup. I ran into this problem right after installing ELK on machine B and Filebeat on machine A.
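A quick way to confirm whether the port is being blocked is to probe it from machine A before changing anything; for example, assuming netcat is installed:

# A timeout here points at a firewall or security group, not at Logstash itself
nc -vz -w 5 <IP_address_of_machine_B> 5044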

I simply added a new firewall rule for the Filebeat server, and the error disappeared:

sudo ufw allow from <IP_address_of_machine_A> to any port 5044
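To confirm the rule is active, ufw can list the current rules:

sudo ufw status numbered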

After that, the Filebeat log on machine A showed:

"message":"Connection to backoff(async(tcp://<IP_address_of_machine_B>:5044)) established"

It is probably also reasonable to add a more general rule for servers you trust:

sudo ufw allow from <IP_ADDRESS>

Regarding "logstash - Filebeat cannot connect to Logstash", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54257179/
