Filebeat loads 0 inputs, and Filebeat itself writes no log output.
Filebeat should read some log input and send it to Logstash. I have some filters in logstash.conf, but I have removed them for now. Logstash then sends the events on to Elasticsearch, and finally to Kibana.
filebeat.config.modules:
  path: "${path.config}/modules.d/*.yml"
  reload.enabled: true
  reload.period: 10s
filebeat.inputs:
  enabled: true
  paths:
    - /var/log/TestLog/*.log
  type: log
filebeat.registry.path: /var/lib/filebeat/registry/filebeat
logging.files:
  name: filebeat.log
  path: /var/log/filebeat
logging.level: info
logging.selectors:
  - "*"
logging.to_files: true
monitoring.enabled: false
output.logstash:
  enabled: true
  hosts:
    - "192.168.80.20:5044"
setup.kibana: ~
setup.template.settings:
  index.number_of_shards: 1
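A likely cause of "Loading Inputs: 0" with a config shaped like the one above is that filebeat.inputs is written as a plain mapping rather than a YAML list: Filebeat expects each input to be a list item beginning with a dash, and loads zero inputs otherwise. A minimal sketch of the list form, assuming the single log input above:

```yaml
filebeat.inputs:
- type: log          # note the leading dash: each input is a list item
  enabled: true
  paths:
    - /var/log/TestLog/*.log
```

After correcting the structure and restarting the service, the journal should report a non-zero "Enabled inputs" count.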
Best Answer
The output of journalctl -fu filebeat is:
INFO instance/beat.go:422 filebeat start running.
INFO registrar/migrate.go:104 No registry home found. Create: /var/lib/filebeat/registry/filebeat/filebeat
INFO registrar/migrate.go:112 Initialize registry meta file
INFO registrar/registrar.go:108 No registry file found under: /var/lib/filebeat/registry/filebeat/filebeat/data.json. Creating a new registry file.
INFO registrar/registrar.go:145 Loading registrar data from /var/lib/filebeat/registry/filebeat/filebeat/data.json
INFO registrar/registrar.go:152 States Loaded from registrar: 0
WARN beater/filebeat.go:368 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
INFO crawler/crawler.go:72 Loading Inputs: 0
INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 0
INFO cfgfile/reload.go:171 Config reloader started
INFO log/input.go:148 Configured paths: [/var/log/elasticsearch/*_access.log /var/log/elasticsearch/*_audit.log /var/log/elasticsearch/*_audit.json]
INFO log/input.go:148 Configured paths: [/var/log/elasticsearch/*_deprecation.log /var/log/elasticsearch/*_deprecation.json]
INFO log/input.go:148 Configured paths: [/var/log/elasticsearch/gc.log.[0-9]* /var/log/elasticsearch/gc.log]
INFO log/input.go:148 Configured paths: [/var/log/elasticsearch/*.log /var/log/elasticsearch/*_server.json]
INFO log/input.go:148 Configured paths: [/var/log/elasticsearch/*_index_search_slowlog.log /var/log/elasticsearch/*_index_indexing_slowlog.log /var/log/elasticsearch/*_index_search_slowlog.json /var/log/elasticsearch/*_index_indexing_slowlog.json]
INFO input/input.go:114 Starting input of type: log; ID: 10720371839583549447
INFO input/input.go:114 Starting input of type: log; ID: 8161597721645621668
INFO input/input.go:114 Starting input of type: log; ID: 15537576637552474368
INFO input/input.go:114 Starting input of type: log; ID: 14070679154152675563
INFO input/input.go:114 Starting input of type: log; ID: 7953850694515857477
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_audit.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_audit.json
INFO log/input.go:148 Configured paths: [/var/log/logstash/logstash-plain*.log]
INFO log/input.go:148 Configured paths: [/var/log/logstash/logstash-slowlog-plain*.log]
INFO input/input.go:114 Starting input of type: log; ID: 17306378383715639109
INFO input/input.go:114 Starting input of type: log; ID: 14725834876846155099
INFO log/harvester.go:253 Harvester started for file: /var/log/logstash/logstash-plain.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_index_search_slowlog.json
INFO log/harvester.go:253 Harvester started for file: /var/log/logstash/logstash-slowlog-plain.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_deprecation.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_deprecation.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.27
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_server.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_deprecation.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.31
INFO log/input.go:148 Configured paths: [/var/log/auth.log* /var/log/secure*]
INFO log/input.go:148 Configured paths: [/var/log/messages* /var/log/syslog*]
INFO input/input.go:114 Starting input of type: log; ID: 14797590234914819083
INFO input/input.go:114 Starting input of type: log; ID: 16974178264304869863
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_deprecation.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_server.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_index_indexing_slowlog.json
INFO log/harvester.go:253 Harvester started for file: /var/log/secure-20191201
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch.log
INFO log/harvester.go:253 Harvester started for file: /var/log/messages-20191117
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_index_indexing_slowlog.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.02
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log
INFO log/harvester.go:253 Harvester started for file: /var/log/messages-20191124
INFO log/harvester.go:253 Harvester started for file: /var/log/secure
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_index_search_slowlog.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.03
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.08
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.18
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.11
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.26
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.06
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.12
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.20
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.29
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.21
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.07
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.13
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.19
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.28
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.22
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.24
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.23
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.05
INFO log/harvester.go:253 Harvester started for file: /var/log/secure-20191110
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.09
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.10
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_index_search_slowlog.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.14
INFO log/harvester.go:253 Harvester started for file: /var/log/secure-20191117
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.16
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.30
INFO log/harvester.go:253 Harvester started for file: /var/log/secure-20191124
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.01
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.04
INFO log/harvester.go:253 Harvester started for file: /var/log/messages-20191201
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.15
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.17
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.00
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/gc.log.25
INFO log/harvester.go:253 Harvester started for file: /var/log/messages
INFO log/harvester.go:253 Harvester started for file: /var/log/messages-20191110
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_index_indexing_slowlog.log
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/elasticsearch_index_search_slowlog.json
INFO log/harvester.go:253 Harvester started for file: /var/log/elasticsearch/my-application_index_indexing_slowlog.log
INFO pipeline/output.go:95 Connecting to backoff(async(tcp://192.168.80.20:5044))
INFO pipeline/output.go:105 Connection to backoff(async(tcp://192.168.80.20:5044)) established
INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":550,"time":{"ms":560}},"total":{"ticks":4600,"time":{"ms":4612},"value":4600},"user":{"ticks":4050,"time":{"ms":4052}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":70},"info":{"ephemeral_id":"e901ac2b-21fa-47b1-a84d-3ddc10b068fd","uptime":{"ms":30285}},"memstats":{"gc_next":57786240,"memory_alloc":50264424,"memory_total":511186464,"rss":92864512},"runtime":{"goroutines":387}},"filebeat":{"events":{"active":4139,"added":34923,"done":30784},"harvester":{"open_files":64,"running":64,"started":64}},"libbeat":{"config":{"module":{"running":0},"reloads":2},"output":{"events":{"acked":30720,"active":4096,"batches":17,"total":34816},"read":{"bytes":96},"type":"logstash","write":{"bytes":5233807}},"pipeline":{"clients":9,"events":{"active":4119,"filtered":64,"published":34836,"retry":2048,"total":34903},"queue":{"acked":30720}}},"registrar":{"states":{"current":63,"update":30784},"writes":{"success":48,"total":48}},"system":{"cpu":{"cores":2},"load":{"1":3.55,"15":3.97,"5":4.77,"norm":{"1":1.775,"15":1.985,"5":2.385}}}}}}
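The journal above confirms the symptom: crawler/crawler.go reports "Loading Inputs: 0", and every harvester that does start belongs to the elasticsearch, logstash, and system modules loaded from modules.d, not to the filebeat.inputs section. Filebeat's built-in test subcommands can help confirm whether the YAML parses and whether the Logstash output is reachable (a sketch, assuming the default config path /etc/filebeat/filebeat.yml):

```shell
# Check that filebeat.yml is syntactically valid and parses as expected
filebeat test config -c /etc/filebeat/filebeat.yml

# Verify connectivity to the configured Logstash output (192.168.80.20:5044)
filebeat test output -c /etc/filebeat/filebeat.yml
```

Note that "test config" only validates syntax; it will not flag an inputs section that parses as a mapping instead of a list, so the input count in the journal remains the decisive check.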
Regarding "elasticsearch - Filebeat unable to load inputs", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/59154660/