
spring - FluentD unable to write logs to Elasticsearch

Reposted · Author: 行者123 · Updated: 2023-12-03 01:12:10

Using:

  • Fluentd 1.11.2
  • fluent-plugin-elasticsearch 4.1.3
  • Elasticsearch 7.5.1
  • Spring Boot 2.3.3

  • Running on OpenShift (Kubernetes v1.17.1+20ba474).
    Fluentd and Elasticsearch each run in their own pod.
    Fluentd configuration file:
    <source>
      @type forward
      port 24224
      bind 0.0.0.0
    </source>
    <filter *.**>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type none
      </parse>
    </filter>
    <match *.**>
      @type copy
      <store>
        @type elasticsearch
        host elasticdb
        port 9200
        logstash_format true
        logstash_prefix applogs
        logstash_dateformat %Y%m%d
        include_tag_key true
        type_name app_log
        tag_key @log_name
        flush_interval 1s
        user elastic
        password changeme
      </store>
      <store>
        @type stdout
      </store>
    </match>
    From my local Spring Boot service, I am sending some dummy data to fluentd:
    // Local port 24224 is being forwarded to remote 24224 via oc port-forward command
    private static FluentLogger LOG = FluentLogger.getLogger("app", "127.0.0.1", 24224);

    Map<String, Object> data = new HashMap<String, Object>();
    data.put("from", "userA");
    data.put("to", "userB");

    LOG.log("app", data);
    It sends this JSON payload:
    {"from":"userA","to":"userB"}
    Apparently it only works about one time in ten. Or it seems to work two or three times and then breaks until I change the index. Honestly, there is no clear pattern to the behavior.
    When it does not work (which is most of the time), these are the logs in the fluentd pod:
    2020-09-18 17:33:08.000000000 +0000 app.appaa: {"from":"userA","to":"userB"}
    2020-09-18 17:33:37 +0000 [warn]: #0 dump an error event: error_class=ArgumentError error="log does not exist" location=nil tag="fluent.warn" time=2020-09-18 17:33:37.328180192 +0000 record={"error"=>"#<ArgumentError: log does not exist>", "location"=>nil, "tag"=>"app.appaa", "time"=>1600450388, "record"=>{"from"=>"userA", "to"=>"userB"}, "message"=>"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userAa\", \"to\"=>\"userBb\"}"}
    2020-09-18 17:33:37.328180192 +0000 fluent.warn: {"error":"#<ArgumentError: log does not exist>","location":null,"tag":"app.appaa","time":1600450388,"record":{"from":"userA","to":"userB"},"message":"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}"}
    warning: 299 Elasticsearch-7.5.1-3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96 "[types removal] Specifying types in bulk requests is deprecated."
    Even though the Elasticsearch pod shows nothing (a logging-level issue, I guess), if I query Elasticsearch I see:
    {
    "_index": "applogs-20200918",
    "_type": "_doc",
    "_id": "F0M2onQBB89nIri4Cb1Z",
    "_score": 1.0,
    "_source": {
    "error": "#<ArgumentError: log does not exist>",
    "location": null,
    "tag": "app.app",
    "time": 1600449251,
    "record": {
    "from": "userA",
    "to": "userB"
    },
    "message": "dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.app\" time=1600449251 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}",
    "@timestamp": "2020-09-18T17:14:39.775332214+00:00",
    "@log_name": "fluent.warn"
    }
    }
    So it looks like the error comes from:

    "Elastic: Argument Error: Log does not exist"


    Has anyone run into this error before?

    Best Answer

    The parser configuration in your filter, i.e.

    <filter *.**>
      @type parser
      key_name log # << Look for key `log` in event
      # ...
    </filter>
    is looking for the key `log`, which does not exist in this event:
    {"from":"userA","to":"userB"}
    You need to send something like this instead:
    {"log":"... your log here..."}
    You may need to escape the inner double quotes (`\"`) if the value itself contains quotes.
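For example, if the value of `log` is itself a JSON string, the inner quotes would need escaping. This payload is hypothetical, built from the question's dummy data:

```json
{"log":"{\"from\":\"userA\",\"to\":\"userB\"}"}
```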
    Relevant documentation: https://docs.fluentd.org/filter/parser#key_name
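On the sender side, the fix could look like the sketch below: keep the fluentd filter as-is and put the original payload under a `log` key so the parser filter (`key_name log`) finds it. The class and method names here are illustrative, not from the question, and the assumption is that the `log` value is a raw JSON string.

```java
import java.util.HashMap;
import java.util.Map;

public class LogEventSketch {

    // Build an event record containing the "log" key that the
    // fluentd parser filter expects.
    static Map<String, Object> buildEvent(String from, String to) {
        Map<String, Object> data = new HashMap<>();
        // The value of "log" is the raw JSON string the filter will parse.
        data.put("log", "{\"from\":\"" + from + "\",\"to\":\"" + to + "\"}");
        return data;
    }

    public static void main(String[] args) {
        Map<String, Object> event = buildEvent("userA", "userB");
        System.out.println(event.get("log"));
        // With the FluentLogger from the question, this event would
        // then be sent via: LOG.log("app", event);
    }
}
```

Note that with `@type none` in the `<parse>` section, the string is not parsed into fields; fluentd's `none` parser keeps the whole value in a single field (by default `message`), while `reserve_data true` keeps the original keys alongside it.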

    Regarding "spring - FluentD unable to write logs to Elasticsearch", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/63960681/
