
logging - Parsing inner JSON in FluentD

Reposted · Author: 行者123 · Updated: 2023-12-05 00:52:51

I have some JSON being emitted from a Docker container via the FluentD logging driver:

'{"timeMillis":1485917543709,"thread":"main","level":"INFO","loggerName":"com.imageintelligence.ava.api.Boot","message":"{\"dom\":\"DOM\"}","loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5}'

Notice how the message field is string-encoded JSON? When the data is captured by FluentD, it ends up looking as expected:
2017-02-01 06:29:15 +0000 docker.6faad650faa6: {"log":"{\"timeMillis\":1485917543709,\"thread\":\"main\",\"level\":\"INFO\",\"loggerName\":\"com.imageintelligence.ava.api.Boot\",\"message\":\"{\\\"dom\\\":\\\"DOM\\\"}\",\"loggerFqcn\":\"org.apache.logging.slf4j.Log4jLogger\",\"threadId\":1,\"threadPriority\":5}\r","com.amazonaws.ecs.cluster":"dombou","container_id":"6faad650faa6012af4f32df79901b42488543a5e6e53517fe3579b01ab2b6862","container_name":"/upbeat_booth","source":"stdout"}

I use a filter like this to parse the JSON:
<filter docker.**>
  @type parser
  format json
  key_name log
  reserve_data true
  hash_value_field log
</filter>

And I end up with semi-sanitized JSON:
2017-02-01 06:32:10 +0000 docker.68c794f7f694: {"source":"stdout","log":{"timeMillis":1485917543709,"thread":"main","level":"INFO","loggerName":"com.imageintelligence.ava.api.Boot","message":"{\"dom\":\"DOM\"}","loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5},"com.amazonaws.ecs.cluster":"dombou","container_id":"68c794f7f6948d4261b9497947834651abbf766e9aa51a76f39d6895b7a9ac18","container_name":"/sad_hamilton"}

The problem is that the message field is still string-escaped JSON. Any advice on how to parse that inner JSON field? How do I stack filters?

Best answer

You can try stacking filters in sequence:

<filter docker.**>
  @type parser
  key_name log
  format json
  reserve_data true
</filter>

<filter docker.*.embeded_json.**>
  @type parser
  key_name message
  format json
  reserve_data true
</filter>
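
As an aside (not part of the original answer): the second filter above only fires for events whose tag matches docker.*.embeded_json.**. If you do not re-tag events, a minimal sketch under that assumption is to run a second parser filter on the same docker.** match. Because the first pass uses reserve_data true without hash_value_field, the parsed fields, including message, are merged into the top level of the record, where the second pass can pick them up:

<filter docker.**>
  @type parser
  key_name log
  format json
  reserve_data true
</filter>

# Second pass over the same tag: parse the now top-level "message" field,
# which still contains string-encoded JSON such as {"dom":"DOM"}
<filter docker.**>
  @type parser
  key_name message
  format json
  reserve_data true
</filter>

With reserve_data true on both passes, the remaining fields (container_id, source, and so on) are kept alongside the parsed ones, and message should come through as a parsed object rather than an escaped string.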

Regarding "logging - Parsing inner JSON in FluentD", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41991128/
