
Logstash and JSON-style logs from log4net

Reposted · Author: 行者123 · Updated: 2023-12-03 06:29:12

Given the following configuration:

input {
  udp {
    type => "genericJson2"
    port => 9994
  }
}

filter {
  if [type] == "genericJson2" {
    json {
      source => "message"
    }
  }
}

output {
  elasticsearch {
  }
}

and the following input:

{"date":"2018-02-27T13:21:41.3387552-05:00","level":"INFO","appname":"listenercore","logger":"Main","thread":"1","message":"test"}
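(For context, a line like this would typically come from a log4net `UdpAppender` with a JSON-shaped pattern layout, roughly like the sketch below. The pattern shown is illustrative, not taken from the question; the appender's `encoding` setting is what decides which bytes actually go over the wire.)

```xml
<appender name="UdpAppender" type="log4net.Appender.UdpAppender">
  <remoteAddress value="10.254.18.166" />
  <remotePort value="9994" />
  <!-- Assumption: an explicit UTF-8 encoding here would sidestep the
       UTF-16 decoding problem described further down. -->
  <encoding value="utf-8" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value='{"date":"%date","level":"%level","appname":"listenercore","logger":"%logger","thread":"%thread","message":"%message"}' />
  </layout>
</appender>
```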

I get the following result:

{
  "_index": "logstash-2018.02.27",
  "_type": "doc",
  "_id": "AWHYfh_qDl_9h030IXjC",
  "_score": 1,
  "_source": {
    "type": "genericJson2",
    "@timestamp": "2018-02-27T18:19:59.747Z",
    "host": "10.120.4.5",
    "@version": "1",
    "date": "{\"date\":\"2018-02-27T13:20:02.2113",
    "message": "{\"da",
    "logger": "{\"da",
    "thread": "{",
    "level": "{\"da",
    "appname": "{\"date\":\"201"
  },
  "fields": {
    "@timestamp": [
      "2018-02-27T18:19:59.747Z"
    ]
  }
}

What do I need to do to get my JSON logs parsed correctly?

Edit

I dug a little deeper. Running this from the command line:

sudo bin/logstash -e "input{stdin{type=>stdin}} filter{json {source=>message}} output{ stdout{ codec=>rubydebug } }"

produces the desired output:

{
    "@timestamp" => 2018-02-28T02:07:01.710Z,
          "host" => "Elastisearch01",
       "appname" => "listenercore",
        "logger" => "Main",
      "@version" => "1",
          "type" => "stdin",
          "date" => "2018-02-27T13:21:41.3387552-05:00",
         "level" => "INFO",
        "thread" => "1",
       "message" => "test"
}

So I wrote a quick Python UDP server to see what was going over the wire, and this is what I captured:

{ " d a t e " : " 2 0 1 8 - 0 2 - 2 7 T 2 1 : 0 6 : 0 4 . 7 4 6 1 3 4 6 - 0 5 : 0 0 " , " l e v e l " : " I N F O " , " a p p n a m e " : " l i s t e n e r c o r e " , " l o g g e r " : " M a i n " , " t h r e a d " : " 1 " ,   m e s s a g e " : " t e s t " }

There is an extra space between every character. I'm looking into text encodings, but I'm not sure yet.
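Those "spaces" are consistent with UTF-16LE text being read by a byte-oriented tool: UTF-16LE encodes each ASCII character as the character byte followed by a NUL byte, and the NULs render as blanks. A minimal sketch of the effect:

```python
# Why a UTF-16LE payload looks like "t e s t" to a byte-oriented reader:
# each ASCII character is encoded as <byte, NUL>, so every other byte is \x00.
payload = '{"message":"test"}'
utf16le = payload.encode('utf-16-le')

# Every odd-indexed byte is NUL -- these are the "extra spaces" in the capture.
assert utf16le[1::2] == b'\x00' * len(payload)

# Decoding with the correct charset recovers the original JSON intact.
assert utf16le.decode('utf-16-le') == payload
```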

Edit

I've all but confirmed this is an encoding issue. If I capture, decode, and retransmit the logs with this Python script, it fixes the problem:

import socket

UDP_IP_ADDRESS = "10.254.18.166"
UDP_PORT_NO = 9993

serverSock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
serverSock.bind((UDP_IP_ADDRESS, UDP_PORT_NO))

clientSock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    data, addr = serverSock.recvfrom(1024)
    # Re-encode the UTF-16 datagram as UTF-8 before forwarding;
    # sendto() requires bytes, not a decoded str.
    clientSock.sendto(data.decode('utf-16').encode('utf-8'), ("127.0.0.1", 9994))

How can I get Logstash to accept my UTF-16 input directly? I have tried this, but it doesn't work:

bin/logstash -e "input{udp{port=>9994 type=>stdin codec=>plain{charset=>'UTF-16'}}} filter{json {source=>message}} output{ stdout{ codec=>rubydebug } }"

Best Answer

Can you check with UTF-16LE instead of UTF-16?
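For reference, the corresponding input block would look something like this (a sketch assuming the sender emits UTF-16LE without a byte-order mark, which matches the capture above):

```
input {
  udp {
    port => 9994
    type => "genericJson2"
    codec => plain {
      charset => "UTF-16LE"
    }
  }
}
```

The likely reason plain `UTF-16` fails is that it generally needs a byte-order mark to determine endianness, and the captured payload has none, so the explicit little-endian variant works where the generic name does not.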

Regarding Logstash and JSON-style logs from log4net, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48954892/
