
json - In Logstash, how do I limit the depth of the JSON properties in my logs that get converted into indexed fields in Elasticsearch?


I am new to the Elastic Stack. I am using Logstash 6.4.0 to load JSON log data from Filebeat 6.4.0 into Elasticsearch 6.4.0. Now that I have started using Kibana 6.4.0, I have found that far too many JSON properties are being converted into fields.

I know this because when I navigate to Kibana Discover and enter my index pattern, logstash-*, I get an error message that says:

Discover: Trying to retrieve too many docvalue_fields. Must be less than or equal to: [100] but was [106]. This limit can be set by changing the [index.max_docvalue_fields_search] index level setting.
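
As a stopgap, I could presumably raise that limit with a dynamic index settings update, something like the sketch below (the value 150 is just an example), but that would only hide the symptom rather than reduce the number of fields being created:

PUT logstash-*/_settings
{
  "index.max_docvalue_fields_search": 150
}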



If I navigate to Management > Kibana > Index Patterns, I see that I have 940 fields. It appears that every child property of my root JSON object (and many of those child properties have JSON objects as values, and so on) is automatically parsed and used to create a field in my Elasticsearch logstash-* index.

So here is my question: how can I limit this automatic field creation? Is it possible to do it by property depth? Is it possible some other way?

Here is my Filebeat configuration (minus the comments):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - d:/clients/company-here/rpms/logs/rpmsdev/*.json
  json.keys_under_root: true
  json.add_error_key: true

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
  hosts: ["localhost:5044"]

Here is my current Logstash pipeline configuration:
input {
  beats {
    port => "5044"
  }
}
filter {
  date {
    match => [ "@timestamp" , "ISO8601"]
  }
}
output {
  stdout {
    #codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

Here is a sample of a single log message that I am shipping (one line of my log file). Note that the JSON is completely dynamic and can change depending on what is being logged:
{
"@timestamp": "2018-09-06T14:29:32.128",
"level": "ERROR",
"logger": "RPMS.WebAPI.Filters.LogExceptionAttribute",
"message": "Log Exception: RPMS.WebAPI.Entities.LogAction",
"eventProperties": {
"logAction": {
"logActionId": 26268916,
"performedByUserId": "b36778be-6181-4b69-a0fe-e3a975ddcdd7",
"performedByUserName": "test.sga.danny@domain.net",
"performedByFullName": "Mike Manley",
"controller": "RpmsToMainframeOperations",
"action": "UpdateStoreItemPricing",
"actionDescription": "Exception while updating store item pricing for store item with storeItemId: 146926. An error occurred while sending the request. InnerException: Unable to connect to the remote server InnerException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 10.1.1.133:8800",
"url": "http://localhost:49399/api/RpmsToMainframeOperations/UpdateStoreItemPricing/146926",
"verb": "PUT",
"statusCode": 500,
"status": "Internal Server Error - Exception",
"request": {
"itemId": 648,
"storeId": 13,
"storeItemId": 146926,
"changeType": "price",
"book": "C",
"srpCode": "",
"multi": 0,
"price": "1.27",
"percent": 40,
"keepPercent": false,
"keepSrp": false
},
"response": {
"exception": {
"ClassName": "System.Net.Http.HttpRequestException",
"Message": "An error occurred while sending the request.",
"Data": null,
"InnerException": {
"ClassName": "System.Net.WebException",
"Message": "Unable to connect to the remote server",
"Data": null,
"InnerException": {
"NativeErrorCode": 10060,
"ClassName": "System.Net.Sockets.SocketException",
"Message": "A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond",
"Data": null,
"InnerException": null,
"HelpURL": null,
"StackTraceString": " at System.Net.Sockets.Socket.InternalEndConnect(IAsyncResult asyncResult)\r\n at System.Net.Sockets.Socket.EndConnect(IAsyncResult asyncResult)\r\n at System.Net.ServicePoint.ConnectSocketInternal(Boolean connectFailure, Socket s4, Socket s6, Socket& socket, IPAddress& address, ConnectSocketState state, IAsyncResult asyncResult, Exception& exception)",
"RemoteStackTraceString": null,
"RemoteStackIndex": 0,
"ExceptionMethod": "8\nInternalEndConnect\nSystem, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\nSystem.Net.Sockets.Socket\nVoid InternalEndConnect(System.IAsyncResult)",
"HResult": -2147467259,
"Source": "System",
"WatsonBuckets": null
},
"HelpURL": null,
"StackTraceString": " at System.Net.HttpWebRequest.EndGetRequestStream(IAsyncResult asyncResult, TransportContext& context)\r\n at System.Net.Http.HttpClientHandler.GetRequestStreamCallback(IAsyncResult ar)",
"RemoteStackTraceString": null,
"RemoteStackIndex": 0,
"ExceptionMethod": "8\nEndGetRequestStream\nSystem, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\nSystem.Net.HttpWebRequest\nSystem.IO.Stream EndGetRequestStream(System.IAsyncResult, System.Net.TransportContext ByRef)",
"HResult": -2146233079,
"Source": "System",
"WatsonBuckets": null
},
"HelpURL": null,
"StackTraceString": " at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()\r\n at RPMS.WebAPI.Infrastructure.RpmsToMainframe.RpmsToMainframeOperationsManager.<PerformOperationInternalAsync>d__14.MoveNext() in D:\\Century\\Clients\\PigglyWiggly\\RPMS\\PWADC.RPMS\\RPMSDEV\\RPMS.WebAPI\\Infrastructure\\RpmsToMainframe\\RpmsToMainframeOperationsManager.cs:line 114\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()\r\n at RPMS.WebAPI.Infrastructure.RpmsToMainframe.RpmsToMainframeOperationsManager.<PerformOperationAsync>d__13.MoveNext() in D:\\Century\\Clients\\PigglyWiggly\\RPMS\\PWADC.RPMS\\RPMSDEV\\RPMS.WebAPI\\Infrastructure\\RpmsToMainframe\\RpmsToMainframeOperationsManager.cs:line 96\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()\r\n at RPMS.WebAPI.Controllers.RpmsToMainframe.RpmsToMainframeOperationsController.<UpdateStoreItemPricing>d__43.MoveNext() in D:\\Century\\Clients\\PigglyWiggly\\RPMS\\PWADC.RPMS\\RPMSDEV\\RPMS.WebAPI\\Controllers\\RpmsToMainframe\\RpmsToMainframeOperationsController.cs:line 537\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Threading.Tasks.TaskHelpersExtensions.<CastToObject>d__1`1.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Controllers.ApiControllerActionInvoker.<InvokeActionAsyncCore>d__1.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__6.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__6.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__5.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__6.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__6.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__5.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__5.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__3.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Controllers.AuthenticationFilterResult.<ExecuteAsync>d__5.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at System.Web.Http.Controllers.ExceptionFilterResult.<ExecuteAsync>d__6.MoveNext()",
"RemoteStackTraceString": null,
"RemoteStackIndex": 0,
"ExceptionMethod": "8\nThrowForNonSuccess\nmscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\nSystem.Runtime.CompilerServices.TaskAwaiter\nVoid ThrowForNonSuccess(System.Threading.Tasks.Task)",
"HResult": -2146233088,
"Source": "mscorlib",
"WatsonBuckets": null,
"SafeSerializationManager": {
"m_serializedStates": [{

}]
},
"CLR_SafeSerializationManager_RealType": "System.Net.Http.HttpRequestException, System.Net.Http, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
}
},
"performedAt": "2018-09-06T14:29:32.1195316-05:00"
}
},
"logAction": "RPMS.WebAPI.Entities.LogAction"
}

Best Answer

I never did find a way to limit the automatic field creation by depth. I also posted my question in the Elastic forums and did not get a reply. Since then, I have learned quite a bit more about Logstash.

My eventual solution was to extract the JSON properties I need into named fields and then use the GREEDYDATA pattern in a grok filter to dump the remaining properties into an unextractedJson field, so that I can still query the values in that field in Elasticsearch.

Here is my new Filebeat configuration (minus the comments):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - d:/clients/company-here/rpms/logs/rpmsdev/*.json
  #json.keys_under_root: true
  json.add_error_key: true

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
  hosts: ["localhost:5044"]

Note that I commented out the json.keys_under_root setting, so Filebeat now puts each JSON-formatted log entry under a json field before sending it to Logstash.
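
For illustration, with that setting commented out, an event arrives in Logstash with the decoded object nested under a json field rather than at the root, roughly like this (a hand-trimmed sketch, not actual output):

{
  "source": "d:/clients/company-here/rpms/logs/rpmsdev/actionsCurrent.json",
  "json": {
    "level": "ERROR",
    "logger": "RPMS.WebAPI.Filters.LogExceptionAttribute",
    "eventProperties": { "logAction": { "...": "..." } }
  }
}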

Here is a snippet of my new Logstash pipeline configuration:
#...

filter {

  ###########################################################################
  # common date time extraction
  date {
    match => ["[json][time]", "ISO8601"]
    remove_field => ["[json][time]"]
  }

  ###########################################################################
  # configuration for the actions log
  if [source] =~ /actionsCurrent.json/ {

    if ("" in [json][eventProperties][logAction][performedByUserName]) {
      mutate {
        add_field => {
          "performedByUserName" => "%{[json][eventProperties][logAction][performedByUserName]}"
          "performedByFullName" => "%{[json][eventProperties][logAction][performedByFullName]}"
        }
        remove_field => [
          "[json][eventProperties][logAction][performedByUserName]",
          "[json][eventProperties][logAction][performedByFullName]"]
      }
    }

    mutate {
      add_field => {
        "logFile" => "actions"
        "logger" => "%{[json][logger]}"
        "level" => "%{[json][level]}"
        "performedAt" => "%{[json][eventProperties][logAction][performedAt]}"
        "verb" => "%{[json][eventProperties][logAction][verb]}"
        "url" => "%{[json][eventProperties][logAction][url]}"
        "controller" => "%{[json][eventProperties][logAction][controller]}"
        "action" => "%{[json][eventProperties][logAction][action]}"
        "actionDescription" => "%{[json][eventProperties][logAction][actionDescription]}"
        "statusCode" => "%{[json][eventProperties][logAction][statusCode]}"
        "status" => "%{[json][eventProperties][logAction][status]}"
      }
      remove_field => [
        "[json][logger]",
        "[json][level]",
        "[json][eventProperties][logAction][performedAt]",
        "[json][eventProperties][logAction][verb]",
        "[json][eventProperties][logAction][url]",
        "[json][eventProperties][logAction][controller]",
        "[json][eventProperties][logAction][action]",
        "[json][eventProperties][logAction][actionDescription]",
        "[json][eventProperties][logAction][statusCode]",
        "[json][eventProperties][logAction][status]",
        "[json][logAction]",
        "[json][message]"
      ]
    }

    mutate {
      convert => {
        "statusCode" => "integer"
      }
    }

    grok {
      match => { "json" => "%{GREEDYDATA:unextractedJson}" }
      remove_field => ["json"]
    }

  }

# ...

Note the add_field option in the mutate filters, which extracts properties into named fields, followed by the remove_field option, which removes those properties from the JSON. At the end of the filter snippet, note the grok filter that gobbles up the rest of the JSON and puts it into the unextractedJson field. Finally, and most importantly, I remove the json field provided by Filebeat. That last part is what saves me from exposing all of that JSON data to Elasticsearch/Kibana as individual fields.
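
The output section is elided above ("# ..."); one way to get per-log-type daily indices like the actions-2018.09.13 index shown below is to build the index name from the logFile field added in the filter, for example (a sketch, not my exact configuration):

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    # assumed naming convention: logFile field plus event date, e.g. actions-2018.09.13
    index => "%{logFile}-%{+YYYY.MM.dd}"
  }
}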

This solution takes log entries that look like this:
{ "time": "2018-09-13T13:36:45.376", "level": "DEBUG", "logger": "RPMS.WebAPI.Filters.LogActionAttribute", "message": "Log Action: RPMS.WebAPI.Entities.LogAction", "eventProperties": {"logAction": {"logActionId":26270372,"performedByUserId":"83fa1d72-fac2-4184-867e-8c2935a262e6","performedByUserName":"rpmsadmin@domain.net","performedByFullName":"Super Admin","clientIpAddress":"::1","controller":"Account","action":"Logout","actionDescription":"Logout.","url":"http://localhost:49399/api/Account/Logout","verb":"POST","statusCode":200,"status":"OK","request":null,"response":null,"performedAt":"2018-09-13T13:36:45.3707739-05:00"}}, "logAction": "RPMS.WebAPI.Entities.LogAction" }

and turns them into Elasticsearch documents that look like this:
{
  "_index": "actions-2018.09.13",
  "_type": "doc",
  "_id": "xvA41GUBIzzhuC5epTZG",
  "_version": 1,
  "_score": null,
  "_source": {
    "level": "DEBUG",
    "tags": [
      "beats_input_raw_event"
    ],
    "@timestamp": "2018-09-13T18:36:45.376Z",
    "status": "OK",
    "unextractedJson": "{\"eventProperties\"=>{\"logAction\"=>{\"performedByUserId\"=>\"83fa1d72-fac2-4184-867e-8c2935a262e6\", \"logActionId\"=>26270372, \"clientIpAddress\"=>\"::1\"}}}",
    "action": "Logout",
    "source": "d:\\path\\actionsCurrent.json",
    "actionDescription": "Logout.",
    "offset": 136120,
    "@version": "1",
    "verb": "POST",
    "statusCode": 200,
    "controller": "Account",
    "performedByFullName": "Super Admin",
    "logger": "RPMS.WebAPI.Filters.LogActionAttribute",
    "input": {
      "type": "log"
    },
    "url": "http://localhost:49399/api/Account/Logout",
    "logFile": "actions",
    "host": {
      "name": "Development5"
    },
    "prospector": {
      "type": "log"
    },
    "performedAt": "2018-09-13T13:36:45.3707739-05:00",
    "beat": {
      "name": "Development5",
      "hostname": "Development5",
      "version": "6.4.0"
    },
    "performedByUserName": "rpmsadmin@domain.net"
  },
  "fields": {
    "@timestamp": [
      "2018-09-13T18:36:45.376Z"
    ],
    "performedAt": [
      "2018-09-13T18:36:45.370Z"
    ]
  },
  "sort": [
    1536863805376
  ]
}
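
Values that end up in unextractedJson remain searchable as ordinary text. For example, a query along these lines (a sketch against the index shown above) finds documents whose leftover JSON mentions clientIpAddress:

GET actions-2018.09.13/_search
{
  "query": {
    "match": {
      "unextractedJson": "clientIpAddress"
    }
  }
}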

Regarding "json - In Logstash, how do I limit the depth of the JSON properties in my logs that get converted into indexed fields in Elasticsearch?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/52212228/
