
json - Output the Spark application ID in logs using Log4j


I have a custom Log4j file for a Spark application. I would like to output the Spark application ID along with other attributes such as the message and date, so the JSON string structure looks like this:

{"name":,"time":,"date":,"level":,"thread":,"message":,"app_id":}

Currently, the structure looks like this:

{"name":,"time":,"date":,"level":,"thread":,"message":}

How can I define such a layout for the Spark driver logs?

My log4j file looks like this:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>

  <appender name="Json" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.hadoop.log.Log4Json">
      <param name="ConversionLayout" value=""/>
    </layout>
  </appender>

  <root>
    <level value="INFO"/>
    <appender-ref ref="Json"/>
  </root>
</log4j:configuration>

Best Answer

I doubt that org.apache.hadoop.log.Log4Json can be adapted for this. Judging from its javadoc and source code, doing so would be rather cumbersome.

Although it looks like you are using Log4j 1.x, its API is quite flexible, and we can easily define our own layout by extending org.apache.log4j.Layout.

We need a case class that will be converted to JSON according to the target structure:

case class LoggedMessage(name: String,
                         appId: String,
                         thread: String,
                         time: Long,
                         level: String,
                         message: String)
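
As a quick sanity check of the serialization, here is a sketch (assuming json4s is on the classpath; the values are made up). Note that json4s emits the case class field names verbatim, which is why the sample output further down shows the key appId rather than the app_id key from the question; renaming the field would change the key:

import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write

implicit val formats: DefaultFormats.type = DefaultFormats
write(LoggedMessage("TODO", "local-123", "main", 1550247708149L, "INFO", "hi"))
// => {"name":"TODO","appId":"local-123","thread":"main","time":1550247708149,"level":"INFO","message":"hi"}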

The Layout can be extended as follows. To access the value of "app_id", we will use Log4j's Mapped Diagnostic Context (MDC):

import org.apache.log4j.Layout
import org.apache.log4j.spi.LoggingEvent
import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write

class JsonLoggingLayout extends Layout {
  // required by the API
  override def ignoresThrowable(): Boolean = false

  // required by the API
  override def activateOptions(): Unit = { /* nothing */ }

  override def format(event: LoggingEvent): String = {
    // we are using json4s for JSON serialization
    implicit val formats: DefaultFormats.type = DefaultFormats

    // retrieve app_id from the Mapped Diagnostic Context
    val appId = event.getMDC("app_id") match {
      case null => "[no_app]" // messages logged outside our app
      case defined: AnyRef => defined.toString
    }

    val message = LoggedMessage(
      "TODO", // placeholder from the original answer; event.getLoggerName is a natural choice
      appId,
      event.getThreadName, // thread captured on the event itself, safe even with async appenders
      event.getTimeStamp,
      event.getLevel.toString,
      event.getRenderedMessage) // standard accessor for the formatted message text
    write(message) + "\n"
  }
}
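
To try the layout without a full Spark job, it can be wired up programmatically. A minimal smoke-test sketch (the object, logger name, and MDC value here are made up for illustration):

import org.apache.log4j.{ConsoleAppender, Level, Logger, MDC}

object LayoutSmokeTest extends App {
  val root = Logger.getRootLogger
  root.removeAllAppenders()
  root.addAppender(new ConsoleAppender(new JsonLoggingLayout()))
  root.setLevel(Level.INFO)

  MDC.put("app_id", "test-app-123")
  Logger.getLogger("smoke.test").info("hello")
  // prints something like: {"name":"TODO","appId":"test-app-123","thread":"main",...}
}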

Finally, when the Spark session is created, we put the app_id value into the MDC:

import org.apache.log4j.{Logger, MDC}
import org.apache.spark.sql.SparkSession

// create the Spark session (the app name here is just illustrative)
val session = SparkSession.builder().appName("json-logging-demo").getOrCreate()
val logger = Logger.getLogger(getClass)

MDC.put("app_id", session.sparkContext.applicationId)

logger.info("-------- this is info --------")
logger.warn("-------- THIS IS A WARNING --------")
logger.error("-------- !!! ERROR !!! --------")

This produces the following log lines:

{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708149,"level":"INFO","message":"-------- this is info --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"WARN","message":"-------- THIS IS A WARNING --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"ERROR","message":"-------- !!! ERROR !!! --------"}

And of course, don't forget to reference the implementation in the log4j configuration XML:

<appender name="Json" class="org.apache.log4j.ConsoleAppender">
  <layout class="stackoverflow.q54706582.JsonLoggingLayout" />
</appender>
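
A deployment note (not part of the original answer): the driver JVM must be able to load both the custom layout class and this XML at startup, so in a typical spark-submit run the jar containing JsonLoggingLayout goes on the driver classpath (e.g. via --jars or --driver-class-path) and the file is selected with the standard Log4j 1.x system property, e.g. --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.xml"; the exact wiring depends on the deployment mode.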

Regarding json - outputting the Spark application ID in logs using Log4j, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54706582/
