
scala - How to suppress Spark logging in unit tests?


So, thanks to easily googleable blogs, I tried the following:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] = loggers.map(loggerName => {
    val logger = Logger.getLogger(loggerName)
    val prevLevel = logger.getLevel
    logger.setLevel(level)
    loggerName -> prevLevel
  }).toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}

But unfortunately it doesn't work, and I still get a lot of Spark output, for example:
14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4

Best Answer

Add the following lines to a log4j.properties file inside the src/test/resources directory (create the file and directory if they don't exist yet):

# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
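
Note that these lines only tune individual logger levels; log4j 1.x still needs a root logger with an appender somewhere on the test classpath. If your project does not already provide one, the extra lines below are a minimal sketch (the appender name and pattern are illustrative, not part of the original answer):

# Illustrative root logger and console appender
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n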

When I run my unit tests (I'm using JUnit and Maven), I only receive WARN-level logs; in other words, the output is no longer cluttered with INFO-level messages (though they can sometimes be useful for debugging).
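
If you'd rather keep the suppression in test code than in a resource file, a variant of the asker's programmatic approach that targets the actual Spark logger names usually works as well. This is a minimal sketch assuming log4j 1.x on the test classpath; the trait, spec, and test names are illustrative:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

// Illustrative helper trait: quiet the noisy loggers before the SparkContext starts.
trait QuietSparkLogging {
  // Spark's loggers live under "org.apache.spark", not plain "spark".
  Seq("org.apache.spark", "org.eclipse.jetty", "akka", "Remoting")
    .foreach(name => Logger.getLogger(name).setLevel(Level.WARN))
}

class QuietSparkSpec extends Specification with QuietSparkLogging {
  sequential

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Quiet Test"))

  "a quiet Spark context" should {
    "still run jobs" in {
      sc.parallelize(1 to 10).sum() must beEqualTo(55.0)
    }
  }
}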

I hope this helps.

Regarding "scala - How to suppress Spark logging in unit tests?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27248997/
