
java - Spark forces log4j

Reposted — Author: 可可西里, updated: 2023-11-01 14:12:51

I have a simple Spark project in Scala and want to use logback, but spark/hadoop seems to force log4j on me.

  1. This seems inconsistent with my understanding of slf4j's purpose; isn't this an oversight in spark/hadoop?

  2. Do I have to give up logback and use log4j? Is there a workaround?

In build.sbt I tried the following exclusions ...

"org.apache.spark" %% "spark-core" % "1.4.1" excludeAll(
ExclusionRule(name = "log4j"),
ExclusionRule(name = "slf4j-log4j12")
),
"org.slf4j" % "slf4j-api" % "1.7.12",
"ch.qos.logback" % "logback-core" % "1.1.3",
"ch.qos.logback" % "logback-classic" % "1.1.3"

...but this leads to an exception...

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Level
at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:354)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1659)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:91)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:235)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2162)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:301)
at spike.HelloSpark$.main(HelloSpark.scala:19)
at spike.HelloSpark.main(HelloSpark.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 20 more

Best answer

I ran into the same exception as you. I think that, in addition to excluding log4j and slf4j-log4j12, you should also add log4j-over-slf4j as a dependency. That worked for me.

log4j-over-slf4j is a drop-in replacement for log4j: it exposes exactly the same API as log4j, but routes all log4j calls to slf4j, which in turn routes everything to the underlying logging framework. https://www.slf4j.org/legacy.html gives a detailed explanation.
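Putting the pieces together, the dependency section of build.sbt could look like the sketch below. Versions are taken from the question; pinning log4j-over-slf4j to the same 1.7.12 as slf4j-api is an assumption (the bridge and the API should come from the same slf4j release line). The bridge supplies the `org.apache.log4j.Level` class that the stack trace complains about, so Hadoop's direct log4j calls no longer fail:

```scala
// build.sbt (sketch, assumed versions): exclude the log4j binding that
// spark-core pulls in, add the log4j-over-slf4j bridge, and let logback
// be the actual backend behind slf4j.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1" excludeAll(
    ExclusionRule(organization = "log4j"),                             // drop log4j itself
    ExclusionRule(organization = "org.slf4j", name = "slf4j-log4j12")  // drop the slf4j -> log4j binding
  ),
  "org.slf4j" % "slf4j-api"        % "1.7.12",
  "org.slf4j" % "log4j-over-slf4j" % "1.7.12",  // provides org.apache.log4j.* classes, delegating to slf4j
  "ch.qos.logback" % "logback-core"    % "1.1.3",
  "ch.qos.logback" % "logback-classic" % "1.1.3"
)
```

With this setup there is exactly one slf4j binding (logback-classic) on the classpath, and legacy log4j calls from Hadoop are rerouted through the bridge instead of hitting a missing class.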

Regarding "java - Spark forces log4j", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31766174/
