
scala - sbt assembly fails due to file conflicts


I am trying to build a fat JAR by running sbt assembly for my project.
I get the following error:

[error] (root/*:assembly) deduplicate: different file contents found in the following:
[error] /Users/xyz/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-core/jars/hadoop-mapreduce-client-core-2.2.0.jar:org/apache/hadoop/filecache/DistributedCache.class
[error] /Users/xyz/.ivy2/cache/org.apache.hadoop/hadoop-core/jars/hadoop-core-2.0.0-mr1-cdh4.7.1.jar:org/apache/hadoop/filecache/DistributedCache.class

DistributedCache in hadoop-mapreduce-client-core is now deprecated.
In my build.sbt I include:
"org.apache.hadoop" % "hadoop-client" % "2.0.0-mr1-cdh4.7.1" excludeAll(
ExclusionRule(organization = "javax.servlet"))
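Since both jars ship org/apache/hadoop/filecache/DistributedCache.class, one way to sidestep the deduplicate error is to exclude the conflicting module from whichever side of the conflict you do not want. This is only a sketch, assuming you want to keep the CDH hadoop-core copy of the class and drop the one from hadoop-mapreduce-client-core:

```scala
// build.sbt -- hedged sketch; adjust organization/name to whichever module you drop
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.0.0-mr1-cdh4.7.1" excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  // exclude the module that ships the duplicate DistributedCache.class
  ExclusionRule(organization = "org.apache.hadoop", name = "hadoop-mapreduce-client-core")
)
```

Run `sbt "show update"` or a dependency-graph plugin afterwards to confirm the conflicting jar is really gone from the classpath.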

The dependency tree looks like this:
org.apache.hadoop:hadoop-client:2.2.0 
org.apache.hadoop:hadoop-mapreduce-client-app:2.2.0
org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0

How should I handle this?

Thanks in advance!

Best Answer

If you mean to drop the dependency jars (such as mapreduce-client-app) that are pulled in when hadoop-client:2.2.0 loads, just add intransitive():

"org.apache.hadoop" % "hadoop-client" % "2.2.0" intransitive()

This will include only the hadoop-client:2.2.0 jar itself and exclude all of its transitive dependencies.
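As an alternative to pruning dependencies (a sketch, not part of the accepted answer), sbt-assembly can also resolve duplicate class files with a merge strategy; the exact key name varies by plugin version (`assemblyMergeStrategy in assembly` in newer releases, `mergeStrategy in assembly` in older ones):

```scala
// build.sbt -- sketch assuming a 0.12+ sbt-assembly; keeps the first copy of the
// duplicated DistributedCache.class and falls back to the default strategy otherwise
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "hadoop", "filecache", "DistributedCache.class") =>
    MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```

Note that `MergeStrategy.first` silently picks whichever jar the classpath lists first, so this only masks the conflict; excluding the unwanted jar is the cleaner fix when you know which copy you need.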

Regarding "scala - sbt assembly fails due to file conflicts", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/28428068/
