
unit-testing - sbt dependency management issue with hbase-testing-utility


I'm trying to run unit tests with scalatest, using the HBase testing utility to exercise development code locally. Getting the HBase testing utility set up in sbt is proving difficult: when I compile, I get the following error:

[warn]  module not found: org.apache.hbase#${compat.module};1.2.1
[warn] ==== local: tried
[warn] /root/.ivy2/local/org.apache.hbase/${compat.module}/1.2.1/ivys/ivy.xml
[warn] ==== public: tried
[warn] https://repo1.maven.org/maven2/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== Akka Repository: tried
[warn] http://repo.akka.io/releases/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== scala-tools: tried
[warn] https://oss.sonatype.org/content/groups/scala-tools/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== cloudera-repos: tried
[warn] https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== Sonatype OSS Snapshots: tried
[warn] https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.hbase#${compat.module};1.2.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.hbase:${compat.module}:1.2.1
[warn] +- org.apache.hbase:hbase-testing-util:1.2.1 (/workspace/spark/etl/built.sbt#L30-62)

[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hbase#${compat.module};1.2.1: not found
[error] Total time: 32 s, completed Apr 29, 2016 9:25:27 AM
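For context: ${compat.module} is a Maven property inside the hbase-testing-util POM. Maven expands it to the matching compat artifact (hbase-hadoop2-compat on Hadoop 2) through a build profile, but Ivy, which sbt used for resolution here, does not evaluate Maven profiles, so the literal placeholder leaks through as an artifact name. Since the build below already lists both compat modules explicitly, one possible workaround is to pull in hbase-testing-util without its transitive dependencies. A minimal sketch, not verified against this exact build:

// Sketch: skip the unresolvable ${compat.module} transitive dependency.
// This only works because the compat modules and the other HBase/Hadoop
// test jars are already declared explicitly in libraryDependencies.
"org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test" intransitive()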

My build.sbt file is as follows:

val hbaseVersion = "1.2.1"
val sparkVersion = "1.6.1"
val hadoopVersion = "2.7.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.apache.hbase" % "hbase" % hbaseVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion classifier "tests",
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion classifier "tests",
  "org.apache.hbase" % "hbase-annotations" % hbaseVersion,
  "org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test",
  "org.apache.hadoop" % "hadoop-minicluster" % hadoopVersion,
  "org.apache.hadoop" % "hadoop-mapreduce-client-jobclient" % hadoopVersion classifier "tests",
  "org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion,
  "org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion classifier "tests",
  "org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion,
  "org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion classifier "tests",
  "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion,
  "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion classifier "tests",
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion,
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion classifier "tests",
  "org.apache.hadoop" % "hadoop-annotations" % hadoopVersion,
  "org.scalatest" %% "scalatest" % "2.2.6" % "test",
  //"org.scalacheck" %% "scalacheck" % "1.12.5" % "test",
  "com.cloudera.sparkts" % "sparkts" % "0.3.0",
  "com.ecwid.consul" % "consul-api" % "1.1.9",
  "joda-time" % "joda-time" % "2.7"
)

resolvers ++= Seq(
  "Akka Repository" at "http://repo.akka.io/releases/",
  "scala-tools" at "https://oss.sonatype.org/content/groups/scala-tools",
  "cloudera-repos" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
  "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
)

Does anyone know why this failure occurs?

Best Answer

Sorry for the late reply. I couldn't get it working as-is, so I changed the versions like this:

val sparkVersion = "1.6.1"
val hbaseVersion = "1.2.0-cdh5.7.0"
val hadoopVersion = "2.6.0-cdh5.7.0"
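Note that these CDH-suffixed artifacts are published only to Cloudera's repository. The original build already lists it, but for anyone copying just this answer, the resolver line it relies on is:

resolvers += "cloudera-repos" at "https://repository.cloudera.com/artifactory/cloudera-repos/"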

This led to more headaches. Because older guava methods are referenced, I had to force guava back to an earlier version, so this was required:

"com.google.guava" % "guava" % "14.0" force()

(I think any version up to 16.0 is fine.)
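As an aside (not part of the original answer): on newer sbt versions, dependencyOverrides is a gentler alternative to force() for pinning a version. A rough sketch:

// Sketch: pin guava via dependencyOverrides instead of force().
// This only influences version selection during conflict resolution;
// it does not add guava as a dependency by itself.
dependencyOverrides += "com.google.guava" % "guava" % "14.0"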

The following had to be commented out:

// "com.cloudera" % "spark-hbase" % "0.0.2-clabs",

(This line was not in the original question.)

Finally, it looks like the original problem was a genuine bug that needed fixing; see the issue below (thanks to David Portabella for the reference):

https://issues.apache.org/jira/browse/HBASE-15925

It was fixed in release 1.2.2.
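Given that fix, the simplest route is presumably to skip the CDH detour entirely and just raise the HBase version in the original build:

val hbaseVersion = "1.2.2" // per HBASE-15925, fixed as of 1.2.2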

Regarding "unit-testing - sbt dependency management issue with hbase-testing-utility", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/36943563/
