
gradle - Why don't runtime dependencies work in Gradle?


I have a simple Spark application and Gradle 2.3. The Spark guide says there is no need to bundle the Spark libraries, so I declared them as "runtime" dependencies in build.gradle, like this:

apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'scala'
apply plugin: 'maven'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    runtime 'org.apache.spark:spark-core_2.10:1.3.1'
    runtime 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'

    testCompile group: 'junit', name: 'junit', version: '4.11'
}

However, when I run the "classes" task I get the errors below, which means the compiler cannot find the Spark JARs. I also tried "provided" and "providedCompile", but those fail with "Could not find method provided() / providedCompile()". (A small diagnostic sketch that shows why this happens follows the error output.)
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:3: error: object Logging is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{Logging, SparkContext, SparkConf}
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:5: error: not found: type Logging
[ant:scalac] trait DemoApp extends App with Logging {
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:14: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf(true)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:21: error: not found: type SparkContext
[ant:scalac] lazy val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/WordCountDemo.scala:3: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext._
[ant:scalac] ^
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type Logging
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type SparkConf
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:4: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:5: error: object rdd is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.rdd.PairRDDFunctions
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:10: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:14: error: not found: type SparkContext
[ant:scalac] val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:18: error: not found: type PairRDDFunctions
[ant:scalac] val func = new PairRDDFunctions(rdd)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{SparkConf, SparkContext}
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:7: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:11: error: not found: type SparkContext
[ant:scalac] val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] 16 errors found
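
A minimal way to see why this happens, assuming the build file above: with the Java plugin, the runtime configuration extends compile, but not the other way around, so runtime-only dependencies never reach the compile classpath that scalac uses. A hypothetical diagnostic task (not part of the original build) makes this visible:

// Hypothetical helper task, for illustration only: prints the resolved compile
// classpath. With Spark declared as 'runtime', its JARs are absent here, which
// is why scalac cannot see org.apache.spark.* during the 'classes' task.
task printCompileClasspath {
    doLast {
        configurations.compile.each { println it.name }
    }
}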

Best Answer

The provided / providedCompile configurations do not exist until some plugin or the build script itself creates them. You can use the plugins from nebula-plugins, or define the configuration yourself like this:

// Define a custom 'provided' configuration and add it to the compile classpath.
configurations {
    provided
}

sourceSets {
    main {
        compileClasspath += configurations.provided
    }
}

dependencies {
    provided 'org.apache.hadoop:hadoop-core:2.5.0-mr1-cdh5.3.0'
    compile ...
    testCompile 'org.apache.mrunit:mrunit:1.0.0'
}

// Copy the runtime dependencies into the JAR's lib folder.
jar {
    doFirst {
        into('lib') { from configurations.runtime }
    }
}

// Let IDEA treat the custom configuration as PROVIDED scope.
idea {
    module {
        scopes.PROVIDED.plus += [configurations.provided]
    }
}
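
Applied back to the question's build file, the Spark modules would then be declared against the new configuration. This is only a sketch and assumes the configurations { provided } block above has been added:

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    // On the compile classpath for scalac, but not packaged with the application.
    provided 'org.apache.spark:spark-core_2.10:1.3.1'
    provided 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'

    testCompile group: 'junit', name: 'junit', version: '4.11'
}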

That snippet also copies the runtime libraries into the JAR's lib folder so it is easy to run as a Hadoop job. For Spark you will probably want to build a shaded ("uber") JAR instead, which bundles the compile dependencies but leaves out the provided ones.
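
As a rough sketch of the shaded-JAR approach (the Shadow plugin coordinates, version and exclusion pattern are assumptions, not part of the original answer):

// Assumed setup: Shadow plugin 1.2.x, which targeted Gradle 2.x.
buildscript {
    repositories { jcenter() }
    dependencies {
        classpath 'com.github.jengelman.gradle.plugins:shadow:1.2.3'
    }
}

apply plugin: 'com.github.johnrengelman.shadow'

shadowJar {
    // Spark is supplied by the cluster at run time, so keep it out of the uber JAR.
    dependencies {
        exclude(dependency('org.apache.spark:.*:.*'))
    }
}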

Regarding "gradle - Why don't runtime dependencies work in Gradle?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/30364054/
