I have a simple Spark application built with Gradle 2.3. The Spark guide says there is no need to bundle the Spark libraries, so I declared them as "runtime" dependencies in my build.gradle, like this:
apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'scala'
apply plugin: 'maven'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    runtime 'org.apache.spark:spark-core_2.10:1.3.1'
    runtime 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'
    testCompile group: 'junit', name: 'junit', version: '4.11'
}
However, compilation fails with the following errors:
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:3: error: object Logging is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{Logging, SparkContext, SparkConf}
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:5: error: not found: type Logging
[ant:scalac] trait DemoApp extends App with Logging {
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:14: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf(true)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:21: error: not found: type SparkContext
[ant:scalac] lazy val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/WordCountDemo.scala:3: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext._
[ant:scalac] ^
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type Logging
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type SparkConf
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:4: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:5: error: object rdd is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.rdd.PairRDDFunctions
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:10: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:14: error: not found: type SparkContext
[ant:scalac] val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:18: error: not found: type PairRDDFunctions
[ant:scalac] val func = new PairRDDFunctions(rdd)
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{SparkConf, SparkContext}
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:7: error: not found: type SparkConf
[ant:scalac] val conf = new SparkConf
[ant:scalac] ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:11: error: not found: type SparkContext
[ant:scalac] val sc = new SparkContext(conf)
[ant:scalac] ^
[ant:scalac] 16 errors found
Best answer
The provided / providedCompile configurations do not exist until some plugin or your build script creates them. You can use one of the plugins from nebula-plugins, or define the configuration yourself, like this:
configurations {
    provided
}

sourceSets {
    main {
        compileClasspath += [configurations.provided]
    }
}

dependencies {
    provided 'org.apache.hadoop:hadoop-core:2.5.0-mr1-cdh5.3.0'
    compile ...
    testCompile 'org.apache.mrunit:mrunit:1.0.0'
}

jar {
    doFirst {
        into('lib') { from configurations.runtime }
    }
}

idea {
    module {
        scopes.PROVIDED.plus += [configurations.provided]
    }
}
The jar block copies the runtime dependencies into the JAR's lib folder, which makes it easy to run as a Hadoop job. For Spark you would probably want to create a shaded ("uber") JAR instead; that bundles the compile dependencies (but not the provided ones).
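A minimal sketch of that approach applied to the build from the question (the Shadow plugin coordinates and version are my own choices for illustration, not part of the original answer):

```groovy
// Sketch: mark Spark as 'provided' and build an uber JAR with the Shadow
// plugin, so compile dependencies (e.g. the Cassandra connector) are
// bundled while the provided Spark classes are left to the cluster.
buildscript {
    repositories { jcenter() }
    dependencies {
        classpath 'com.github.jengelman.gradle.plugins:shadow:1.2.3'
    }
}

apply plugin: 'scala'
apply plugin: 'com.github.johnrengelman.shadow'

configurations {
    provided
}

sourceSets {
    main {
        compileClasspath += [configurations.provided]
    }
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    provided 'org.apache.spark:spark-core_2.10:1.3.1'
    provided 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'
}

// shadowJar packages the 'runtime' configuration by default, so the
// hand-made 'provided' dependencies stay out of the resulting JAR.
```

Running `gradle shadowJar` would then produce a JAR suitable for spark-submit, with Spark itself excluded.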
Regarding "gradle - Why don't runtime dependencies work in Gradle?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30364054/