
java - "Eclipse Plugin for Scala" error when compiling Spark classes


I am doing some simple Spark programming on CDH 5.1.0. I also have Eclipse Juno (the one that ships with the VM) with the Scala IDE plugin 2.10.0 installed. I am getting the following error in the IDE:

Bad symbolic reference. A signature in SparkContext.class refers to term io in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling SparkContext.class. SimpleApp.scala /MyScalaProject/src/com/test/spark1 line 10 Scala Problem

Code:

package com.test.spark1

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/Desktop/scala/sparktest.txt" // Should be some file on your system
    val conf = new org.apache.spark.SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // Cache the RDD, since it is scanned twice below
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    // .format must apply to the string itself, not to println's Unit result
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

I get the same error at line 10 (val conf = new org.apache.spark.SparkCon...) and again at line 15 (println...).

My project build path has /usr/lib/spark/assembly/lib/spark-assembly-1.0.0-cdh5.1.0-hadoop2.3.0-cdh5.1.0.jar, and I checked that all the classes required by this simple Scala program are there.
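One quick way to verify such a check is to probe the classpath for a class from the package the compiler reports as missing. A minimal sketch, assuming org.apache.hadoop.io.Writable as a representative class from the org.apache.hadoop.io package named in the error:

// Classpath probe: attempts to load a class from the package the
// "Bad symbolic reference" error says is unavailable.
object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    try {
      Class.forName("org.apache.hadoop.io.Writable")
      println("org.apache.hadoop.io is on the classpath")
    } catch {
      case _: ClassNotFoundException =>
        println("org.apache.hadoop.io is NOT on the classpath")
    }
  }
}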

Best Answer

The compilation error went away after adding the following jar to the build path:

hadoop-common-2.3.0-cdh5.1.0.jar

So the error was caused by a missing transitive dependency: the classes in the Spark assembly refer to Hadoop classes (such as those in org.apache.hadoop.io) that were not on the build path.
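As an alternative to hand-curating jars in the Eclipse build path, the same fix can be declared as an explicit dependency in a build tool. A minimal build.sbt sketch, assuming the CDH 5.1.0 artifact versions implied by the jar names above and Cloudera's public artifact repository (both are assumptions, not taken from the question):

// build.sbt -- minimal sketch; version strings mirror the CDH 5.1.0 jar names above
name := "MyScalaProject"

scalaVersion := "2.10.4"

// Cloudera's repository hosts the CDH-flavored Spark and Hadoop artifacts
resolvers += "cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "1.0.0-cdh5.1.0",
  // the transitive dependency that was missing from the Eclipse build path
  "org.apache.hadoop" %  "hadoop-common" % "2.3.0-cdh5.1.0"
)

With this in place, the build tool resolves hadoop-common (and its own dependencies) automatically instead of relying on a manually added jar.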

Regarding java - "Eclipse Plugin for Scala" error when compiling Spark classes, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/25355592/
