
scala - How do I create an SQLContext in Spark using Scala?


I am using sbt to build a Scala program that creates an SQLContext. Here is my build.sbt:

name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"

Here is the test program:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
}

I get the following error:
Error:(8, 26) overloaded method constructor SQLContext with alternatives:
(sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
cannot be applied to (org.apache.spark.SparkContext.type)
val sqlcontexttest = new SQLContext(sc)

Can anyone tell me what the problem is? I am new to Scala and Spark programming.

Best Answer

You need to new your SparkContext; that should fix it. In your program, val sc = SparkContext assigns the companion object (note the SparkContext.type in the error message) rather than a SparkContext instance, so neither SQLContext constructor overload matches.
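For example, here is a minimal sketch of the corrected program against the Spark 1.5.2 API declared in your build.sbt; the app name and the local master URL are illustrative placeholders, not values taken from your setup:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]) {
    // Configure and instantiate the SparkContext with new
    // (app name and local[*] master are placeholders for illustration)
    val conf = new SparkConf().setAppName("sampleScalaProject").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // sc is now an actual SparkContext instance, so the
    // SQLContext(sparkContext: SparkContext) overload resolves
    val sqlcontext = new SQLContext(sc)
  }
}

With the context created this way, sqlcontext can then be used to create DataFrames and run SQL queries.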

Regarding scala - How do I create an SQLContext in Spark using Scala?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34387889/
