scala - saveToCassandra could not find implicit value for parameter rwf


I am trying to save a dataset to a Cassandra database with Spark in Scala, but I get an exception when running the code. I followed this tutorial: http://rustyrazorblade.com/2015/01/introduction-to-spark-cassandra/

The error:

could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[FoodToUserIndex]
food_index.saveToCassandra("tutorial", "food_to_user_index")
           ^

The .scala code:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

def main(args: Array[String]): Unit = {

  // Spark configuration pointing at a local Cassandra node
  val conf = new SparkConf(true)
    .set("spark.cassandra.connection.host", "localhost")
    .set("spark.executor.memory", "1g")
    .set("spark.cassandra.connection.native.port", "9042")
  val sc = new SparkContext(conf)

  case class FoodToUserIndex(food: String, user: String)

  // Read name and favorite_food from the tutorial.user table
  val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user").select("favorite_food", "name")

  // Build (food, user) pairs and write them to tutorial.food_to_user_index
  val food_index = user_table.map(r => new FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
  food_index.saveToCassandra("tutorial", "food_to_user_index")
}
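
To run this end to end, the tutorial keyspace and both tables have to exist first. A minimal sketch of the schema setup, assuming text columns based on the getString calls above (the exact types are not shown in the question):

import org.apache.spark.SparkConf
import com.datastax.spark.connector.cql.CassandraConnector

// Hypothetical schema setup; column types are assumptions, not taken from the tutorial.
val cassandraConf = new SparkConf(true).set("spark.cassandra.connection.host", "localhost")
CassandraConnector(cassandraConf).withSessionDo { session =>
  session.execute(
    "CREATE KEYSPACE IF NOT EXISTS tutorial " +
      "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}")
  session.execute(
    "CREATE TABLE IF NOT EXISTS tutorial.user (name text PRIMARY KEY, favorite_food text)")
  session.execute(
    "CREATE TABLE IF NOT EXISTS tutorial.food_to_user_index " +
      "(food text, user text, PRIMARY KEY (food, user))")
}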

build.sbt:
name := "intro_to_spark"

version := "1.0"

scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"

If I change the Scala version to 2.10 and the connector version to 1.1.0, it works, but I need to use Scala 2.11:
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

Best Answer

Moving case class FoodToUserIndex(food: String, user: String) outside the main function should fix the problem. The connector derives the implicit RowWriterFactory from the case class's fields, and that derivation does not work for a class defined locally inside a method, which is why the compiler reports a missing implicit; declaring the class at the top level of the file (or inside an object) lets the implicit lookup succeed.
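
For reference, a minimal sketch of the corrected layout (the enclosing object name Main is illustrative; the imports and settings are taken from the question):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// Top-level case class: the connector can now derive the implicit RowWriterFactory for it.
case class FoodToUserIndex(food: String, user: String)

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "localhost")
      .set("spark.executor.memory", "1g")
      .set("spark.cassandra.connection.native.port", "9042")
    val sc = new SparkContext(conf)

    val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user")
      .select("favorite_food", "name")

    val food_index = user_table.map(r =>
      FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
    food_index.saveToCassandra("tutorial", "food_to_user_index")
  }
}

saveToCassandra writes each FoodToUserIndex instance as one row, matching the food and user fields to the columns of food_to_user_index by name.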

Regarding "scala - saveToCassandra could not find implicit value for parameter rwf", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/29767558/
