
r - Converting a Spark DataFrame to a SparklyR table "tbl_spark"

Reposted · Author: 行者123 · Updated: 2023-12-01 22:41:30

I am trying to convert a Spark DataFrame (org.apache.spark.sql.DataFrame) into a sparklyr table (tbl_spark). I tried using sdf_register, but it fails with the error below.

Here, df is the Spark DataFrame.

sdf_register(df, name = "my_tbl")

The error is:

Error: org.apache.spark.sql.AnalysisException: Table not found: my_tbl; line 2 pos 17
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:306)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:315)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:310)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:281)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)

Am I missing something? Or is there a better way to convert it to a tbl_spark?

Thanks!

Best Answer

Use sdf_copy_to() or dplyr::copy_to(), e.g. my_tbl <- sdf_copy_to(sc, df, "my_tbl")
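A minimal sketch of that approach, assuming an active sparklyr connection named sc (a local connection is used here for illustration) and a built-in data frame in place of the asker's df:

```r
library(sparklyr)
library(dplyr)

# Hypothetical local connection; replace with your own cluster settings.
sc <- spark_connect(master = "local")

# Copy the data into Spark and register it under the name "my_tbl".
# The return value is a tbl_spark that dplyr verbs can operate on.
my_tbl <- sdf_copy_to(sc, iris, name = "my_tbl", overwrite = TRUE)

# dplyr::copy_to() works equivalently:
# my_tbl <- copy_to(sc, iris, name = "my_tbl", overwrite = TRUE)

class(my_tbl)  # includes "tbl_spark"

spark_disconnect(sc)
```

Note that sdf_register() expects a reference already managed by sparklyr; a DataFrame created outside that context is why the "Table not found" analysis error appears, and copying via sdf_copy_to()/copy_to() sidesteps it.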

Regarding "r - Converting a Spark DataFrame to a SparklyR table 'tbl_spark'", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/48288775/
