
apache-spark - How to convert a Row RDD to a typed RDD


Is it possible to convert a Row RDD into a typed RDD? In the code below, can I convert the row JavaRDD back into a Counter-typed JavaRDD?

Code:

JavaRDD<Counter> rdd = sc.parallelize(counters);
Dataset<Counter> ds = sqlContext.createDataset(rdd.rdd(), encoder);

DataFrame df = ds.toDF();
df.show();

df.write().parquet(path);
DataFrame newDataDF = sqlContext.read().parquet(path);

newDataDF.toJavaRDD(); // This gives a row type rdd
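
The typed round trip being asked about can be sketched as follows. This is a hedged example, not from the question itself: it assumes `Counter` is a Java bean (no-arg constructor plus getters/setters) so that `Encoders.bean` can derive an encoder for it.

```java
// Sketch (assumption: Counter is a Java bean, so Encoders.bean works).
Encoder<Counter> counterEncoder = Encoders.bean(Counter.class);

// Read the parquet back and re-apply the type with as(encoder),
// instead of stopping at the untyped Row Dataset.
Dataset<Counter> typedDs = sqlContext.read().parquet(path).as(counterEncoder);

// Now toJavaRDD() yields JavaRDD<Counter>, not JavaRDD<Row>.
JavaRDD<Counter> typedRdd = typedDs.toJavaRDD();
```

The key step is `as(encoder)`: `read().parquet(path)` alone only knows the schema, not the JVM class, so it hands back `Row` objects until an encoder re-attaches the type.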

In Scala:

case class A(countId: Long, bytes: Array[Byte], blist: List[B])
case class B(id: String, count: Long)

val b1 = B("a", 1L)
val b2 = B("b", 2L)

val a1 = A(1L, Array(1.toByte, 2.toByte), List(b1, b2))
val rdd = sc.parallelize(List(a1))

val dataSet: Dataset[A] = sqlContext.createDataset(rdd)
val df = dataSet.toDF()

df.show
// this shows the following; the last column is the List[B], in which the string is stored as null
// |1|[01 02]| [[null,3984726108...|]

df.write.parquet(path)
val roundTripRDD = sqlContext.read.parquet(path).as[A].rdd

// throws the following error when show is run on the DataFrame
Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java',
Line 300, Column 68:
No applicable constructor/method found for actual parameters
"long, byte[], scala.collection.Seq"; candidates are:
"test.data.A(long, byte[], scala.collection.immutable.List)"


roundTripRDD.toDF.show

assertEquals(roundTripRDD, rdd)

Do I need to provide some kind of constructor for the case class?

Best Answer

Try:

sqlContext.read().parquet(path).as(encoder).rdd().toJavaRDD();
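
For the Scala `CompileException` shown in the question, note what the error message actually says: the generated deserializer wants to call a constructor taking a `scala.collection.Seq`, but case class `A` only offers one taking a `scala.collection.immutable.List`. One workaround (an assumption on our part, not part of the accepted answer) is to declare the collection field as `Seq[B]`, which matches what Spark's deserializer produces:

```scala
case class B(id: String, count: Long)
// Assumption: declaring blist as Seq[B] (rather than List[B]) matches the
// "long, byte[], scala.collection.Seq" parameter list in the error message.
case class A(countId: Long, bytes: Array[Byte], blist: Seq[B])

import sqlContext.implicits._  // derives the implicit Encoder[A]

val roundTripRDD = sqlContext.read.parquet(path).as[A].rdd
```

With the field typed as `Seq[B]`, the generated code can invoke the constructor directly and the round trip back to a typed RDD should succeed.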

Regarding apache-spark - How to convert a Row RDD to a typed RDD, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40062778/
