
apache-spark - What does the number after the RDD mean?

Reposted · Author: 行者123 · Updated: 2023-12-01 23:20:28


What does the number in brackets after the RDD mean?

Best Answer

The number after the RDD is its identifier:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.

scala> val rdd = sc.range(0, 42)
rdd: org.apache.spark.rdd.RDD[Long] = MapPartitionsRDD[1] at range at <console>:24

scala> rdd.id
res0: Int = 1

It is used to track the RDD throughout the session, for example for purposes such as caching:

scala> rdd.cache
res1: rdd.type = MapPartitionsRDD[1] at range at <console>:24

scala> rdd.count
res2: Long = 42

scala> sc.getPersistentRDDs
res3: scala.collection.Map[Int,org.apache.spark.rdd.RDD[_]] = Map(1 -> MapPartitionsRDD[1] at range at <console>:24)
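The `getPersistentRDDs` output above shows a map keyed by the RDD's id. A minimal standalone sketch of such an id-keyed registry, assuming nothing about Spark internals beyond the behavior shown (`PersistentRegistry` and its method names are illustrative, not Spark's API):

```scala
import scala.collection.concurrent.TrieMap

// Hypothetical sketch of an id-keyed registry of cached entries,
// analogous in shape to the Map[Int, RDD[_]] returned by getPersistentRDDs.
object PersistentRegistry {
  private val persistent = TrieMap.empty[Int, String]

  // Register an entry under its numeric id (as cache() does by the RDD's id).
  def persist(id: Int, desc: String): Unit = persistent.update(id, desc)

  // Drop an entry by id (as unpersist() does).
  def unpersist(id: Int): Unit = { persistent.remove(id); () }

  // Snapshot of everything currently registered, keyed by id.
  def getPersistent: Map[Int, String] = persistent.toMap
}
```

Keying the registry by the numeric id is what makes the id useful across the session: any component can refer to the same RDD by a small integer rather than by reference.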

This number is simply an incremental integer (nextRddId is just an AtomicInteger):

private[spark] def newRddId(): Int = nextRddId.getAndIncrement()

generated when the RDD is constructed:

/** A unique ID for this RDD (within its SparkContext). */
val id: Int = sc.newRddId()
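Putting the two snippets together, the scheme can be sketched in plain Scala (`ToyContext` and `ToyRDD` are hypothetical names used for illustration, not Spark classes):

```scala
import java.util.concurrent.atomic.AtomicInteger

// A shared AtomicInteger hands out unique, strictly increasing ids.
class ToyContext {
  private val nextRddId = new AtomicInteger(0)
  def newRddId(): Int = nextRddId.getAndIncrement()
}

// Each instance captures its id once, at construction time,
// so the id is unique within its context.
class ToyRDD(ctx: ToyContext) {
  val id: Int = ctx.newRddId()
}
```

In the shell transcript above the first visible RDD already has id 1, presumably because `sc.range` builds on an intermediate RDD that consumed id 0.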

So if we follow up with:

scala> val pairs1 = sc.parallelize(Seq((1, "foo")))
pairs1: org.apache.spark.rdd.RDD[(Int, String)] = ParallelCollectionRDD[2] at parallelize at <console>:24

scala> val pairs2 = sc.parallelize(Seq((1, "bar")))
pairs2: org.apache.spark.rdd.RDD[(Int, String)] = ParallelCollectionRDD[3] at parallelize at <console>:24


scala> pairs1.id
res5: Int = 2

scala> pairs2.id
res6: Int = 3

you will see 2 and 3, and if you execute

scala> pairs1.join(pairs2).foreach(_ => ())

you would expect 4, which can be confirmed by checking the UI:

(screenshot of the Spark UI omitted)

We can also see that the join creates some new RDDs behind the scenes (5 and 6).

Regarding "apache-spark - What does the number after the RDD mean?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48277531/
