
scala - Spark source code: How to understand the withScope method


I can't understand what the withScope method does (in fact, I don't really understand what the RDDOperationScope class is for).

In particular, what does (body: => T) in the parameter list of the withScope method mean:

private[spark] def withScope[T](
    sc: SparkContext,
    name: String,
    allowNesting: Boolean,
    ignoreParent: Boolean)(body: => T): T = {
  // Save the old scope to restore it later
  val scopeKey = SparkContext.RDD_SCOPE_KEY
  val noOverrideKey = SparkContext.RDD_SCOPE_NO_OVERRIDE_KEY
  val oldScopeJson = sc.getLocalProperty(scopeKey)
  val oldScope = Option(oldScopeJson).map(RDDOperationScope.fromJson)
  val oldNoOverride = sc.getLocalProperty(noOverrideKey)
  try {
    if (ignoreParent) {
      // Ignore all parent settings and scopes and start afresh with our own root scope
      sc.setLocalProperty(scopeKey, new RDDOperationScope(name).toJson)
    } else if (sc.getLocalProperty(noOverrideKey) == null) {
      // Otherwise, set the scope only if the higher level caller allows us to do so
      sc.setLocalProperty(scopeKey, new RDDOperationScope(name, oldScope).toJson)
    }
    // Optionally disallow the child body to override our scope
    if (!allowNesting) {
      sc.setLocalProperty(noOverrideKey, "true")
    }
    body
  } finally {
    // Remember to restore any state that was modified before exiting
    sc.setLocalProperty(scopeKey, oldScopeJson)
    sc.setLocalProperty(noOverrideKey, oldNoOverride)
  }
}
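
As far as I can tell, the method saves the current scope property, optionally installs a new one, evaluates the caller's block through the by-name parameter body, and then restores the old values in finally. Here is a minimal, self-contained sketch of that same save-install-run-restore shape; the object name, the "scope" key, and the plain mutable Map standing in for SparkContext's local properties are all invented for illustration, not Spark code:

object WithScopeSketch {
  // A plain mutable Map stands in for SparkContext's thread-local properties.
  private val props = scala.collection.mutable.Map[String, String]()

  // Same shape as above: a first parameter list with ordinary arguments and a
  // second list with a single by-name parameter `body: => T`. The caller's block
  // is NOT evaluated at the call site; it runs only where `body` is referenced
  // below, between "install" and "restore".
  def withScope[T](name: String)(body: => T): T = {
    val oldScope = props.get("scope")   // save the old scope
    props("scope") = name               // install the new scope
    try {
      body                              // run the caller's block here
    } finally {
      oldScope match {                  // restore whatever was there before
        case Some(v) => props("scope") = v
        case None    => props.remove("scope")
      }
    }
  }

  def main(args: Array[String]): Unit = {
    val n = withScope("map") {
      println("current scope: " + props("scope"))
      42
    }
    println(n) // prints: current scope: map, then 42
  }
}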

You can find the source code at the following link:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala

Can anyone help me? Thanks, I've been stuck on this for a long time.

Best Answer

The following code may help you:

object TestWithScope {
  def withScope(func: => String) = {
    println("withscope")
    func
  }

  def bar(foo: String) = withScope {
    println("Bar: " + foo)
    "BBBB"
  }

  def main(args: Array[String]): Unit = {
    println(bar("AAAA"))
  }
}

Possible output:

withscope
Bar: AAAA
BBBB
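
To connect this back to the signature in the question: func: => String here, like body: => T in Spark, is a by-name parameter. The block in braces at the call site is not evaluated when the method is called; it is evaluated each time the parameter is referenced inside the method body, and not at all if it is never referenced. A small self-contained demo of that behavior (all names here are invented for illustration):

object ByNameDemo {
  def once[T](body: => T): T = body               // evaluates the block exactly once
  def twice[T](body: => T): (T, T) = (body, body) // evaluates the block two times
  def never[T](body: => T): Unit = ()             // never evaluates the block

  def main(args: Array[String]): Unit = {
    once  { println("once");  1 } // prints "once"
    twice { println("twice"); 2 } // prints "twice" twice
    never { println("never"); 3 } // prints nothing
  }
}

In the Spark method, body is referenced exactly once inside the try block, so the caller's block runs after the scope properties have been set and before they are restored in finally.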

Regarding scala - Spark source code: How to understand the withScope method, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37691391/
