
apache-spark-sql - Spark 2.4.0 gives a "Detected implicit cartesian product" exception for a left join where the right DataFrame is empty


It appears that between Spark 2.2.1 and Spark 2.4.0, the behavior of a left join with an empty right DataFrame changed from succeeding to throwing "AnalysisException: Detected implicit cartesian product for LEFT OUTER join between logical plans".

For example:

val emptyDf = spark.emptyDataFrame
  .withColumn("id", lit(0L))
  .withColumn("brand", lit(""))
val nonemptyDf = ((1L, "a") :: Nil).toDF("id", "size")
val neje = nonemptyDf.join(emptyDf, Seq("id"), "left")
neje.show()

In 2.2.1, the result is:
+---+----+-----+
| id|size|brand|
+---+----+-----+
|  1|   a| null|
+---+----+-----+

However, in 2.4.0, I get the following exception:
org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for LEFT OUTER join between logical plans
LocalRelation [id#278L, size#279]
and
Project [ AS brand#55]
+- LogicalRDD false
Join condition is missing or trivial.
Either: use the CROSS JOIN syntax to allow cartesian products between these
relations, or: enable implicit cartesian products by setting the configuration
variable spark.sql.crossJoin.enabled=true;

Here is the full explain plan for the latter:

> neje.explain(true)

== Parsed Logical Plan ==
'Join UsingJoin(LeftOuter,List(id))
:- Project [_1#275L AS id#278L, _2#276 AS size#279]
:  +- LocalRelation [_1#275L, _2#276]
+- Project [id#53L, AS brand#55]
   +- Project [0 AS id#53L]
      +- LogicalRDD false

== Analyzed Logical Plan ==
id: bigint, size: string, brand: string
Project [id#278L, size#279, brand#55]
+- Join LeftOuter, (id#278L = id#53L)
   :- Project [_1#275L AS id#278L, _2#276 AS size#279]
   :  +- LocalRelation [_1#275L, _2#276]
   +- Project [id#53L, AS brand#55]
      +- Project [0 AS id#53L]
         +- LogicalRDD false

== Optimized Logical Plan ==
org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for LEFT OUTER join between logical plans
LocalRelation [id#278L, size#279]
and
Project [ AS brand#55]
+- LogicalRDD false
Join condition is missing or trivial.
Either: use the CROSS JOIN syntax to allow cartesian products between these
relations, or: enable implicit cartesian products by setting the configuration
variable spark.sql.crossJoin.enabled=true;
== Physical Plan ==
org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for LEFT OUTER join between logical plans
LocalRelation [id#278L, size#279]
and
Project [ AS brand#55]
+- LogicalRDD false
Join condition is missing or trivial.
Either: use the CROSS JOIN syntax to allow cartesian products between these
relations, or: enable implicit cartesian products by setting the configuration
variable spark.sql.crossJoin.enabled=true;

Additional observations:
  • If only the left DataFrame is empty, the join succeeds.
  • A similar behavior change applies to a right join with an empty left
    DataFrame.
  • Interestingly, though, both versions fail with an AnalysisException for an
    inner join if both DataFrames are empty.

  • Is this a regression or by design? The earlier behavior seems more correct to me. I could not find anything relevant in the Spark release notes, Spark JIRA issues, or Stack Overflow questions. (One hypothetical workaround is sketched below.)
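
The exception text says the join condition is "missing or trivial", which suggests the constant lit(0L) id column on the empty side gets folded away by the optimizer before the join is planned. Under that assumption (not verified against the 2.4.0 source), one workaround sketch is to build the empty right-hand DataFrame from an explicit schema rather than constant-literal columns, so the join key remains a real, non-foldable column. Column names mirror the example above; whether this actually avoids the check on 2.4.0 is untested here:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}
import spark.implicits._  // assumes an existing SparkSession named `spark`, as in the shell

// Empty right-hand DataFrame declared via a schema instead of lit(...) columns,
// so "id" is a genuine column rather than a foldable constant.
val emptySchema = StructType(Seq(
  StructField("id", LongType, nullable = false),
  StructField("brand", StringType, nullable = true)
))
val emptyDf = spark.createDataFrame(spark.sparkContext.emptyRDD[Row], emptySchema)

val nonemptyDf = ((1L, "a") :: Nil).toDF("id", "size")
// If the constant join key is indeed the trigger, this join keeps a real
// equi-join condition and should not be flagged as a cartesian product.
nonemptyDf.join(emptyDf, Seq("id"), "left").show()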

    Best Answer

    I didn't hit your exact problem, but I at least got the same error, and I fixed it by explicitly allowing cross joins:

    spark.conf.set("spark.sql.crossJoin.enabled", "true")
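
    If you prefer to set the flag once when the session is created instead of at runtime, the same configuration key can be passed through the builder. A minimal sketch, with a placeholder application name:

    import org.apache.spark.sql.SparkSession

    // Same spark.sql.crossJoin.enabled flag, applied at session construction time.
    // The application name is a placeholder.
    val spark = SparkSession.builder()
      .appName("left-join-with-empty-df")
      .config("spark.sql.crossJoin.enabled", "true")
      .getOrCreate()

    Note that enabling the flag silences the cartesian-product check globally for the session, so it is worth confirming that the trivial join condition is really expected (here, a consequence of the constant columns on the empty side).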

    A similar question was found on Stack Overflow: https://stackoverflow.com/questions/56330095/
