
apache-spark - Why does the query fail with "AnalysisException: Expected only partition pruning predicates"?


When I run the following query in the Spark shell, I get a partitioning error:

Expected only partition pruning predicates: ((((isnotnull(tenant_suite#478) && isnotnull(DS#477)) && (DS#477 >= 2017-06-01)) && (DS#477 <= 2017-06-25)) && (tenant_suite#478 = SAMS_CORESITE))



I am not sure why Spark is throwing this error. Can anyone help me resolve it?
SELECT
    A.*
FROM
    ( ----------- SUBQUERY 1
        SELECT *
        FROM
            T2 -- PARTITION COLUMNS ARE DS AND TENANT_SUITE
        WHERE
            DS BETWEEN '2017-06-01' AND '2017-06-25' --date_sub(TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP())),1)
            AND tenant_suite = 'CORESITE'
    ) A

JOIN

    ( ----------- SUBQUERY 2
        SELECT
            concat(concat(visid_high,'-',visid_low),'-',visit_num) AS VISIT_ID
            ,concat(visid_high,'-',visid_low) AS VISITOR_ID
            ,MAX(DS) AS EVENT_DT
        FROM
            T2 -- PARTITION COLUMNS ARE DS AND TENANT_SUITE
        WHERE
            tenant_suite = 'CORESITE'
            AND DS BETWEEN '2017-06-01' AND '2017-06-25' --date_sub(TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP())),1)
        GROUP BY concat(concat(visid_high,'-',visid_low),'-',visit_num), concat(visid_high,'-',visid_low)
    ) B
ON A.VISIT_ID = B.VISIT_ID
    AND A.VISITOR_ID = B.VISITOR_ID
    AND A.VISIT_DT = B.EVENT_DT
GROUP BY A.VISIT_DT;

Best Answer

Partition columns are case-sensitive in Spark; see the related Spark JIRA issue:
Spark Jira Issue
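
As an illustration of the answer, here is a minimal sketch that assumes T2 was registered in the Hive metastore with lowercase partition column names (ds, tenant_suite); spelling them in the pruning predicates with exactly the case stored in the metastore is what the case-sensitivity fix amounts to:

SELECT *
FROM T2 -- partition columns assumed to be registered as ds, tenant_suite
WHERE ds BETWEEN '2017-06-01' AND '2017-06-25'
  AND tenant_suite = 'CORESITE';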

Regarding apache-spark - Why does the query fail with "AnalysisException: Expected only partition pruning predicates"?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45024039/
