
apache-spark - How to use a dynamic value in an INTERVAL in a Spark SQL query

Reposted. Author: 行者123. Updated: 2023-12-04 13:57:54

A Spark SQL query that works:

SELECT current_timestamp() - INTERVAL 10 DAYS as diff from sample_table
The Spark SQL I tried (does not work):
SELECT current_timestamp() - INTERVAL col1 DAYS as diff from sample_table
The error returned by the above query:
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 767, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 73, in deco
    raise ParseException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.ParseException: "
mismatched input 'DAYS' expecting

== SQL ==
SELECT current_timestamp() - INTERVAL col1 DAYS as diff from sample_table
------------------------------------------^^^
"
I want to use col1 as a dynamic interval value. How can I achieve this?

Best answer

The Spark SQL function make_interval achieves this:

SELECT current_timestamp() - make_interval(0, 0, 0, col1, 0, 0, 0) as diff from sample_table
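For context, here is a minimal PySpark sketch of how the accepted query might be run end to end. It assumes Spark 3.0 or later (where make_interval is available) and uses a hypothetical in-memory stand-in for sample_table with an integer col1; the table, column, and app names follow the question and are not from a real dataset.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dynamic-interval").getOrCreate()

# Hypothetical stand-in for sample_table; col1 holds the number of days to subtract.
spark.createDataFrame([(3,), (10,)], "col1 INT").createOrReplaceTempView("sample_table")

# make_interval(years, months, weeks, days, hours, mins, secs):
# passing col1 as the fourth argument builds a per-row interval of col1 days.
diff_df = spark.sql("""
    SELECT col1,
           current_timestamp() - make_interval(0, 0, 0, col1, 0, 0, 0) AS diff
    FROM sample_table
""")
diff_df.show(truncate=False)

Because make_interval takes years, months, weeks, days, hours, minutes, and seconds as separate arguments, the same pattern works for other units: put col1 in the position of the unit you want to make dynamic and leave the rest as 0.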

Regarding apache-spark - how to use a dynamic value in an INTERVAL in a Spark SQL query, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58074912/
