
scala - SparkSQL Timestamp query failure


I am loading some log files into a SQL table via Spark, and my schema looks like this:

|-- timestamp: timestamp (nullable = true) 
|-- c_ip: string (nullable = true)
|-- cs_username: string (nullable = true)
|-- s_ip: string (nullable = true)
|-- s_port: string (nullable = true)
|-- cs_method: string (nullable = true)
|-- cs_uri_stem: string (nullable = true)
|-- cs_query: string (nullable = true)
|-- sc_status: integer (nullable = false)
|-- sc_bytes: integer (nullable = false)
|-- cs_bytes: integer (nullable = false)
|-- time_taken: integer (nullable = false)
|-- User_Agent: string (nullable = true)
|-- Referrer: string (nullable = true)
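
For context, a minimal sketch of how such a table might be registered with the Spark 1.x API used in this question; the file path, delimiter, timestamp pattern, and trimmed-down column list are all assumptions:

import java.sql.Timestamp
import java.text.SimpleDateFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Trimmed-down record type; the real schema has all the columns listed above.
case class Log(timestamp: Timestamp, c_ip: String, sc_status: Int)

val sc = new SparkContext(new SparkConf().setAppName("logs").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion in Spark 1.x

val logs = sc.textFile("logs.tsv").map(_.split("\t")).map { f =>
  // SimpleDateFormat is not thread-safe, so build one per record inside the closure
  Log(new Timestamp(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.S").parse(f(0)).getTime),
      f(1), f(2).toInt)
}
logs.registerTempTable("Logs")  // now queryable as "Logs" via sqlContext.sql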

As you can see, I created a timestamp field, which Spark supports reading (as far as I know, Date does not work properly). I would love to use queries like "where timestamp > (2012-10-08 16:10:36.0)", but when I run them I keep getting errors.
I tried the two syntax forms below.
For the second one I parse a string first, so I am sure I am actually passing the value in timestamp format.
I use two functions: parse and date2timestamp (sketched below).
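
For reference, those two helpers presumably look roughly like this; the exact format pattern is an assumption:

import java.sql.Timestamp
import java.text.SimpleDateFormat
import java.util.Date

// Hypothetical reconstruction of the helpers named above.
val formatTime3 = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.S")

def date2timestamp(date: Date): Timestamp = new Timestamp(date.getTime)

// parse returns java.util.Date; date2timestamp converts it to java.sql.Timestamp
val ts = date2timestamp(formatTime3.parse("2012-10-08 16:10:36.0"))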

关于如何处理时间戳值的任何提示?

谢谢!

1)
scala> sqlContext.sql("SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0)").collect
java.lang.RuntimeException: [1.55] failure: ``)'' expected but 16 found 

SELECT * FROM Logs as l where l.timestamp=(2012-10-08 16:10:36.0)
^

2)
sqlContext.sql("SELECT * FROM Logs as l where l.timestamp=" + date2timestamp(formatTime3.parse("2012-10-08 16:10:36.0"))).collect
java.lang.RuntimeException: [1.54] failure: ``UNION'' expected but 16 found 

SELECT * FROM Logs as l where l.timestamp=2012-10-08 16:10:36.0
^

Best Answer

I think the problem was, first, the precision of the timestamp, and also that the value I pass to represent the timestamp has to be compared as a string.

So this query now works:

sqlContext.sql("SELECT * FROM Logs as l where cast(l.timestampLog as String) <= '2012-10-08 16:10:36'")

Regarding scala - SparkSQL Timestamp query failure, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27069537/
