
apache-spark - Can unix_timestamp() return unix time in milliseconds in Apache Spark?

Reposted. Author: 行者123. Updated: 2023-12-03 20:54:53

I am trying to get unix time in milliseconds (13 digits) from a timestamp field, but it currently returns seconds (10 digits).

scala> var df = Seq("2017-01-18 11:00:00.000", "2017-01-18 11:00:00.123", "2017-01-18 11:00:00.882", "2017-01-18 11:00:02.432").toDF()
df: org.apache.spark.sql.DataFrame = [value: string]

scala> df = df.selectExpr("value timeString", "cast(value as timestamp) time")
df: org.apache.spark.sql.DataFrame = [timeString: string, time: timestamp]


scala> df = df.withColumn("unix_time", unix_timestamp(df("time")))
df: org.apache.spark.sql.DataFrame = [timeString: string, time: timestamp ... 1 more field]

scala> df.take(4)
res63: Array[org.apache.spark.sql.Row] = Array(
[2017-01-18 11:00:00.000,2017-01-18 11:00:00.0,1484758800],
[2017-01-18 11:00:00.123,2017-01-18 11:00:00.123,1484758800],
[2017-01-18 11:00:00.882,2017-01-18 11:00:00.882,1484758800],
[2017-01-18 11:00:02.432,2017-01-18 11:00:02.432,1484758802])
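The collapse to a single value is just second-level truncation: the fractional part is dropped, so all the `11:00:00.*` strings map to the same epoch second. A minimal sketch of that arithmetic in plain Scala, independent of Spark (the epoch values are taken from the output above):

```scala
// Epoch milliseconds for 2017-01-18 11:00:00.000 and 11:00:00.123
// in the session timezone, as shown in the DataFrame output above.
val millisA = 1484758800000L // 11:00:00.000
val millisB = 1484758800123L // 11:00:00.123

// unix_timestamp() works at second precision, so integer division
// by 1000 discards the millisecond part - both values collapse.
println(millisA / 1000) // 1484758800
println(millisB / 1000) // 1484758800
```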


Even though 2017-01-18 11:00:00.123 and 2017-01-18 11:00:00.000 are different, I get the same unix time back: 1484758800.

What am I missing?

Best Answer

unix_timestamp() returns the unix timestamp in seconds.

The last 3 digits of the timestamp string are the milliseconds (1.999 sec = 1999 milliseconds), so simply take the last 3 digits of the timestamp string and append them to the end of the seconds value.
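That append step can be sketched in plain Scala (the helper name `unixMillis` is mine, not from the answer; the seconds values come from the unix_timestamp() output above, and the input strings are assumed to always carry a 3-digit fractional part as in the question):

```scala
// Combine the seconds returned by unix_timestamp() with the last
// 3 characters of the original timestamp string to get milliseconds.
def unixMillis(unixSeconds: Long, timeString: String): Long = {
  // "2017-01-18 11:00:00.123".takeRight(3) == "123"
  val millisPart = timeString.takeRight(3)
  unixSeconds * 1000L + millisPart.toLong
}

println(unixMillis(1484758800L, "2017-01-18 11:00:00.123")) // 1484758800123
println(unixMillis(1484758802L, "2017-01-18 11:00:02.432")) // 1484758802432
```

In the DataFrame itself the same idea can be expressed with string functions (e.g. concat of the unix_timestamp column and a substring of the string column), but the arithmetic above is the core of the trick.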

On the question "Can unix_timestamp() return unix time in milliseconds in Apache Spark?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/42237938/
