
apache-spark - spark sql string to timestamp missing milliseconds

Reposted. Author: 行者123. Updated: 2023-12-02 14:47:51

Why does this:

import spark.implicits._
import org.apache.spark.sql.functions._

val content = Seq(("2019", "09", "11", "17", "16", "54", "762000000"))
  .toDF("year", "month", "day", "hour", "minute", "second", "nano")
content.printSchema
content.show
content
  .withColumn("event_time_utc", to_timestamp(concat('year, 'month, 'day, 'hour, 'minute, 'second), "yyyyMMddHHmmss"))
  .withColumn("event_time_utc_millis", to_timestamp(concat('year, 'month, 'day, 'hour, 'minute, 'second, substring('nano, 0, 3)), "yyyyMMddHHmmssSSS"))
  .select('year, 'month, 'day, 'hour, 'minute, 'second, 'nano, substring('nano, 0, 3), 'event_time_utc, 'event_time_utc_millis)
  .show

produce output with the milliseconds missing?

+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+
|year|month|day|hour|minute|second| nano|substring(nano, 0, 3)| event_time_utc|event_time_utc_millis|
+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+
|2019| 09| 11| 17| 16| 54|762000000| 762|2019-09-11 17:16:54| 2019-09-11 17:16:54|
+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+

With the format string yyyyMMddHHmmssSSS, unless I'm mistaken, the SSS part should capture the milliseconds.

Best Answer

I ran into a similar problem. For Spark < 2.4, the official documentation says the following:

Convert time string with given pattern (see [http://docs.oracle.com/javase/tutorial/i18n/format/simpleDateFormat.html]) to Unix timestamp (in seconds), return null if fail.

This means it only works at second precision.
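To see where the precision goes, here is a minimal plain-Scala sketch (no Spark needed). The `SimpleDateFormat` parser itself handles the SSS pattern fine; the `truncated` line only mimics what second-granularity handling does to the value, it is not Spark's actual code:

```scala
import java.text.SimpleDateFormat

// The JVM-level parser keeps the millisecond component in the epoch value.
val format = new SimpleDateFormat("yyyyMMddHHmmssSSS")
val millis = format.parse("20190911171654762").getTime

// Working in whole seconds, as a seconds-based Unix timestamp does,
// is equivalent to this truncation:
val truncated = (millis / 1000) * 1000

println(millis % 1000)     // 762: millis preserved by SimpleDateFormat
println(truncated % 1000)  // 0: millis lost at second granularity
```

So the data is not lost by the pattern; it is lost by the seconds-based conversion path.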

Spark >= 2.4 can handle SSS as well.

Solution: a UDF can handle this case:

import java.text.SimpleDateFormat
import java.sql.Timestamp
import org.apache.spark.sql.functions._
import scala.util.{Try, Success, Failure}

val getTimestampWithMilis: ((String, String) => Option[Timestamp]) = (input, frmt) => input match {
  case "" => None
  case _ =>
    val format = new SimpleDateFormat(frmt)
    Try(new Timestamp(format.parse(input).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
}

val getTimestampWithMilisUDF = udf(getTimestampWithMilis)
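Since the function value only uses `java.text` and `java.sql`, you can sanity-check the parsing logic outside Spark before wrapping it in a UDF. A standalone sketch (the val names here are illustrative):

```scala
import java.sql.Timestamp
import java.text.SimpleDateFormat
import scala.util.{Failure, Success, Try}

// Same parsing logic as the UDF body, exercised as a plain Scala function.
val parseWithMillis: (String, String) => Option[Timestamp] = (input, frmt) => input match {
  case "" => None
  case _ =>
    Try(new Timestamp(new SimpleDateFormat(frmt).parse(input).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
}

val ok    = parseWithMillis("20190911171654762", "yyyyMMddHHmmssSSS")
val bad   = parseWithMillis("not-a-date", "yyyyMMddHHmmssSSS")
val blank = parseWithMillis("", "yyyyMMddHHmmssSSS")

println(ok.map(_.toString))  // Some(2019-09-11 17:16:54.762)
println(bad)                 // None
println(blank)               // None
```

Malformed input comes back as `None` rather than throwing, which is what makes the UDF safe to apply across a column.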

For your example:

val content = Seq(("2019", "09", "11","17","16","54","762000000")).toDF("year", "month", "day", "hour", "minute", "second", "nano")
val df = content.withColumn("event_time_utc", concat('year, 'month, 'day, 'hour, 'minute, 'second, substring('nano, 0, 3)))
df.show
+----+-----+---+----+------+------+---------+-----------------+
|year|month|day|hour|minute|second| nano| event_time_utc|
+----+-----+---+----+------+------+---------+-----------------+
|2019| 09| 11| 17| 16| 54|762000000|20190911171654762|
+----+-----+---+----+------+------+---------+-----------------+

df.withColumn("event_time_utc_millis", getTimestampWithMilisUDF($"event_time_utc", lit("yyyyMMddHHmmssSSS"))).show(1, false)
+----+-----+---+----+------+------+---------+-----------------+-----------------------+
|year|month|day|hour|minute|second|nano |event_time_utc |event_time_utc_millis |
+----+-----+---+----+------+------+---------+-----------------+-----------------------+
|2019|09 |11 |17 |16 |54 |762000000|20190911171654762|2019-09-11 17:16:54.762|
+----+-----+---+----+------+------+---------+-----------------+-----------------------+

The resulting schema:

root
|-- year: string (nullable = true)
|-- month: string (nullable = true)
|-- day: string (nullable = true)
|-- hour: string (nullable = true)
|-- minute: string (nullable = true)
|-- second: string (nullable = true)
|-- nano: string (nullable = true)
|-- event_time_utc: string (nullable = true)
|-- event_time_utc_millis: timestamp (nullable = true)

Regarding "apache-spark - spark sql string to timestamp missing milliseconds", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/57927341/
