
scala - How to convert a string column with milliseconds to a timestamp with milliseconds in Spark 2.1 using Scala?


I am using Spark 2.1 with Scala.

How can I convert a string column containing milliseconds into a timestamp that keeps the milliseconds?

I tried the following code from the question Better way to convert a string field into timestamp in Spark:

import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._ // for $ and toDF (imported automatically in spark-shell)

val tdf = Seq((1L, "05/26/2016 01:01:01.601"), (2L, "#$@#@#")).toDF("id", "dts")
// parse the string to seconds since the epoch, then cast to timestamp
val tts = unix_timestamp($"dts", "MM/dd/yyyy HH:mm:ss.SSS").cast("timestamp")
tdf.withColumn("ts", tts).show(2, false)

But the result I get has no milliseconds:
+---+-----------------------+---------------------+
|id |dts                    |ts                   |
+---+-----------------------+---------------------+
|1  |05/26/2016 01:01:01.601|2016-05-26 01:01:01.0|
|2  |#$@#@#                 |null                 |
+---+-----------------------+---------------------+
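Presumably the milliseconds are lost because unix_timestamp only resolves to whole seconds, so the fractional part is already gone before the cast. A quick sanity check (my sketch; the seconds column name is just for illustration):

// unix_timestamp yields a long count of whole seconds since the epoch,
// so the ".601" is discarded before the cast to timestamp ever happens
tdf.select(unix_timestamp($"dts", "MM/dd/yyyy HH:mm:ss.SSS").as("seconds")).show(2, false)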

Best Answer

Since unix_timestamp is limited to second precision, a UDF with SimpleDateFormat works instead. The idea comes from the link to the UDF logic suggested by Ram Ghadiyaram.

import java.text.SimpleDateFormat
import java.sql.Timestamp
import org.apache.spark.sql.functions.udf
import scala.util.{Try, Success, Failure}

val getTimestamp: (String => Option[Timestamp]) = s => s match {
  case "" => None
  case _ => {
    // SimpleDateFormat is not thread-safe, so create one per call;
    // the quoted ' ' is simply a literal space in the pattern
    val format = new SimpleDateFormat("MM/dd/yyyy' 'HH:mm:ss.SSS")
    // returning None on a parse failure becomes null in the DataFrame column
    Try(new Timestamp(format.parse(s).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
  }
}

val getTimestampUDF = udf(getTimestamp)
val tdf = Seq((1L, "05/26/2016 01:01:01.601"), (2L, "#$@#@#")).toDF("id", "dts")
val tts = getTimestampUDF($"dts")
tdf.withColumn("ts", tts).show(2, false)

Output:
+---+-----------------------+-----------------------+
|id |dts                    |ts                     |
+---+-----------------------+-----------------------+
|1  |05/26/2016 01:01:01.601|2016-05-26 01:01:01.601|
|2  |#$@#@#                 |null                   |
+---+-----------------------+-----------------------+
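If you would rather stay with built-in functions instead of a UDF, the following sketch should also work: take the whole seconds from unix_timestamp, add the fractional part parsed out of the string, and cast the resulting double to a timestamp. This assumes the milliseconds always follow the last "." in the input, and ttsNoUdf is just an illustrative name:

import org.apache.spark.sql.functions.{unix_timestamp, substring_index}

// whole seconds plus the ".SSS" part divided by 1000; casting a double to
// timestamp interprets it as (fractional) seconds since the epoch.
// For malformed rows unix_timestamp returns null, so the sum stays null too.
val ttsNoUdf = (unix_timestamp($"dts", "MM/dd/yyyy HH:mm:ss.SSS").cast("double") +
  substring_index($"dts", ".", -1).cast("double") / 1000).cast("timestamp")

tdf.withColumn("ts", ttsNoUdf).show(2, false)

This keeps the whole expression visible to the query optimizer rather than hiding the parsing inside an opaque UDF.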

Regarding scala - How to convert a string column with milliseconds to a timestamp with milliseconds in Spark 2.1 using Scala?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44886772/
