python - Porting Python to Scala

I am trying to port this Python code (from spark sql distance to nearest holiday)

last_holiday = index.value[0]
for next_holiday in index.value:
    if next_holiday >= date:
        break
    last_holiday = next_holiday
if last_holiday > date:
    last_holiday = None
if next_holiday < date:
    next_holiday = None

to Scala. I do not have much Scala experience (yet), but break does not seem to be a clean/Scala way to do this. Please show me how to port it to Scala cleanly. My current attempt:

breakable {
  for (next_holiday <- indexAT.value) {
    val next = next_holiday.toLocalDate
    println("next ", next)
    println("last ", last_holiday)

    if (next.isAfter(current) || next.equals(current)) break
    // check do I actually get here?
    last_holiday = Option(next)
  } // TODO this is so not scala and ugly ...
  if (last_holiday.isDefined) {
    if (last_holiday.get.isAfter(current)) {
      last_holiday = None
    }
  }
  if (last_holiday.isDefined) {
    if (last_holiday.get.isBefore(current)) {
      // TODO use one more var because out of scope
      next = None
    }
  }
}

The same code with more context is in https://gist.github.com/geoHeil/ff513b97a2b3e16241fdd9c8b0f3bdfb . Also, I am not sure how "big" the breakable block should be, but I would like to get rid of it entirely in a Scala-native port of the code.
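(Editorial aside, not from the original question or gist: a minimal break-free sketch of the same lookup. Assuming the holiday list is sorted and holds java.time.LocalDate values, span splits it at the query date, and the two halves give the previous and next holiday directly. The nearestHolidays helper name is hypothetical.)

import java.time.LocalDate

// Everything strictly before the query date goes left, the rest goes right.
def nearestHolidays(holidays: Seq[LocalDate], date: LocalDate): (Option[LocalDate], Option[LocalDate]) = {
  val (before, onOrAfter) = holidays.span(_.isBefore(date))
  (before.lastOption, onOrAfter.headOption)
}

// Example query, returns (Some(2016-11-24), Some(2016-12-25))
val holidays = Seq("2016-11-24", "2016-12-25", "2016-12-31").map(LocalDate.parse)
nearestHolidays(holidays, LocalDate.parse("2016-12-01"))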

Best Answer

So this is not a direct port, but I think it is closer to idiomatic Scala. I would treat the holiday list as a list of consecutive pairs and then figure out which pair the input date falls between.
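The same pairing idea on a plain Scala collection, as a minimal sketch (it uses java.time.LocalDate and local values for brevity; the full example below uses java.sql.Date, broadcasts the pairs, and wraps the lookup in a UDF):

import java.time.LocalDate

val holidays = Seq("2016-11-24", "2016-12-25", "2016-12-31").map(LocalDate.parse)
val pairs = holidays.zip(holidays.tail) // consecutive (lower, upper) holiday pairs

def bracket(d: LocalDate): (Option[LocalDate], Option[LocalDate]) =
  pairs
    .collectFirst { case (lo, hi) if !d.isBefore(lo) && !d.isAfter(hi) => (Some(lo), Some(hi)) }
    .getOrElse(
      if (!d.isAfter(holidays.head)) (None, Some(holidays.head)) // before the first holiday
      else (Some(holidays.last), None)                           // after the last holiday
    )

bracket(LocalDate.parse("2016-12-01")) // (Some(2016-11-24), Some(2016-12-25))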

Here is a complete example:

scala> import java.sql.Date
import java.sql.Date

scala> import java.text.SimpleDateFormat
import java.text.SimpleDateFormat

scala> :pa
// Entering paste mode (ctrl-D to finish)
def parseDate(in: String): java.sql.Date = {
  val formatter = new SimpleDateFormat("MM/dd/yyyy")
  val d = formatter.parse(in)
  new java.sql.Date(d.getTime())
}
// Exiting paste mode, now interpreting.
parseDate: (in: String)java.sql.Date

scala> val holidays = Seq("11/24/2016", "12/25/2016", "12/31/2016").map(parseDate)
holidays: Seq[java.sql.Date] = List(2016-11-24, 2016-12-25, 2016-12-31)

scala> val hP = sc.broadcast(holidays.zip(holidays.tail))
hP: org.apache.spark.broadcast.Broadcast[Seq[(java.sql.Date, java.sql.Date)]] = Broadcast(4)

scala> def geq(d1: Date, d2: Date) = d1.after(d2) || d1.equals(d2)
geq: (d1: java.sql.Date, d2: java.sql.Date)Boolean

scala> def leq(d1: Date, d2: Date) = d1.before(d2) || d1.equals(d2)
leq: (d1: java.sql.Date, d2: java.sql.Date)Boolean

scala> :pa
// Entering paste mode (ctrl-D to finish)
val findNearestHolliday = udf((inDate: Date) => {
  val hP_l = hP.value
  val dates = hP_l.collectFirst { case (d1, d2) if geq(inDate, d1) && leq(inDate, d2) => (Some(d1), Some(d2)) }
  dates.getOrElse(if (leq(inDate, hP_l.head._1)) (None, Some(hP_l.head._1)) else (Some(hP_l.last._2), None))
})
// Exiting paste mode, now interpreting.
findNearestHolliday: org.apache.spark.sql.UserDefinedFunction = UserDefinedFunction(<function1>,StructType(StructField(_1,DateType,true), StructField(_2,DateType,true)),List(DateType))

scala> val df = Seq((1, parseDate("11/01/2016")), (2, parseDate("12/01/2016")), (3, parseDate("01/01/2017"))).toDF("id", "date")
df: org.apache.spark.sql.DataFrame = [id: int, date: date]

scala> val df2 = df.withColumn("nearestHollidays", findNearestHolliday($"date"))
df2: org.apache.spark.sql.DataFrame = [id: int, date: date, nearestHollidays: struct<_1:date,_2:date>]

scala> df2.show
+---+----------+--------------------+
| id| date| nearestHollidays|
+---+----------+--------------------+
| 1|2016-11-01| [null,2016-11-24]|
| 2|2016-12-01|[2016-11-24,2016-...|
| 3|2017-01-01| [2016-12-31,null]|
+---+----------+--------------------+

scala> df2.foreach{println}
[3,2017-01-01,[2016-12-31,null]]
[1,2016-11-01,[null,2016-11-24]]
[2,2016-12-01,[2016-11-24,2016-12-25]]
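If the two bracketing dates are wanted as plain columns rather than a struct, the struct fields can be selected out by name. A small follow-up sketch, continuing the same spark-shell session (the prev_holiday/next_holiday column names are my own, not part of the answer):

// Flatten the struct returned by the UDF into two named date columns.
val df3 = df2.select(
  $"id", $"date",
  $"nearestHollidays._1".as("prev_holiday"),
  $"nearestHollidays._2".as("next_holiday"))
df3.show()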

Regarding python - porting Python to Scala, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40767743/
