
scala - How to make these function definitions D.R.Y.?


I have several function definitions that write Parquet files, but I would rather have a single generic one. How do I get one generic function to work? The problem is that the generic version fails with the error:

could not find implicit value for parameter writerFactory: com.github.mjakubowski84.parquet4s.ParquetWriter.ParquetWriterFactory[Product]

The multiple functions:

import com.github.mjakubowski84.parquet4s.ParquetWriter
import org.apache.parquet.hadoop.metadata.CompressionCodecName
import org.apache.parquet.hadoop.ParquetFileWriter
import java.time.format.DateTimeFormatter
import java.time.LocalDate

def writeParquetUnsegmented(rows: List[ACaseClass], date: String) = {
  val outPath = s"s3a://redacted/path-$date.parquet"
  ParquetWriter.writeAndClose(
    outPath,
    rows,
    ParquetWriter.Options(
      writeMode = ParquetFileWriter.Mode.OVERWRITE,
      compressionCodecName = CompressionCodecName.SNAPPY
    )
  )
}

def writeParquetSegmented(rows: List[BCaseClass], date: String) = {
  val outPath = s"s3a://redacted/path-$date.parquet"
  ParquetWriter.writeAndClose(
    outPath,
    rows,
    ParquetWriter.Options(
      writeMode = ParquetFileWriter.Mode.OVERWRITE,
      compressionCodecName = CompressionCodecName.SNAPPY
    )
  )
}
The single generic function:
def writeParquetSegmented(rows: List[Product], date: String) = {
  val outPath = s"s3a://redacted/path-$date.parquet"
  ParquetWriter.writeAndClose(
    outPath,
    rows,
    ParquetWriter.Options(
      writeMode = ParquetFileWriter.Mode.OVERWRITE,
      compressionCodecName = CompressionCodecName.SNAPPY
    )
  )
}
I also tried different function signatures and got the same error:
def writeParquetSegmented[A](rows: List[A], date: String)
def writeParquetSegmented[A <: RowParent](rows: List[A], date: String)

Best Answer

I would try this signature:

def writeParquetSegmented[A : ParquetWriterFactory](rows: List[A], date: String)
The compiler desugars the context bound [A : ParquetWriterFactory] into exactly the implicit parameter that the error message is hinting at:

def writeParquetSegmented[A](rows: List[A], date: String)(implicit pwf: ParquetWriterFactory[A])
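Putting the pieces together, a minimal sketch of the single generic writer, reusing the body from the question unchanged. The function name writeParquet and the call-site values aRows/bRows are placeholders, and the import path for ParquetWriterFactory is taken from the error message; it assumes parquet4s derives a writer factory for each concrete case class:

import com.github.mjakubowski84.parquet4s.ParquetWriter
import com.github.mjakubowski84.parquet4s.ParquetWriter.ParquetWriterFactory
import org.apache.parquet.hadoop.metadata.CompressionCodecName
import org.apache.parquet.hadoop.ParquetFileWriter

// One generic function: the context bound asks the compiler for a
// ParquetWriterFactory[A] at each call site, i.e. the implicit value the
// error message said was missing.
def writeParquet[A: ParquetWriterFactory](rows: List[A], date: String): Unit = {
  val outPath = s"s3a://redacted/path-$date.parquet"
  ParquetWriter.writeAndClose(
    outPath,
    rows,
    ParquetWriter.Options(
      writeMode = ParquetFileWriter.Mode.OVERWRITE,
      compressionCodecName = CompressionCodecName.SNAPPY
    )
  )
}

// At each call site A is a concrete case class, so the factory can be derived:
// writeParquet(aRows, date)   // A = ACaseClass
// writeParquet(bRows, date)   // A = BCaseClass

This is also why the List[Product] and upper-bound versions fail: the factory is derived per concrete case class, and no instance exists for the abstract Product type itself.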

Regarding "scala - How to make these function definitions D.R.Y.?", see the original question on Stack Overflow: https://stackoverflow.com/questions/62544898/
