
scala - How do I add sparse vectors in Apache Spark?


I am trying to build my own feed-forward neural network with Spark, but I cannot find operations such as multiplication, addition, or division on Spark's sparse vectors. The documentation says the type is implemented with Breeze vectors, and I can find an addition operation in Breeze, but not on Spark vectors. How can I solve this?

Best Answer

Spark's Vector implementation does not support algebraic operations. Unfortunately, the Spark API also no longer supports converting Spark Vectors into Breeze vectors via the methods asBreeze and fromBreeze, because these methods have been made package-private with respect to the spark package.
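As a minimal sketch of the problem (the vector below is built with the standard mllib Vectors factory and is only for illustration), there is simply no operator to call on Spark's type:

import org.apache.spark.mllib.linalg.Vectors

val v = Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0))
// v + v  // does not compile: org.apache.spark.mllib.linalg.Vector defines no algebraic operators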

However, you can write your own Spark-to-Breeze converters. The following code defines such converters using type classes, which lets you always obtain the most specific type.

import breeze.linalg.{Vector => BreezeVector, DenseVector => DenseBreezeVector, SparseVector => SparseBreezeVector}
import org.apache.spark.mllib.linalg.{Vector => SparkVector, DenseVector => DenseSparkVector, SparseVector => SparseSparkVector}

package object myPackage {

  implicit class RichSparkVector[I <: SparkVector](vector: I) {
    def asBreeze[O <: BreezeVector[Double]](implicit converter: Spark2BreezeConverter[I, O]): O = {
      converter.convert(vector)
    }
  }

  implicit class RichBreezeVector[I <: BreezeVector[Double]](breezeVector: I) {
    def fromBreeze[O <: SparkVector](implicit converter: Breeze2SparkConverter[I, O]): O = {
      converter.convert(breezeVector)
    }
  }
}

trait Spark2BreezeConverter[I <: SparkVector, O <: BreezeVector[Double]] {
  def convert(sparkVector: I): O
}

object Spark2BreezeConverter {
  implicit val denseSpark2DenseBreezeConverter = new Spark2BreezeConverter[DenseSparkVector, DenseBreezeVector[Double]] {
    override def convert(sparkVector: DenseSparkVector): DenseBreezeVector[Double] = {
      new DenseBreezeVector[Double](sparkVector.values)
    }
  }

  implicit val sparseSpark2SparseBreezeConverter = new Spark2BreezeConverter[SparseSparkVector, SparseBreezeVector[Double]] {
    override def convert(sparkVector: SparseSparkVector): SparseBreezeVector[Double] = {
      new SparseBreezeVector[Double](sparkVector.indices, sparkVector.values, sparkVector.size)
    }
  }

  implicit val defaultSpark2BreezeConverter = new Spark2BreezeConverter[SparkVector, BreezeVector[Double]] {
    override def convert(sparkVector: SparkVector): BreezeVector[Double] = {
      sparkVector match {
        case dv: DenseSparkVector => denseSpark2DenseBreezeConverter.convert(dv)
        case sv: SparseSparkVector => sparseSpark2SparseBreezeConverter.convert(sv)
      }
    }
  }
}

trait Breeze2SparkConverter[I <: BreezeVector[Double], O <: SparkVector] {
  def convert(breezeVector: I): O
}

object Breeze2SparkConverter {
  implicit val denseBreeze2DenseSparkVector = new Breeze2SparkConverter[DenseBreezeVector[Double], DenseSparkVector] {
    override def convert(breezeVector: DenseBreezeVector[Double]): DenseSparkVector = {
      new DenseSparkVector(breezeVector.data)
    }
  }

  implicit val sparseBreeze2SparseSparkVector = new Breeze2SparkConverter[SparseBreezeVector[Double], SparseSparkVector] {
    override def convert(breezeVector: SparseBreezeVector[Double]): SparseSparkVector = {
      // Spark's SparseVector takes the full dimension as its first argument,
      // so pass the Breeze vector's length rather than its number of active entries.
      val activeSize = breezeVector.activeSize
      val indices = breezeVector.array.index.take(activeSize)
      val data = breezeVector.data.take(activeSize)
      new SparseSparkVector(breezeVector.length, indices, data)
    }
  }

  implicit val defaultBreeze2SparkVector = new Breeze2SparkConverter[BreezeVector[Double], SparkVector] {
    override def convert(breezeVector: BreezeVector[Double]): SparkVector = {
      breezeVector match {
        case dv: DenseBreezeVector[Double] => denseBreeze2DenseSparkVector.convert(dv)
        case sv: SparseBreezeVector[Double] => sparseBreeze2SparseSparkVector.convert(sv)
      }
    }
  }
}
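With these converters in scope, adding two sparse Spark vectors becomes a round trip through Breeze. The usage sketch below is illustrative only (the vector values are made up) and assumes the definitions above are compiled under myPackage:

import org.apache.spark.mllib.linalg.Vectors
import myPackage._

val v1 = Vectors.sparse(4, Array(0, 2), Array(1.0, 3.0))
val v2 = Vectors.sparse(4, Array(1, 2), Array(2.0, 4.0))

// asBreeze picks the most specific converter for the static type (here the default one),
// Breeze supplies the + operator, and fromBreeze converts the result back to a Spark vector.
val sum = (v1.asBreeze + v2.asBreeze).fromBreeze
// element-wise result: (1.0, 2.0, 7.0, 0.0)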

Regarding "scala - How do I add sparse vectors in Apache Spark?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32456405/
