python - How to split a Vector into columns - using PySpark

Reposted · Author: 行者123 · Updated: 2023-12-04 16:05:34

Context: I have a DataFrame with two columns, word and vector, where the type of the "vector" column is VectorUDT.

An example:

word    |  vector
assert | [435,323,324,212...]

I want to get this:

word   | v1  | v2  | v3  | v4  | ...
assert | 435 | 323 | 324 | 212 | ...

Question:

How can I split the vector column into one column per dimension, using PySpark?

Thanks in advance.

Best Answer

Spark >= 3.0.0

Since Spark 3.0.0 this can be done without using a UDF.

from pyspark.ml.functions import vector_to_array
from pyspark.sql.functions import col

(df
    .withColumn("xs", vector_to_array("vector"))
    .select(["word"] + [col("xs")[i] for i in range(3)]))

## +-------+-----+-----+-----+
## | word|xs[0]|xs[1]|xs[2]|
## +-------+-----+-----+-----+
## | assert| 1.0| 2.0| 3.0|
## |require| 0.0| 2.0| 0.0|
## +-------+-----+-----+-----+

Spark < 3.0.0

One possible approach is to convert to and from an RDD:
from pyspark.ml.linalg import Vectors

df = sc.parallelize([
    ("assert", Vectors.dense([1, 2, 3])),
    ("require", Vectors.sparse(3, {1: 2}))
]).toDF(["word", "vector"])

def extract(row):
    return (row.word, ) + tuple(row.vector.toArray().tolist())

df.rdd.map(extract).toDF(["word"])  # Vector values will be named _2, _3, ...

## +-------+---+---+---+
## | word| _2| _3| _4|
## +-------+---+---+---+
## | assert|1.0|2.0|3.0|
## |require|0.0|2.0|0.0|
## +-------+---+---+---+

Another solution is to create a UDF:
from pyspark.sql.functions import udf, col
from pyspark.sql.types import ArrayType, DoubleType

def to_array(col):
    def to_array_(v):
        return v.toArray().tolist()
    # Important: asNondeterministic requires Spark 2.3 or later
    # It can be safely removed, i.e.
    #   return udf(to_array_, ArrayType(DoubleType()))(col)
    # but at the cost of decreased performance
    return udf(to_array_, ArrayType(DoubleType())).asNondeterministic()(col)

(df
.withColumn("xs", to_array(col("vector")))
.select(["word"] + [col("xs")[i] for i in range(3)]))

## +-------+-----+-----+-----+
## | word|xs[0]|xs[1]|xs[2]|
## +-------+-----+-----+-----+
## | assert| 1.0| 2.0| 3.0|
## |require| 0.0| 2.0| 0.0|
## +-------+-----+-----+-----+

For a Scala equivalent, see Spark Scala: How to convert Dataframe[vector] to DataFrame[f1: Double, ..., fn: Double].

Regarding "python - How to split a Vector into columns - using PySpark", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44904232/
