
json - Convert a PySpark dataframe to a nested JSON structure


I am trying to convert the dataframe below into nested JSON (a string).
Input:

+---+---+-------+------+
| id|age|   name|number|
+---+---+-------+------+
|  1| 12|  smith|  uber|
|  2| 13|    jon| lunch|
|  3| 15|jocelyn|rental|
|  3| 15|  megan|   sds|
+---+---+-------+------+
Output:
+---+---+------------------------------------------------------------------------------+
|id |age|values                                                                        |
+---+---+------------------------------------------------------------------------------+
|1  |12 |[{"number": "uber", "name": "smith"}]                                         |
|2  |13 |[{"number": "lunch", "name": "jon"}]                                          |
|3  |15 |[{"number": "rental", "name": "jocelyn"}, {"number": "sds", "name": "megan"}]|
+---+---+------------------------------------------------------------------------------+
My code:
from pyspark.sql import SparkSession
from pyspark.sql.types import StructField, StructType, StringType, IntegerType

spark = SparkSession.builder.getOrCreate()

# List of rows: (id, age, name, number)
data = [(1, 12, "smith", "uber"),
        (2, 13, "jon", "lunch"),
        (3, 15, "jocelyn", "rental"),
        (3, 15, "megan", "sds")]

# Create a schema for the dataframe (field order matches the tuples above)
schema = StructType([
    StructField('id', IntegerType(), True),
    StructField('age', IntegerType(), True),
    StructField('name', StringType(), True),
    StructField('number', StringType(), True)])

# Convert the list to an RDD
rdd = spark.sparkContext.parallelize(data)

# Create the data frame
df = spark.createDataFrame(rdd, schema)
I tried using collect_list and collect_set, but I could not get the desired output.

Best Answer

You can use collect_list and to_json to collect a list of JSON strings for each group:

import pyspark.sql.functions as F

df2 = df.groupBy(
    'id', 'age'
).agg(
    F.collect_list(
        F.to_json(
            F.struct('number', 'name')
        )
    ).alias('values')
).orderBy(
    'id', 'age'
)

df2.show(truncate=False)
+---+---+-----------------------------------------------------------------------+
|id |age|values                                                                 |
+---+---+-----------------------------------------------------------------------+
|1  |12 |[{"number":"uber","name":"smith"}]                                     |
|2  |13 |[{"number":"lunch","name":"jon"}]                                      |
|3  |15 |[{"number":"rental","name":"jocelyn"}, {"number":"sds","name":"megan"}]|
+---+---+-----------------------------------------------------------------------+
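
If a single nested JSON string per row is preferred over an array of separate JSON strings, a minimal variant sketch (not part of the answer above; df3 is just an illustrative name) is to collect the structs first and serialize the whole array with one to_json call, since to_json also accepts an array of structs:

import pyspark.sql.functions as F

# Variant sketch: collect the structs, then serialize the whole array once,
# so 'values' becomes a single JSON string per row, e.g.
# [{"number":"uber","name":"smith"}]
df3 = df.groupBy(
    'id', 'age'
).agg(
    F.to_json(
        F.collect_list(
            F.struct('number', 'name')
        )
    ).alias('values')
).orderBy(
    'id', 'age'
)

df3.show(truncate=False)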

Regarding json - converting a PySpark dataframe to a nested JSON structure, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/65620268/
