pyspark - Chain withColumn to change a column multiple times in PySpark

Reposted. Author: 行者123. Updated: 2023-12-02 20:17:52

I am using the UCI Adult income dataset.

I have a dataframe with a categorical variable in one column that I want to group into different categories (some common feature engineering).

df.groupBy('education').count().show()

which gives:

+------------+-----+
| education|count|
+------------+-----+
| 10th| 1223|
| Masters| 2514|
| 5th-6th| 449|
| Assoc-acdm| 1507|
| Assoc-voc| 1959|
| 7th-8th| 823|
| 9th| 676|
| HS-grad|14783|
| Bachelors| 7570|
| 11th| 1619|
| 1st-4th| 222|
| Preschool| 72|
| 12th| 577|
| Doctorate| 544|
|Some-college| 9899|
| Prof-school| 785|
+------------+-----+

I want to put the following categories into specific groups, like so:

dropout = ['Preschool', '1st-4th', '5th-6th', '7th-8th', '9th', '10th', '11th', '12th']
community_college = ['Assoc-acdm', 'Assoc-voc', 'Some-college']
masters = ['Prof-school']

To do this I can do the following:

from pyspark.sql.functions import when, col
df = df.withColumn('education', when(col('education').isin(dropout), 'Dropout').otherwise(df['education']))
df = df.withColumn('education', when(col('education').isin(community_college), 'Community_college').otherwise(df['education']))
df = df.withColumn('education', when(col('education') == 'Prof-school', 'Masters').otherwise(df['education']))

Grouping and counting again with df.groupBy('education').count().show() then gives:

+-----------------+-----+
| education|count|
+-----------------+-----+
| Masters| 3299|
| HS-grad|14783|
| Bachelors| 7570|
| Dropout| 5661|
| Doctorate| 544|
|Community_college|13365|
+-----------------+-----+

Is it possible to chain these withColumn calls? I tried the following, but without success:

df = df.withColumn('education', when(col('education').isin(dropout), 'Dropout').otherwise(df['education']))\
.withColumn('education', when(col('education').isin(community_college), 'Community_college').otherwise(df['education']))\
.withColumn('education', when(col('education') == 'Prof-school', 'Masters').otherwise(df['education']))

Best answer

Yes, this is possible by chaining when() calls instead of chaining withColumn(). (The chained withColumn() attempt presumably fails because df['education'] in the later calls still refers to the original DataFrame rather than to the intermediate results.)

df = df.withColumn('education', when(col('education').isin(dropout), 'Dropout')\
.when(col('education').isin(community_college), 'Community_college')\
.when(col('education') == 'Prof-school', 'Masters') \
.otherwise(df['education']))
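For reference, when there are many groups, repeating .when() by hand gets verbose. Below is a minimal sketch of the same idea driven by a label-to-values mapping; the mapping dict and the regroup helper are illustrative names, not part of the original answer:

from pyspark.sql.functions import when, col

# Hypothetical helper: fold a {label: [values]} mapping into a single
# chained when() expression; values matching no group pass through
# unchanged via otherwise().
def regroup(column, mapping):
    expr = None
    for label, values in mapping.items():
        cond = col(column).isin(values)
        expr = when(cond, label) if expr is None else expr.when(cond, label)
    return expr.otherwise(col(column))

mapping = {
    'Dropout': dropout,
    'Community_college': community_college,
    'Masters': masters,  # masters == ['Prof-school']
}
df = df.withColumn('education', regroup('education', mapping))

Because the whole mapping is a single Column expression, otherwise() references the column only once, which avoids binding to any intermediate DataFrame.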

Regarding pyspark - Chain withColumn to change a column multiple times in PySpark, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52016771/
