
mysql - How to sum values by key in PySpark SQL or MySQL


I can't figure out how to add values index by index. Please help me solve this: for each key (Country), sum the values that sit at the same position.

Input CSV:

Country,Values
Canada,47;97;33;94;6
Canada,59;98;24;83;3
Canada,77;63;93;86;62
China,86;71;72;23;27
China,74;69;72;93;7
China,58;99;90;93;41
England,40;13;85;75;90
England,39;13;33;29;14
England,99;88;57;69;49
Germany,67;93;90;57;3
Germany,0;9;15;20;19
Germany,77;64;46;95;48
India,90;49;91;14;70
India,70;83;38;27;16
India,86;21;19;59;4

The output CSV should be (each position is summed across that country's rows, e.g. Canada's first position is 47 + 59 + 77 = 183):
Country,Values
Canada,183;258;150;263;71
China,218;239;234;209;75
England,178;114;175;173;153
Germany,144;166;151;172;70
India,246;153;148;100;90

Best Answer

Import the required module and create a session


from pyspark.sql import SparkSession 

spark_session = SparkSession.builder.getOrCreate()

Read the CSV file and create a temp view


file_df = spark_session.read.csv("/data/user/country_values.csv", header=True)

# createOrReplaceTempView returns None, so its result should not be assigned
file_df.createOrReplaceTempView("temp_view")
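Because the CSV is read without schema inference, both columns come back as strings, which is why the queries below cast the split values. A quick optional check:

file_df.printSchema()
# root
#  |-- Country: string (nullable = true)
#  |-- Values: string (nullable = true)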

Write the SQL query


output_df = spark_session.sql("""
    SELECT Country,
           concat(cast(sum(split(Values, ';')[0]) AS int), ';',
                  cast(sum(split(Values, ';')[1]) AS int), ';',
                  cast(sum(split(Values, ';')[2]) AS int), ';',
                  cast(sum(split(Values, ';')[3]) AS int), ';',
                  cast(sum(split(Values, ';')[4]) AS int)) AS Values
    FROM temp_view
    GROUP BY Country
    ORDER BY Country
""")

Print the result


output_df.show()
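Since the goal is an output CSV, the result can also be written back to disk. A minimal sketch, where the output path is a placeholder:

# Coalesce to one partition so Spark produces a single CSV part file
output_df.coalesce(1).write.csv("/data/user/country_values_summed", header=True, mode="overwrite")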

Limitation: the query is written on the assumption that the Values column always contains exactly five ';'-separated values. The column is also split five separate times within the query, which could be optimized.

Alternatively, use a CTE to split Values into an integer array once, before the main aggregation:
spark_session.sql(""" 

WITH t1 AS (
SELECT Country
, CAST(split(Values, ";") AS array<int>) AS V
FROM temp_view
)
SELECT Country
, CONCAT_WS(";", sum(V[0]), sum(V[1]), sum(V[2]), sum(V[3]), sum(V[4])) AS Values
FROM t1
GROUP BY Country
ORDER BY Country

""").show(truncate=False)

Regarding mysql - How to sum values by key in PySpark SQL or MySQL, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/59242215/
