
pyspark - How to convert SQL output to a DataFrame?

Reposted · Author: 行者123 · Updated: 2023-12-05 05:37:47

I have a DataFrame from which I create a temporary view in order to run SQL queries. After a few SQL queries, I want to convert the output of the SQL query into a new DataFrame. The reason I want the data back in a DataFrame is so that I can save it to blob storage.

So the question is: what is the proper way to convert SQL query output into a DataFrame?

Here is the code I have so far:

%scala
//read data from Azure blob
...
var df = spark.read.parquet(some_path)

// create temp view
df.createOrReplaceTempView("data_sample")

%sql
-- some SQL queries; the one below is just an example
SELECT
date,
count(*) as cnt
FROM
data_sample
GROUP BY
date

Now I want a DataFrame that contains the output of the SQL query above. How can I do that? Preferably the code would be in Python or Scala.


Best Answer

Scala:

var df = spark.sql(s"""
SELECT
date,
count(*) as cnt
FROM
data_sample
GROUP BY
date
""")

PySpark:

df = spark.sql(f'''
SELECT
date,
count(*) as cnt
FROM
data_sample
GROUP BY
date
''')
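Since spark.sql() already returns a DataFrame, the result can be written straight back to blob storage. Below is a minimal PySpark sketch; output_path is a hypothetical placeholder, and it assumes your cluster is already configured (or the container is mounted) so the path is reachable:

# df is the DataFrame returned by spark.sql() above
# output_path is a placeholder; replace it with your actual blob storage location,
# e.g. a wasbs:// or abfss:// URI your cluster can access
output_path = "wasbs://<container>@<storage-account>.blob.core.windows.net/some_output_path"

(df.write
   .mode("overwrite")      # overwrite any previous output at this path
   .parquet(output_path))  # save the query result as Parquet files

As a side note, the same aggregation could also be expressed directly with the DataFrame API (for example df.groupBy("date").count()), but the spark.sql() route shown above works for any query you run against the temp view.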

Regarding "pyspark - How to convert SQL output to a DataFrame?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/73052537/
