
python - Reducing the disk size of a Pandas DataFrame CSV

Reposted · Author: 行者123 · Updated: 2023-11-28 20:56:02

For a university assignment I have to generate a CSV file containing the distances between all of the world's airports... The problem is that my CSV file weighs 151 MB. I want to reduce it as much as possible. This is my CSV:

[screenshot of the CSV contents]

And this is my code:

import numpy as np
import pandas as pd

# drop all features we don't need
for attribute in df:
    if attribute not in ('NAME', 'COUNTRY', 'IATA', 'LAT', 'LNG'):
        df = df.drop(attribute, axis=1)

# create a dictionary of airports, each entry has the following structure:
# IATA : (NAME, COUNTRY, LAT, LNG)
airport_dict = {}
for airport in df.itertuples():
    airport_dict[airport[3]] = (airport[1], airport[2], airport[4], airport[5])

# from tutorial 4 solution:
airportcodes = list(airport_dict)
airportdists = pd.DataFrame()
for i, airport_code1 in enumerate(airportcodes):
    airport1 = airport_dict[airport_code1]
    dists = []
    for j, airport_code2 in enumerate(airportcodes):
        if j > i:
            airport2 = airport_dict[airport_code2]
            dists.append(distanceBetweenAirports(airport1[2], airport1[3],
                                                 airport2[2], airport2[3]))
        else:
            # little edit: no need to calculate the distance twice;
            # all duplicate pairs are set to 0 distance
            dists.append(0)
    airportdists[i] = dists
airportdists.columns = airportcodes
airportdists.index = airportcodes

# set all 0 distance values to NaN
airportdists = airportdists.replace(0, np.nan)
airportdists.to_csv(r'../Project Data Files-20190322/distances.csv')
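The distanceBetweenAirports helper is not shown in the question; a haversine-based sketch (the name and the lat/lng argument order are assumptions taken from the call site above) could look like this:

    import math

    def distanceBetweenAirports(lat1, lng1, lat2, lng2):
        # great-circle distance in km between two (lat, lng) points,
        # using the haversine formula with a mean Earth radius of 6371 km
        lat1, lng1, lat2, lng2 = map(math.radians, (lat1, lng1, lat2, lng2))
        dlat = lat2 - lat1
        dlng = lng2 - lng1
        a = (math.sin(dlat / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin(dlng / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(a))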

I also tried re-indexing it before saving:

# stack() drops the NaN cells, leaving one row per remaining airport pair
airportdists = airportdists.stack().reset_index()
airportdists.columns = ['airport1', 'airport2', 'distance']

But the result is a DataFrame with 3 columns and 17 million rows, with a disk size of 419 MB... not an improvement at all...
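To see what the stack() step produces, here is a toy version of the reshape (the airport codes and distances are made up), mirroring a matrix where only the upper triangle holds values:

    import numpy as np
    import pandas as pd

    # 3x3 toy distance matrix: lower triangle and diagonal are NaN,
    # like airportdists after the 0 -> NaN replacement
    codes = ['AAA', 'BBB', 'CCC']
    m = pd.DataFrame([[np.nan, 100.0, 200.0],
                      [np.nan, np.nan, 300.0],
                      [np.nan, np.nan, np.nan]], index=codes, columns=codes)

    long_form = m.stack().reset_index()
    long_form.columns = ['airport1', 'airport2', 'distance']

Each pair appears only once, so the long format has N*(N-1)/2 rows; passing options such as float_format='%.1f' and compression='gzip' to to_csv would typically shrink the on-disk size further.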

Can you help me reduce the size of the CSV? Thanks!

Best Answer

I have built a similar application before; here is what I would do:

It is hard to shrink your file much, but if your application needs the distances between each airport and all the others, I suggest creating 9,541 files, one per airport: each file holds the distances from that airport to every other one, and is named after the airport.

In that case, loading a single file is very fast.
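A minimal sketch of that suggestion, splitting the wide airportdists frame into one small CSV per airport (the function name, file layout, and output directory are assumptions, not part of the original answer):

    import os
    import numpy as np
    import pandas as pd

    def split_per_airport(airportdists, out_dir='airport_distances'):
        # write one CSV per airport: rows are (other_airport, distance)
        os.makedirs(out_dir, exist_ok=True)
        for code in airportdists.columns:
            # merge the column and the row, since only the upper
            # triangle of the matrix holds values
            dists = airportdists[code].combine_first(airportdists.loc[code])
            dists.dropna().rename('distance').to_csv(
                os.path.join(out_dir, f'{code}.csv'), header=True)

Each file then holds at most N-1 rows, so opening the one file an airport lookup needs is cheap compared with parsing the full 151 MB matrix.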

Regarding "python - Reducing the disk size of a Pandas DataFrame CSV", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55299536/
