
python - Writing csv from dataframes with multiprocessing without messing up the output


import numpy as np
import pandas as pd
from concurrent import futures

# Load the data
df = pd.read_csv('crsp_short.csv', low_memory=False)

def funk(date):
    ...
    # for each date in df.date.unique() do stuff which gives sample dataframe
    # as an output, then write it to file
    sample.to_csv('crsp_full.csv', mode='a')

def evaluation(f_list):
    with futures.ProcessPoolExecutor() as pool:
        return pool.map(funk, f_list)

# list_s is a list of dates I want to calculate function funk for

evaluation(list_s)

I get a csv file as output in which some rows are garbled, because Python writes fragments from different worker processes into the file at the same time. I suppose I need to use a queue, but I wasn't able to modify the code to make that work. Any ideas how to do it? Otherwise it takes ages to get the result.

Best Answer

This solved the problem (Pool does the queueing for you):

Python: Writing to a single file with queue while using multiprocessing Pool

My version of the code, which doesn't mess up the output csv file:

import numpy as np
import pandas as pd
import multiprocessing

# Load the data
df = pd.read_csv('crsp_short.csv', low_memory=False)

def funk(date):
    ...
    # for each date in df.date.unique() do stuff which gives sample dataframe
    # as an output
    return sample

# list_s is a list of dates I want to calculate function funk for

def mp_handler():
    # 28 is the number of processes I want to run
    p = multiprocessing.Pool(28)
    for result in p.imap(funk, list_s):
        # only the parent process writes, so rows never get interleaved
        result.to_csv('crsp_full.csv', mode='a')


if __name__ == '__main__':
    mp_handler()
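The pattern above can be exercised end to end with a toy `funk`. This sketch is self-contained (the date list, file name and pool size are illustrative) and shows why the output stays clean: workers only build dataframes, and the parent is the only process that touches the file:

```python
import multiprocessing
import pandas as pd

def funk(date):
    # Toy stand-in: build a one-row dataframe for a single date.
    return pd.DataFrame({'date': [date], 'n_chars': [len(date)]})

def mp_handler(dates, path):
    # imap yields results in submission order; only the parent writes.
    with multiprocessing.Pool(4) as p:
        first = True
        for result in p.imap(funk, dates):
            result.to_csv(path, mode='a', header=first, index=False)
            first = False

if __name__ == '__main__':
    mp_handler(['2018-01-02', '2018-01-03', '2018-01-04'], 'crsp_demo.csv')
```

One caveat with `mode='a'`: a stale file left over from a previous run will keep accumulating rows, so delete the output file before a fresh run.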

A similar question about writing csv from dataframes with multiprocessing without messing up the output can be found on Stack Overflow: https://stackoverflow.com/questions/53950320/
