
python - Python multiprocessing: can't pickle thread.lock (pymongo)

Reposted  Author: 行者123  Updated: 2023-12-03 12:54:45

I have a class with the following method:

def get_add_new_links(self, max_num_links):
    self.get_links_m2(max_num_links)
    processes = mp.cpu_count()
    pool = mp.Pool(processes=processes)
    func = partial(worker, self)
    with open(os.path.join(self.report_path, "links.txt"), "r") as f:
        reports = pool.map(func, f.readlines())
    pool.close()
    pool.join()
where get_links_m2 is another method that creates the file "links.txt". worker is:

def worker(obje, link):
    doc, rep = obje.get_info_m2(link)
    obje.add_new_active(doc, sure_not_exists=True)
    return rep
The method get_info_m2 visits the link and extracts some information. The method add_new_active adds that information to MongoDB.

What could be wrong with my code? When I run it, I get this error (and traceback):

File "controller.py", line 234, in get_add_new_links
  reports = pool.map(func, f.readlines())
File "/home/vladimir/anaconda3/lib/python3.5/multiprocessing/pool.py", line 260, in map
  return self._map_async(func, iterable, mapstar, chunksize).get()
File "/home/vladimir/anaconda3/lib/python3.5/multiprocessing/pool.py", line 608, in get
  raise self._value
File "/home/vladimir/anaconda3/lib/python3.5/multiprocessing/pool.py", line 385, in _handle_tasks
  put(task)
File "/home/vladimir/anaconda3/lib/python3.5/multiprocessing/connection.py", line 206, in send
  self._send_bytes(ForkingPickler.dumps(obj))
File "/home/vladimir/anaconda3/lib/python3.5/multiprocessing/reduction.py", line 50, in dumps
  cls(buf, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects
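The failure is not pymongo-specific: `pool.map` pickles everything it sends to the children, including the `self` bound into `partial(worker, self)`, and any object holding a thread lock refuses to pickle. A minimal stdlib-only reproduction of the same error:

```python
import pickle
import threading

# MongoClient keeps internal thread locks; the identical pickling failure
# can be reproduced with a bare lock from the standard library.
lock = threading.Lock()
try:
    pickle.dumps(lock)
except TypeError as exc:
    print(exc)  # e.g. "cannot pickle '_thread.lock' object" (wording varies by Python version)
```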

Best Answer

As stated in the docs:

Never do this:

client = pymongo.MongoClient()

# Each child process attempts to copy a global MongoClient
# created in the parent process. Never do this.
def func():
    db = client.mydb
    # Do something with db.

proc = multiprocessing.Process(target=func)
proc.start()

Instead, the client must be initialized inside the worker function: here, `partial(worker, self)` means `self` (and the MongoClient it presumably holds) has to be pickled and sent to every child, which is exactly what fails.

Regarding python - Python multiprocessing: can't pickle thread.lock (pymongo), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41071563/
