
python - Multiprocessing synchronization

Reposted. Author: 行者123. Updated: 2023-12-04 11:02:08

Suppose I need to run 5 processes in parallel, but processes 2 through 5 depend on process 1. How can I make sure process 1 runs before the others? Should I use Python's multiprocessing Event(), Lock(), or both?

Example 1:

process 1
process 2 or 3 or 4 or 5
process 2 or 3 or 4 or 5
process 2 or 3 or 4 or 5
process 2 or 3 or 4 or 5

Example 2:
process 3
process 1 or 2 or 4 or 5
process 1 or 2 or 4 or 5
process 1 or 2 or 4 or 5
process 1 or 2 or 4 or 5

Example 3 has 2 dependencies:
process 1
process 2 or 3 (run in parallel after 1)
process 4
process 5 or 6 (run in parallel after 1 and after 4)

All processes call the same function with a msg argument, but each one returns a different value.

I need some guidance, not necessarily code, but if you can provide some, thanks.

Pseudocode:
import multiprocessing as mp

def function(msg):
    if msg == "one":
        return 1
    if msg == "two":
        return 2
    if msg == "three":
        return 3
    if msg == "four":
        return 4
    if msg == "five":
        return 5

msgs = ['one', 'two', 'three', 'four', 'five']

jobs = []
for msg in msgs:
    p = mp.Process(target=function, args=(msg,))
    p.start()
    jobs.append(p)

for job in jobs:
    job.join()

In this case, all processes will run in no particular order.

If I want process 1 to run before the others, I could do:

Possible solution:
import multiprocessing as mp

def function(msg):
    if msg == "one":
        return 1
    if msg == "two":
        return 2
    if msg == "three":
        return 3
    if msg == "four":
        return 4
    if msg == "five":
        return 5

# Run process 1 first and wait for it to finish.
msg = 'one'
p1 = mp.Process(target=function, args=(msg,))
p1.start()
p1.join()

# Then run the remaining processes in parallel.
msgs = ['two', 'three', 'four', 'five']

jobs = []
for msg in msgs:
    p = mp.Process(target=function, args=(msg,))
    p.start()
    jobs.append(p)

for job in jobs:
    job.join()

Is there a better solution, or is this fine? It works, but that doesn't mean it couldn't be done in a better way (e.g., with less code duplication).
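On the code-duplication point: one way to cut the repetition in the possible solution above is to factor the start/join loop into a helper and call it once per dependency stage. A minimal sketch, where the helper name run_stage is my own, not from the original:

```python
import multiprocessing as mp

def function(msg):
    # Map each message to its return value, as in the pseudocode above.
    if msg == "one":
        return 1
    if msg == "two":
        return 2
    if msg == "three":
        return 3
    if msg == "four":
        return 4
    if msg == "five":
        return 5

def run_stage(msgs):
    """Start one process per message, then wait for all of them to finish."""
    jobs = [mp.Process(target=function, args=(m,)) for m in msgs]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()

if __name__ == "__main__":
    run_stage(["one"])                           # stage 1: process 1 alone
    run_stage(["two", "three", "four", "five"])  # stage 2: the rest in parallel
```

Each call to run_stage is a barrier: the next stage cannot start until every process in the current stage has joined.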

Best Answer

I don't know what you ended up going with, but you can indeed use an Event for this purpose:

import multiprocessing as mp

def function(msg, events):
    if msg == "one":
        print(1)
        events[0].set()
    if msg == "two":
        print("2 waiting")
        events[0].wait()
        events[1].wait()
        print("2 done")
    if msg == "three":
        print(3)
        events[1].set()
    if msg == "four":
        print(4)
    if msg == "five":
        print("5 waiting")
        events[0].wait()
        print("5 done")

if __name__ == '__main__':
    events = [mp.Event(), mp.Event()]
    jobs = []
    for item in ['one', 'two', 'three', 'four', 'five']:
        job = mp.Process(target=function, args=(item, events))
        job.start()
        jobs.append(job)
    for job in jobs:
        job.join()

Here I deliberately introduced a second dependency: p2 depends on p1 and p3 (while p5 still depends on p1). That way, running it several times shows more variation than a single dependency would:

python procy.py
2 waiting
4
1
5 waiting
5 done
3
2 done

python procy.py
1
5 waiting
2 waiting
4
5 done
3
2 done

python procy.py
1
4
3
5 waiting
5 done
2 waiting
2 done
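The same Event pattern extends to the question's Example 3 (2 and 3 run after 1; 5 and 6 run after both 1 and 4). A sketch under my own assumptions: one Event per prerequisite, a generic worker instead of the per-message branches, and process 4 treated as having no prerequisites of its own (the example doesn't say):

```python
import multiprocessing as mp

def worker(name, waits, sets):
    # Block until every prerequisite event has been set...
    for ev in waits:
        ev.wait()
    print(f"{name} running")
    # ...then signal any events this process is responsible for.
    for ev in sets:
        ev.set()

if __name__ == "__main__":
    done1 = mp.Event()  # set when process 1 finishes
    done4 = mp.Event()  # set when process 4 finishes
    # name -> (events to wait on, events to set)
    spec = {
        "p1": ([],             [done1]),
        "p2": ([done1],        []),
        "p3": ([done1],        []),
        "p4": ([],             [done4]),
        "p5": ([done1, done4], []),
        "p6": ([done1, done4], []),
    }
    jobs = [mp.Process(target=worker, args=(n, w, s)) for n, (w, s) in spec.items()]
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
```

Adding a new dependency then only means adding an Event and editing the spec table, not writing another branch.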

Regarding python - Multiprocessing synchronization, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58737536/
