
Python multiprocessing: Killing a process gracefully


import multiprocessing
import schedule


def worker():
    # do some stuff
    pass


def sched(argv):
    schedule.every(0.01).minutes.do(worker)
    while True:
        schedule.run_pending()


processs = []
..
..
p = multiprocessing.Process(target=sched, args)
..
..
processs.append(p)

for p in processs:
    p.terminate()

Is there a way to kill a list of processes gracefully?

If not, what is the simplest way to do it?

The goal is to reload a configuration file into memory, so I would like to kill all the child processes and create new ones that will read the new configuration file.
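For reference, the kill-and-respawn pattern described here can be kept quite small. Below is a minimal sketch of that idea, assuming a hypothetical load_config() that returns one entry per worker and giving each child its own multiprocessing.Event for shutdown; all names are illustrative, not taken from the original code.

import multiprocessing
import time


def worker(conf):
    # Placeholder for the real job; here it just sleeps briefly.
    time.sleep(0.1)


def sched(conf, stop_event):
    # Run the worker repeatedly until the parent signals shutdown.
    while not stop_event.is_set():
        worker(conf)


def load_config():
    # Hypothetical stand-in for reading the real configuration file;
    # each entry becomes one worker process.
    return [{"name": "job-a"}, {"name": "job-b"}]


def spawn(configs):
    processes = []
    for conf in configs:
        stop_event = multiprocessing.Event()
        p = multiprocessing.Process(target=sched, args=(conf, stop_event))
        p.start()
        processes.append((p, stop_event))
    return processes


def shutdown(processes):
    # Signal every child first, then wait for all of them to exit.
    for _, stop_event in processes:
        stop_event.set()
    for p, _ in processes:
        p.join()


if __name__ == "__main__":
    processes = spawn(load_config())
    time.sleep(2)                      # ... later, the configuration file changes ...
    shutdown(processes)                # wind the old workers down gracefully
    processes = spawn(load_config())   # then start fresh ones with the new config
    shutdown(processes)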

Edit: added more code to show that I am running a while True loop.

Edit: here is the new code, after @dano's suggestion:

def get_config(self):
    from ConfigParser import SafeConfigParser
    ..
    return argv

def sched(self, args, event):
    # schedule instruction:
    schedule.every(0.01).minutes.do(self.worker, args)
    while not event.is_set():
        schedule.run_pending()

def dispatch_processs(self, conf):
    processs = []
    event = multiprocessing.Event()

    for conf in self.get_config():
        process = multiprocessing.Process(target=self.sched, args=(i for i in conf), kwargs={'event': event})
        processs.append((process, event))
    return processs

def start_process(self, process):
    process.start()

def gracefull_process(self, process):
    process.join()

def main(self):
    while True:
        processs = self.dispatch_processs(self.get_config())
        print("%s processes running " % len(processs))

        for process, event in processs:
            self.start_process(process)
            time.sleep(1)
            event.set()
            self.gracefull_process(process)

The nice thing about this code is that I can edit the configuration file and the processes will reload their configuration as well.

The problem is that only the first process runs, while the others are ignored.
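A likely reason, judging from the code above: dispatch_processs creates a single Event shared by every child, and main() sets it right after the first child starts, so each later child sees event.is_set() as soon as it enters sched and exits immediately. Below is a small sketch of the alternative, with one Event per child and all children started before any event is set; the worker logic and config entries are placeholders, not the original code.

import multiprocessing
import time


def sched(conf, event):
    # Stand-in for the scheduling loop: run until this child's own event is set.
    while not event.is_set():
        time.sleep(0.1)


def dispatch(configs):
    processes = []
    for conf in configs:
        event = multiprocessing.Event()  # one Event per child, not one shared Event
        p = multiprocessing.Process(target=sched, args=(conf, event))
        processes.append((p, event))
    return processes


if __name__ == "__main__":
    processes = dispatch(["conf-a", "conf-b", "conf-c"])  # hypothetical config entries

    for p, _ in processes:        # start everything first ...
        p.start()

    time.sleep(1)

    for p, event in processes:    # ... then signal and join each child
        event.set()
        p.join()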

Edit: this saved my life. Using a while True loop around schedule.run_pending() is not a good idea, so I set a refresh_time instead:

def sched(self, args, event):
    schedule.every(0.01).minutes.do(self.worker, args)
    for i in range(refresh_time):
        schedule.run_pending()
        time.sleep(1)

def start_processs(self, processs):
    for p, event in processs:
        if not p.is_alive():
            p.start()
    time.sleep(1)
    event.set()

    self.gracefull_processs(processs)

def gracefull_processs(self, processs):
    for p, event in processs:
        p.join()
    processs = self.dispatch_processs(self.get_config())
    self.start_processs(processs)

def main(self):
    while True:
        processs = self.dispatch_processs(self.get_config())

        self.start_processs(processs)
        break
    print("Reloading function main")
    self.main()

Best Answer

If you don't mind only aborting after worker has finished all of its current work, it's very simple to add a multiprocessing.Event to handle exiting gracefully:

import multiprocessing
import schedule


def worker():
    # do some stuff
    pass

def sched(argv, event=None):
    schedule.every(0.01).minutes.do(worker)
    while not event.is_set():  # Run until we're told to shut down.
        schedule.run_pending()

processes = []
..
..
event = multiprocessing.Event()
p = multiprocessing.Process(target=sched, args, kwargs={'event': event})
..
..
processes.append((p, event))

# Tell all processes to shut down
for _, event in processes:
    event.set()

# Now actually wait for them to shut down
for p, _ in processes:
    p.join()

Regarding Python multiprocessing: Killing a process gracefully, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/26627382/
