
python - How to run concurrent "tasks" in Python using SimPy, where each task waits on multiple resources?

Reposted · Author: 行者123 · Updated: 2023-12-04 04:22:39

The system I am modeling has objects that need maintenance in the form of a series of tasks. Currently in this model, they request a "work location", and once they have seized it, they request the "worker" resources needed to complete the first task. The tasks are objects in a list that is an attribute of the object being maintained, and some task networks allow multiple tasks to be completed in parallel.

As it stands, I iterate over the task list in a for loop, where a nested for loop requests and seizes the necessary "workers", and then, once all "workers" have been seized, times out for the task duration.

with self.location.request() as loc_req: # request work location
    yield loc_req # wait for work location

    for task in self.tasks[:]:
        t_duration = task.calc_duration(self)
        if t_duration == 0: # skip tasks where probability sets duration to 0
            continue
        needs = task.resources[:]
        ## check if available workers are useful; if not, release them
        task_cur_resources = []
        for res, req in self.resources['available'].copy():
            if res.worker_id in needs:
                self.resources['busy'].add((res, req))
                task_cur_resources.append((res, req))
                needs.remove(res.worker_id)
            else:
                res.release(req)
                self.resources['available'].remove((res, req))
        ## acquire all resources needed for task
        for need in needs[:]:
            priority = len(needs) # prioritize tasks closer to meeting needs
            res = self.location.workers[need] # set resource to worker of type need
            req = res.request(priority) # save the request object
            yield req # wait for resource
            ## stash resource and request in order to release later
            task_cur_resources.append((res, req))
            self.resources['busy'].add((res, req))
            needs.remove(res.worker_id)

        ## perform task with task duration timeout
        yield self.env.process(task.perform(self, t_duration))

        ## make resources available
        for worker in task_cur_resources:
            self.resources['busy'].remove(worker)
            self.resources['available'].add(worker)

    for res, req in self.resources['available']:
        res.release(req)
    self.resources['available'] = set()

The problem is that this does not allow tasks to complete concurrently. The tasks are completed sequentially, with durations drawn from a normal distribution based on input parameters. How do I change this so that a task can complete as soon as its predecessors have finished and workers are available? I tried a while loop that iterated over the task list and scheduled tasks whose predecessors had completed, but I kept ending up in an infinite loop due to my apparent misuse of SimPy and yield. Any ideas?

Best Answer

env.process() creates an event.
The trick is to collect these events in a list and then yield env.all_of on that list of events. There is also env.any_of.
Here is a full example:

"""
Demonstrates how to grab some resources concurrently, and then do some tasks concurently

programer Michael R. Gibbs
"""

import simpy

class Asset():
"""
asset that needs some maintenace
creating the asset starts the assets processes

to keep thins simple, I modeled just the maintenace task and skipped a prime task that would
get interuppted when it was time to do maintence
"""
def __init__(self, env, id, maintTasks):
"""
store attributes, kick opp mantence process
"""
self.id = id
self.env = env
self.maintTasks = maintTasks

self.env.process(self.do_maintence())

def do_maintence(self):
"""
waits till time to do maintence
grabs a work location resource
do each maintence task
each maintaince task has a list of resouces that are grabbed at the same time
and once they have all been seized
then the maintence task's list of processes are all kicked off at the same time
affter all the process are finish, then all the resources are released
"""

yield self.env.timeout(3)

print(f'{self.env.now} object {self.id} is starting maintense')

# grab work location
with workLocRes.request() as workReq:
yield workReq
print(f'{self.env.now} object {self.id} got work loc, starting tasks')

# for each maintTask, get the list of resources, and tasks
for res, tasks in self.maintTasks:
print(f'{self.env.now} -- object {self.id} start loop tasks')

# get the requests for the needed resources
print(f'{self.env.now} object {self.id} res 1: {res1.count}, res 2 {res2.count}')
resList = []
for r in res:
req = r.request()
req.r = r # save the resource in the request
resList.append(req)
# one yield that waits for all the requests
yield self.env.all_of(resList)
print(f'{self.env.now} object {self.id} res 1: {res1.count}, res 2 {res2.count}')

# start all the tasks and save the events
taskList = []
for t in tasks:
taskList.append(self.env.process(t(self.env, self.id)))
#one yield that waits for all the processes to finish
yield self.env.all_of(taskList)

for r in resList:
r.r.release(r)
print(f'{self.env.now} object {self.id} res 1: {res1.count}, res 2 {res2.count}')
print(f'{self.env.now} -- object {self.id} finish loop tasks')

print(f'{self.env.now} object {self.id} finish all tasks')


# some processes for a maintenance task
def task1(env, obj_id):
    print(f'{env.now} starting task 1 for object {obj_id}')
    yield env.timeout(3)
    print(f'{env.now} finish task 1 for object {obj_id}')

def task2(env, obj_id):
    print(f'{env.now} starting task 2 for object {obj_id}')
    yield env.timeout(3)
    print(f'{env.now} finish task 2 for object {obj_id}')

def task3(env, obj_id):
    print(f'{env.now} starting task 3 for object {obj_id}')
    yield env.timeout(3)
    print(f'{env.now} finish task 3 for object {obj_id}')


env = simpy.Environment()

workLocRes = simpy.Resource(env, capacity=3)
res1 = simpy.Resource(env, capacity=4)
res2 = simpy.Resource(env, capacity=5)

# build the maintenance tasks with nested lists of resources and processes
maintTask = []
maintTask.append(([res1], [task1]))
maintTask.append(([res1, res2], [task2, task3]))

# creating the asset also starts it
a = Asset(env, 1, maintTask)

env.run(20)

Regarding "python - How to run concurrent 'tasks' in Python using SimPy, where each task waits on multiple resources?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/58714823/
