python rq - How to trigger a job once several other jobs have finished? Multi-job dependencies?


I have a nested job structure in my python redis queue. The rncopy job runs first. Once it completes, three dependent registration jobs follow. When the computation of all three of these jobs has finished, I want to trigger a job that sends a websocket notification to my frontend.

My current attempt:

    rncopy = redisqueue.enqueue(raw_nifti_copymachine, patientid, imagepath, timeout=6000)
    t1c_reg = redisqueue.enqueue(modality_registrator, patientid, "t1c", timeout=6000, depends_on=rncopy)
    t2_reg = redisqueue.enqueue(modality_registrator, patientid, "t2", timeout=6000, depends_on=rncopy)
    fla_reg = redisqueue.enqueue(modality_registrator, patientid, "fla", timeout=6000, depends_on=rncopy)
    notify = redisqueue.enqueue(print, patient_finished, patientid, timeout=6000, depends_on=(t1c_reg, t2_reg, fla_reg))

Unfortunately, it seems the multi-job dependency feature was never merged into master. I can see two pull requests for it on GitHub at the moment. Is there a workaround I can use?

Apologies for not being able to provide a reproducible example.

Best Answer

I created an "rq-manager" to solve a similar problem with multiple and tree-like dependencies:

https://github.com/crispyDyne/rq-manager

A project structure with multiple dependencies looks like this:

    def simpleTask(x):
        return 2 * x

    project = {'jobs': [
        {
            'blocking': True,  # this job must finish before moving on.
            'func': simpleTask, 'args': 0
        },
        {
            'blocking': True,  # this job, and its child jobs, must finish before moving on.
            'jobs': [  # these child jobs will run in parallel
                {'func': simpleTask, 'args': 1},
                {'func': simpleTask, 'args': 2},
                {'func': simpleTask, 'args': 3}],
        },
        {  # this job will only run when the blocking jobs above finish.
            'func': simpleTask, 'args': 4
        }
    ]}
The project is then handed to the manager to execute:
    from rq_manager import manager, getProjectResults

    managerJob = q.enqueue(manager, project)
    projectResults = getProjectResults(managerJob)

which returns:

    projectResults = [0, [2, 4, 6], 8]
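
If your registration jobs do not actually need the return value of the copy job, your example maps directly onto this flat structure. A sketch (the function names and arguments are taken from your question, the layout from the example above):

    # Sketch only: reuses the 'blocking'/'jobs'/'func'/'args' keys shown above
    # with the function names from the question.
    project = {'jobs': [
        {
            'blocking': True,  # the copy job must finish first
            'func': raw_nifti_copymachine, 'args': (patientid, imagepath)
        },
        {
            'blocking': True,  # all three registrations must finish before moving on
            'jobs': [  # these child jobs run in parallel
                {'func': modality_registrator, 'args': (patientid, "t1c")},
                {'func': modality_registrator, 'args': (patientid, "t2")},
                {'func': modality_registrator, 'args': (patientid, "fla")},
            ],
        },
        {  # runs only once the blocking jobs above have finished
            'func': print, 'args': (patient_finished, patientid)
        }
    ]}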

When the dependent jobs need results from the parent, I create a function that executes the first job and then adds further jobs to the project. So for your example:
    def firstTask(patientid, imagepath):
        raw_nifti_result = raw_nifti_copymachine(patientid, imagepath)

        moreTasks = {'jobs': [
            {'func': modality_registrator, 'args': (patientid, "t1c", raw_nifti_result)},
            {'func': modality_registrator, 'args': (patientid, "t2", raw_nifti_result)},
            {'func': modality_registrator, 'args': (patientid, "fla", raw_nifti_result)},
        ]}

        # returning a dictionary with an "addJobs" key adds those tasks to the project.
        return {'result': raw_nifti_result, 'addJobs': moreTasks}
The project then looks like this:
    project = {'jobs': [
        {
            'blocking': True,  # this job, and its child jobs, must finish before moving on.
            'jobs': [
                {
                    'func': firstTask, 'args': (patientid, imagepath),
                    'blocking': True,  # this job must finish before moving on.
                },
                # "moreTasks" will be added here
            ]
        },
        {  # this job will only run when the blocking jobs above finish.
            'func': print, 'args': (patient_finished, patientid)
        }
    ]}

If the final job needs the results of the previous jobs, set the "previousJobArgs" flag. "finalJob" then receives an array of the previous results, with nested arrays for the results of child jobs.
    def finalJob(previousResults):
        # previousResults = [
        #     raw_nifti_copymachine(patientid, imagepath),
        #     [
        #         modality_registrator(patientid, "t1c", raw_nifti_result),
        #         modality_registrator(patientid, "t2", raw_nifti_result),
        #         modality_registrator(patientid, "fla", raw_nifti_result),
        #     ]
        # ]
        return doSomethingWith(previousResults)

The project then looks like this:
    project = {'jobs': [
        {
            # 'blocking': True,  # blocking not needed.
            'jobs': [
                {
                    'func': firstTask, 'args': (patientid, imagepath),
                    'blocking': True,  # this job must finish before moving on.
                },
                # "moreTasks" will be added here
            ]
        },
        {  # this job will wait, since it needs the previous job's results.
            'func': finalJob, 'previousJobArgs': True  # it gets all the previous jobs' results
        }
    ]}
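
The project is enqueued through the manager exactly as in the simple example above. As a sketch (assuming the result ordering shown earlier, where the last top-level job contributes the last entry), the notification data would come out of the final entry of the project results:

    # Sketch: same manager entry point as above; q is your rq Queue instance.
    managerJob = q.enqueue(manager, project)
    projectResults = getProjectResults(managerJob)

    final_result = projectResults[-1]  # assumed to be finalJob's return value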

Hopefully https://github.com/rq/rq/issues/260 gets implemented and my solution becomes obsolete!
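
If you are on a newer rq release where depends_on accepts several jobs (to my knowledge this landed around rq 1.8, after this answer was written), the notification job from your question can be expressed natively, roughly like this:

    # Assumes an rq version whose depends_on parameter accepts a list of jobs.
    notify = redisqueue.enqueue(print, patient_finished, patientid, timeout=6000,
                                depends_on=[t1c_reg, t2_reg, fla_reg])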

Regarding "python rq - How to trigger a job once several other jobs have finished? Multi-job dependencies?", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/49469546/
