
python - OpenCV + Python + multiprocessing not working


I think I have found a bug in OpenCV's Python bindings, but since the problem usually sits between the chair and the keyboard rather than in the code, I would like to confirm it here before filing a ticket.

Here is a simple script that processes a bunch of images in parallel:

import cv2
import multiprocessing
import glob
import numpy

def job(path, output):
    # Load one image, convert it to grayscale, and report the path back.
    image = cv2.imread(path)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    output.put(path)

if __name__ == "__main__":

    main_image = cv2.imread("./image.png")
    main_image = cv2.cvtColor(main_image, cv2.COLOR_BGR2GRAY)

    output = multiprocessing.Queue()

    processes = []

    for path in glob.glob("./data/*"):

        process = multiprocessing.Process(
            target=job,
            args=(path, output))

        process.start()
        processes.append(process)

    for process in processes:
        process.join()

    # Collect all results
    results = [output.get() for process in processes]

    print 'Finished'

In this code, results = [output.get() for process in processes] never completes. Now the really strange part: if I comment out the main_image = cv2.cvtColor(main_image, cv2.COLOR_BGR2GRAY) line, which should have no effect whatsoever on the parallel computation, the script finishes.

./image.png and the paths in ./data/ all point to ordinary images, about 20 in total. I tried creating the images in memory instead (numpy.ones([100, 100, 3]).astype(numpy.float32)) and that did not produce the error.
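For concreteness, the in-memory test looked roughly like this (a sketch from memory, not the exact script; the synthetic array simply replaces the cv2.imread() call in the worker):

import cv2
import multiprocessing
import numpy

def job(path, output):
    # Synthetic image instead of reading from disk; with this, the
    # script ran to completion without hanging.
    image = numpy.ones([100, 100, 3]).astype(numpy.float32)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    output.put(path)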

I have similar code written in C++ and it runs fine. My environment: OS X 10.10, OpenCV 3.0.0, Python 2.7.

So, am I doing something stupid, or is this really a bug in OpenCV that shows up under parallel computation?


Edit: I also tried an implementation using multiprocessing.Pool.map(), with the same result. Here is the code:

import cv2
import multiprocessing
import glob
import numpy

def job(path):
    # Same worker as before: load, convert to grayscale, return the path.
    image = cv2.imread(path)
    image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    return path

if __name__ == "__main__":

    image = cv2.imread("./image.png")
    image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    paths = glob.glob("./data/*")
    pool = multiprocessing.Pool()
    result = pool.map(job, paths)

    print 'Finished'

    for value in result:
        print value

With this kind of design and non-OpenCV tasks I was able to get correct results, so I strongly believe the problem is on the OpenCV side. But feel free to prove me wrong; I would be happy about that, since it would mean I don't have to fall back to C++.
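By a non-OpenCV task I mean a pure-Python worker in the same structure, for example (a hypothetical stand-in, not the exact code I ran):

import multiprocessing

def job(path):
    # Any pure-Python work in place of the OpenCV calls.
    return path.upper()

if __name__ == "__main__":
    pool = multiprocessing.Pool()
    result = pool.map(job, ["./a.png", "./b.png", "./c.png"])
    print result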

Best Answer

Shouldn't you get() before you join()?

According to the Python docs:

Joining processes that use queues

Bear in mind that a process that has put items in a queue will wait before terminating until all the buffered items are fed by the “feeder” thread to the underlying pipe. (The child process can call the cancel_join_thread() method of the queue to avoid this behaviour.)

This means that whenever you use a queue you need to make sure that all items which have been put on the queue will eventually be removed before the process is joined. Otherwise you cannot be sure that processes which have put items on the queue will terminate. Remember also that non-daemonic processes will be joined automatically.

An example which will deadlock is the following:

from multiprocessing import Process, Queue

def f(q):
    q.put('X' * 1000000)

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=f, args=(queue,))
    p.start()
    p.join()                    # this deadlocks
    obj = queue.get()

A fix here would be to swap the last two lines (or simply remove the p.join() line).
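Applied to the first script in the question, that fix would look roughly like this (a sketch: drain the queue before joining the workers, so each child's feeder thread can flush its buffer and the child can exit):

# Collect all results first...
results = [output.get() for process in processes]

# ...then join the workers.
for process in processes:
    process.join()

print 'Finished'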

About python - OpenCV + Python + multiprocessing not working, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31870541/
