
tensorflow - TensorRT creation exception when creating the engine from a subprocess

Reposted. Author: 行者123. Updated: 2023-12-05 07:35:20

I'm running into a problem when running TensorRT from a subprocess. I'm not sure whether this is a TensorRT bug or something I'm doing wrong. If it is an integration bug, I'd like to know whether it has already been resolved in the new TensorRT integration in TensorFlow 1.7.
Below is a summary of the error and how to reproduce it.

Working single-process TensorRT example Python code:

import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import tensorrt as trt
from tensorrt.parsers import uffparser

# Read the serialized UFF model and describe its input/output tensors.
uff_model = open('resnet_v2_50_dc.uff', 'rb').read()

parser = uffparser.create_uff_parser()
parser.register_input("input", (3, 224, 224), 0)
parser.register_output("resnet_v2_50/predictions/Reshape_1")


trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)

# Build the TensorRT engine from the parsed UFF stream.
engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                     stream=uff_model,
                                     parser=parser,
                                     max_batch_size=4,
                                     max_workspace_size=1 << 30,
                                     datatype=trt.infer.DataType.FLOAT)

Non-working TensorRT example Python code, where
trt.utils.uff_to_trt_engine() is called from a subprocess:

import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import tensorrt as trt
from tensorrt.parsers import uffparser
import multiprocessing
from multiprocessing import sharedctypes, Queue

def inference_process():
    uff_model = open('resnet_v2_50_dc.uff', 'rb').read()

    parser = uffparser.create_uff_parser()
    parser.register_input("input", (3, 224, 224), 0)
    parser.register_output("resnet_v2_50/predictions/Reshape_1")

    trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
    engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                         stream=uff_model,
                                         parser=parser,
                                         max_batch_size=4,
                                         max_workspace_size=1 << 30,
                                         datatype=trt.infer.DataType.FLOAT)

inference_p = multiprocessing.Process(target=inference_process, args=())
inference_p.start()

Console error message:

[TensorRT] ERROR: cudnnLayerUtils.cpp (288) - Cuda Error in smVersion: 3
terminate called after throwing an instance of 'nvinfer1::CudaError'
what(): std::exception
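
A likely cause (my reading, not stated in the thread): importing the GPU libraries in the parent initializes CUDA state there, and multiprocessing on Linux forks the parent, so the child inherits that state. A CUDA context is not valid after a fork, and TensorRT fails as soon as it touches the GPU. The inheritance problem can be sketched without any GPU library, using the parent's PID as a stand-in for the import-time CUDA context:

```python
import os
import multiprocessing

# Stand-in for GPU library import side effects: a process-wide resource
# grabbed at import time in the parent (here, just the parent's PID).
IMPORT_TIME_PID = os.getpid()

def child(q):
    # After fork, the child still sees the parent's import-time state,
    # just as it would inherit the parent's (now unusable) CUDA context.
    q.put((IMPORT_TIME_PID, os.getpid()))

def demo():
    ctx = multiprocessing.get_context("fork")  # the default on Linux
    q = ctx.Queue()
    p = ctx.Process(target=child, args=(q,))
    p.start()
    inherited, actual = q.get()
    p.join()
    return inherited, actual
```

Calling demo() returns a pair where `inherited` is the parent's PID and `actual` is the child's own: the inherited state describes a process the child is not. A CUDA context inherited across a fork is stale in the same way.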

Best Answer

You should import TensorRT inside the subprocess!

Something like:

import pycuda.driver as cuda
import pycuda.autoinit
import argparse
import numpy as np
import time
import multiprocessing
from multiprocessing import sharedctypes, Queue

def inference_process():
    # Import TensorRT inside the child so its CUDA state is created in
    # this process rather than inherited from the parent.
    import tensorrt as trt
    from tensorrt.parsers import uffparser

    uff_model = open('resnet_v2_50_dc.uff', 'rb').read()

    parser = uffparser.create_uff_parser()
    parser.register_input("input", (3, 224, 224), 0)
    parser.register_output("resnet_v2_50/predictions/Reshape_1")

    trt_logger = trt.infer.ConsoleLogger(trt.infer.LogSeverity.INFO)
    engine = trt.utils.uff_to_trt_engine(logger=trt_logger,
                                         stream=uff_model,
                                         parser=parser,
                                         max_batch_size=4,
                                         max_workspace_size=1 << 30,
                                         datatype=trt.infer.DataType.FLOAT)

inference_p = multiprocessing.Process(target=inference_process, args=())
inference_p.start()
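
The pattern above, stripped of the GPU-specific pieces, shows why deferring heavy initialization to the child works: everything the child needs is set up after the fork, inside its own process. In this sketch a plain computation stands in for the engine build, and a Queue carries the result back:

```python
import multiprocessing

def inference_process(q):
    # In the real code, the TensorRT / parser imports would go here, so
    # all CUDA state is created in *this* process after the fork.
    # A plain computation stands in for building the engine.
    engine = sum(range(10))
    q.put(engine)

def run():
    # "fork" is the default start method on Linux, matching the
    # question's setup.
    ctx = multiprocessing.get_context("fork")
    q = ctx.Queue()
    p = ctx.Process(target=inference_process, args=(q,))
    p.start()
    result = q.get()  # fetch before join() to avoid blocking on a full pipe
    p.join()
    return result
```

An alternative worth trying (my suggestion, not from the answer) is multiprocessing.set_start_method('spawn'), which starts the child as a fresh interpreter so no CUDA context is inherited at all; it requires the usual `if __name__ == '__main__':` guard around the process-starting code.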

Regarding "tensorflow - TensorRT creation exception when creating the engine from a subprocess", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49640642/
