
docker - How to serve a tensorflow model with the docker image tensorflow/serving when it has custom ops?


I am trying to use the tf-sentencepiece op in a model, found here: https://github.com/google/sentencepiece/tree/master/tensorflow

I have no problem building the model and getting a saved_model.pb file with variables and assets. However, if I try to use the docker image for tensorflow/serving, it says:

Loading servable: {name: model version: 1} failed: 
Not found: Op type not registered 'SentencepieceEncodeSparse' in binary running on 0ccbcd3998d1.
Make sure the Op and Kernel are registered in the binary running in this process.
Note that if you are loading a saved graph which used ops from tf.contrib, accessing
(e.g.) `tf.contrib.resampler` should be done before importing the graph,
as contrib ops are lazily registered when the module is first accessed.

I am not familiar with building anything manually, and was hoping I could get this working without many changes.
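Since op type names are stored as plain strings inside saved_model.pb, a quick way to confirm that the exported graph really references the custom op (i.e. that the problem is the serving binary rather than the export) is to grep the file. A rough sketch, where the export path is only a placeholder:

    # grep -a treats the binary protobuf as text; the op name appears as an ASCII string
    $ grep -a -q 'SentencepieceEncodeSparse' /path/to/export/1/saved_model.pb \
        && echo "op is referenced in the exported graph"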

Best Answer

One way to do it is as follows:

  • Pull the docker development image

    $ docker pull tensorflow/serving:latest-devel
  • In the container, make your code changes

    $ docker run -it tensorflow/serving:latest-devel

  • Modify the code to add the op dependency here
  • In the container, build TensorFlow Serving

    container:$ bazel build tensorflow_serving/model_servers:tensorflow_model_server && cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/
  • Exit the container with the exit command
  • Find the container ID:

    $ docker ps
  • Commit the development image using that container ID:

    $ docker commit <container id> $USER/tf-serving-devel-custom-op
  • Now build the serving container using the development container as the source

    $ mkdir /tmp/tfserving

    $ cd /tmp/tfserving

    $ git clone https://github.com/tensorflow/serving

    $ cd serving

    $ docker build -t $USER/tensorflow-serving --build-arg TF_SERVING_BUILD_IMAGE=$USER/tf-serving-devel-custom-op -f tensorflow_serving/tools/docker/Dockerfile .
  • You can now serve your model with the $USER/tensorflow-serving image by following the Docker instructions (see the sketch after this list)
  • This question about serving a tensorflow model with custom ops using the docker image tensorflow/serving is based on a similar question found on Stack Overflow: https://stackoverflow.com/questions/52772605/
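Once the custom image is built, serving works the same way as with the stock tensorflow/serving image, just under your own image name. A minimal sketch following the standard TensorFlow Serving Docker instructions, assuming the SavedModel export lives under /path/to/my_model/1 on the host and the model is named model (both are placeholders):

    # Start the custom-built server, exposing the REST port and mounting the model directory
    $ docker run -p 8501:8501 \
        --mount type=bind,source=/path/to/my_model,target=/models/model \
        -e MODEL_NAME=model -t $USER/tensorflow-serving

    # Query it over the REST API (the request body depends on your model's serving signature)
    $ curl -X POST http://localhost:8501/v1/models/model:predict \
        -d '{"instances": ["some text to encode"]}'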
