
python - How to perform a simple CLI query against a saved estimator model?


I have successfully trained a DNNClassifier to classify text (posts from an online discussion board). I have saved the model and now want to classify text using the TensorFlow CLI.

When I run saved_model_cli show for my saved model, I get the following output:

saved_model_cli show --dir /my/model --tag_set serve --signature_def predict
The given SavedModel SignatureDef contains the following input(s):
  inputs['examples'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: dnn/head/predictions/ExpandDims:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: dnn/head/predictions/str_classes:0
  outputs['logistic'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: dnn/head/predictions/logistic:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/predict
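
(As an aside, to inspect every tag-set and signature of the export in one go, saved_model_cli also offers an --all flag; a possible invocation, assuming a reasonably recent TensorFlow 1.x CLI:)

saved_model_cli show --dir /my/model --all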

I can't figure out the correct arguments for saved_model_cli run to get a prediction.

I have tried several approaches, for example:

saved_model_cli run --dir /my/model --tag_set serve --signature_def predict --input_exprs='examples=["klassifiziere mich bitte"]'

which gives me this error message:

InvalidArgumentError (see above for traceback): Could not parse example input, value: 'klassifiziere mich bitte'
[[Node: ParseExample/ParseExample = ParseExample[Ndense=1, Nsparse=0, Tdense=[DT_STRING], dense_shapes=[[1]], sparse_types=[], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/ParseExample/names)]]

What is the correct way to pass my input string to the CLI to get a classification?

You can find the code of my project, including the training data, on GitHub: https://github.com/pahund/beitragstuev

This is how I build and save my model (simplified; see GitHub for the original code):

embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")
feature_columns = [embedded_text_feature_column]
estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(export_dir_base="/my/dir/base", serving_input_receiver_fn=serving_input_receiver_fn)
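
(One detail that is easy to miss: export_savedmodel creates a timestamped subdirectory under export_dir_base and returns its path, and that path is what gets passed as --dir to saved_model_cli. A minimal sketch; the printed path is only an illustration:)

export_dir = estimator.export_savedmodel(
    export_dir_base="/my/dir/base",
    serving_input_receiver_fn=serving_input_receiver_fn)
# export_dir points at the timestamped export, e.g. b"/my/dir/base/1531392000";
# that directory is the --dir argument for the saved_model_cli commands.
print(export_dir)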

Best Answer

The ServingInputReceiver you're creating for the model export is telling the saved model to expect serialized tf.Example protos instead of the raw strings you wish to classify.

From the Save and Restore documentation:

A typical pattern is that inference requests arrive in the form of serialized tf.Examples, so the serving_input_receiver_fn() creates a single string placeholder to receive them. The serving_input_receiver_fn() is then also responsible for parsing the tf.Examples by adding a tf.parse_example op to the graph.

....

The tf.estimator.export.build_parsing_serving_input_receiver_fn utility function provides that input receiver for the common case.

Your exported model therefore contains a tf.parse_example op that expects to receive serialized tf.Example protos satisfying the feature spec you passed to build_parsing_serving_input_receiver_fn, i.e. in your case it expects serialized Examples that have a sentence feature. To predict with the model, you have to supply those serialized protos.
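
(To make that concrete, here is a minimal sketch of what the parsing side of the exported graph roughly does; the feature spec shown is an assumption reconstructed from the ParseExample attributes in the error above, not copied from the TF Hub column's implementation:)

import tensorflow as tf

# Assumed parse spec for the "sentence" column (a dense string feature of shape [1])
feature_spec = {"sentence": tf.FixedLenFeature(shape=[1], dtype=tf.string)}

# The export graph receives a batch of serialized tf.Example strings ...
serialized = tf.placeholder(tf.string, shape=[None], name="input_example_tensor")
# ... and parses them into the "sentence" feature the DNNClassifier was trained on
features = tf.parse_example(serialized, feature_spec)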

Fortunately, TensorFlow makes it fairly easy to build these. Here is a possible function that returns an expression mapping the examples input key to a batch of strings, which you can then pass to the CLI:

import tensorflow as tf

def serialize_example_string(strings):

    serialized_examples = []
    for s in strings:
        # encode the raw string as bytes (Python 2 strings are already bytes)
        try:
            value = [bytes(s, "utf-8")]
        except TypeError:  # python 2
            value = [bytes(s)]

        # wrap the string in a tf.Example with a single "sentence" feature
        example = tf.train.Example(
            features=tf.train.Features(
                feature={
                    "sentence": tf.train.Feature(bytes_list=tf.train.BytesList(value=value))
                }
            )
        )
        serialized_examples.append(example.SerializeToString())

    # build the --input_exprs value: examples=[b"...", b"...", ...]
    return "examples=" + repr(serialized_examples).replace("'", "\"")

So, using some strings pulled from your example data:

strings = ["klassifiziere mich bitte",
"Das Paket „S Line Competition“ umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Zöller und LED-Lampen.",
"(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]

print (serialize_example_string(strings))

The CLI command is then:

saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_exprs='examples=[b"\n*\n(\n\x08sentence\x12\x1c\n\x1a\n\x18klassifiziere mich bitte", b"\n\x98\x01\n\x95\x01\n\x08sentence\x12\x88\x01\n\x85\x01\n\x82\x01Das Paket \xe2\x80\x9eS Line Competition\xe2\x80\x9c umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Z\xc3\xb6ller und LED-Lampen.", b"\np\nn\n\x08sentence\x12b\n`\n^(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]'

which should give you the desired results:

Result for output key class_ids:
[[0]
[1]
[0]]
Result for output key classes:
[[b'0']
[b'1']
[b'0']]
Result for output key logistic:
[[0.05852016]
[0.88453305]
[0.04373989]]
Result for output key logits:
[[-2.7780817]
[ 2.0360758]
[-3.0847695]]
Result for output key probabilities:
[[0.94147986 0.05852016]
[0.11546692 0.88453305]
[0.9562601 0.04373989]]
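
(Depending on your TensorFlow version, you may be able to skip the manual serialization entirely: saved_model_cli run also accepts an --input_examples flag that builds the tf.Example protos for you from a dict of features. The exact syntax below is an assumption based on recent TF 1.x CLIs, so verify it against saved_model_cli run --help:)

saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_examples 'examples=[{"sentence":["klassifiziere mich bitte"]}]'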

Regarding python - How to perform a simple CLI query against a saved estimator model?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51212160/
