python - TFX - REST API to get predictions without serialized data input

Reposted. Author: 行者123. Updated: 2023-12-05 07:06:57

I'm new to TFX. I followed the Keras tutorial and successfully built a TFX pipeline with my own data. When I serve the model through TF Serving in Docker, my data input has to be serialized as shown below to get a prediction back.

How can I feed data to the REST API without serializing it? For that I created a second function, _get_serve_raw. The Trainer runs successfully, but I can't seem to call the model with raw data through the REST API. I tested multiple formats, and each one returned a different error.

What should I do in the _get_serve_raw function so that the model accepts input without base64 data?

FYI - the model takes two string inputs.

Below is my run_fn in the TFX Trainer:

def _get_serve_tf_examples_fn(model, tf_transform_output):

  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_tf_examples_fn(serialized_tf_examples):
    """Returns the output to be used in the serving signature."""
    feature_spec = tf_transform_output.raw_feature_spec()
    feature_spec.pop(features.LABEL_KEY)
    parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)

    transformed_features = model.tft_layer(parsed_features)
    transformed_features.pop(features.transformed_name(features.LABEL_KEY))

    outputs = model(transformed_features)
    return {'outputs': outputs}

  return serve_tf_examples_fn
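For context, the serialized path above is what forces the base64 dance on the client side: TF Serving's REST API requires DT_STRING bytes (such as a serialized tf.train.Example) to be wrapped as a {"b64": ...} object. A minimal stdlib-only sketch of building such a request body, where `serialized` is a placeholder standing in for the output of `example.SerializeToString()`:

```python
import base64
import json

# Placeholder bytes; in practice this would come from
# tf.train.Example(...).SerializeToString().
serialized = b"placeholder-example-bytes"

payload = {
    "signature_name": "serving_default",
    "instances": [
        # TF Serving decodes {"b64": ...} back into DT_STRING bytes
        # before feeding the 'examples' input tensor.
        {"examples": {"b64": base64.b64encode(serialized).decode("utf-8")}}
    ],
}
print(json.dumps(payload))
```

This is the request shape the question is trying to avoid by adding a raw-input signature.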

def _get_serve_raw(model, tf_transform_output):

  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_raw_fn(country_code, project_type):

    country_code_sp_tensor = tf.sparse.SparseTensor(
        indices=[[0, 0]],
        values=country_code,
        dense_shape=(1, 1)
    )

    project_type_sp_tensor = tf.sparse.SparseTensor(
        indices=[[0, 0]],
        values=project_type,
        dense_shape=(1, 1)
    )

    parsed_features = {'Country_Code': country_code_sp_tensor,
                       'Project_Type': project_type_sp_tensor}

    transformed_features = model.tft_layer(parsed_features)
    transformed_features.pop(_transformed_name(_LABEL_KEY_EA))

    outputs = model(transformed_features)
    return {'outputs': outputs}

  return serve_raw_fn
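One thing worth noting about the function above: the SparseTensors are pinned to `dense_shape=(1, 1)`, i.e. exactly one example, even though the signature declares `shape=[None]` inputs. A plain-Python sketch (no TensorFlow needed) of how the COO triplet that `tf.sparse.SparseTensor` expects would generalize to a batch:

```python
def coo_for_batch(values):
    """Build (indices, values, dense_shape) for a [batch, 1] sparse tensor.

    Mirrors the COO layout tf.sparse.SparseTensor takes: one [row, col]
    index per value, plus the overall dense shape.
    """
    indices = [[i, 0] for i in range(len(values))]
    dense_shape = (len(values), 1)
    return indices, values, dense_shape

print(coo_for_batch(["US", "DE"]))
```

A batch-aware serving function would compute these from the incoming tensor's size rather than hard-coding a single row.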



signatures = {
    "serving_default": _get_serve_tf_examples_fn(model, tf_transform_output).get_concrete_function(
        tf.TensorSpec(shape=[None], dtype=tf.string, name='examples')),
    "serving_raw": _get_serve_raw(model, tf_transform_output).get_concrete_function(
        tf.TensorSpec(shape=[None], dtype=tf.string, name='country_code'),
        tf.TensorSpec(shape=(None), dtype=tf.string, name='project_type')),
}

model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures)

Serving signatures:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_default_examples:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict

signature_def['serving_raw']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['raw'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_raw_raw:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall_1:0
  Method name is: tensorflow/serving/predict

Test and error 1:

url = 'http://localhost:8501/v1/models/ea:predict'
headers = {"content-type": "application/json"}
data = {
    "signature_name": "serving_raw",
    "instances": [
        {
            "raw": {"country_code": "US",
                    "project_type": "Delivery"}
        }
    ]
}

data = json.dumps(data)
print(data)
json_response = requests.post(url, data=data, headers=headers)
print(json_response.content)
print(json_response.json())

b'{\n "error": "Failed to process element: 0 key: raw of \'instances\' list. Error: Invalid argument: JSON Value: {\\n \\"country_code\\": \\"US\\",\\n \\"project_type\\": \\"Delivery\\"\\n} not formatted correctly for base64 data"\n}'
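The error arises because 'raw' is a DT_STRING input, so TF Serving interprets the nested JSON object as a {"b64": ...} wrapper and fails to decode it. A hedged sketch of the row format the concrete function's own named inputs ('country_code' and 'project_type') would suggest, assuming the model is exported with those two tensors exposed by name: each instance maps tensor names directly to plain strings, with no "raw" wrapper and no base64.

```python
import json

# Hypothetical request body for a signature exposing two named string
# inputs; plain strings need no {"b64": ...} wrapping.
payload = {
    "signature_name": "serving_raw",
    "instances": [
        {"country_code": "US", "project_type": "Delivery"},
    ],
}
print(json.dumps(payload))
```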

Test and error 2:

url = 'http://localhost:8501/v1/models/ea:predict'
headers = {"content-type": "application/json"}
data = {
    "signature_name": "serving_raw",
    "instances": [
        {
            "raw": {"b64": "US",
                    "b64": "Delivery"}
        }
    ]
}

data = json.dumps(data)
print(data)
json_response = requests.post(url, data=data, headers=headers)
print(json_response.content)
print(json_response.json())

b'{\n "error": "You must feed a value for placeholder tensor \'StatefulPartitionedCall_1/StatefulPartitionedCall/transform_features_layer_1/transform/transform/inputs/F_Project_Type/shape\' with dtype int64 and shape [2]\\n\\t [[{{node transform_features_layer_1/transform/transform/inputs/F_Project_Type/shape}}]]"\n}'
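Apart from the placeholder-shape error, note that the instance in Test 2 repeats the key "b64": a Python dict literal (and most JSON parsers) keeps only the last value, so "US" never reaches the server at all. A quick stdlib check:

```python
import base64

# Duplicate keys in a dict literal silently collapse to the last value.
instance = {"b64": "US", "b64": "Delivery"}
print(instance)

# A real {"b64": ...} wrapper would also carry base64-encoded text,
# not the raw string itself:
encoded = base64.b64encode(b"Delivery").decode("utf-8")
print(encoded)
```

So even setting the shape problem aside, this request format can only ever deliver one of the two values.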

Best answer

After retesting, Test 1 now executes successfully.

Regarding "python - TFX - REST API to get predictions without serialized data input", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62314939/
