
python - How to add a tf.constant to the output of a Keras model


I have a working model, built with:

model = tf.keras.Model(inputs=input_layers, outputs=outputs)

If I try to add a simple constant to the outputs, I get an error message. For example:

outputs = outputs + [tf.constant(['label1', 'label2'], dtype = tf.string)]
model = tf.keras.Model(inputs=input_layers, outputs=outputs)

Error message: AttributeError: Tensor.op is meaningless when eager execution is enabled.

Is there a way to add this to the model, even after training or at save() time?

The idea is to have the constant available as an output at serving time.

A full example of a network that raises the error:

import tensorflow as tf
import tensorflow.keras as keras

input = keras.layers.Input(shape=(2,))
hidden = keras.layers.Dense(10)(input)
output = keras.layers.Dense(3, activation='sigmoid')(hidden)
model = keras.models.Model(inputs=input, outputs=[output, tf.constant(['out1','out2','out3'], dtype=tf.string)])

The error:

in <module>
5 hidden = keras.layers.Dense(10)(input)
6 output = keras.layers.Dense(3, activation='sigmoid')(input)
----> 7 model = keras.models.Model(inputs=input, outputs=[output, tf.constant(['out1','out2','out3'], dtype=tf.string)])

/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training.py in __init__(self, *args, **kwargs)
144
145 def __init__(self, *args, **kwargs):
--> 146 super(Model, self).__init__(*args, **kwargs)
147 _keras_api_gauge.get_cell('model').set(True)
148 # initializing _distribution_strategy here since it is possible to call

/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/network.py in __init__(self, *args, **kwargs)
165 'inputs' in kwargs and 'outputs' in kwargs):
166 # Graph network
--> 167 self._init_graph_network(*args, **kwargs)
168 else:
169 # Subclassed network

/lib/python3.6/site-packages/tensorflow_core/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
455 self._self_setattr_tracking = False # pylint: disable=protected-access
456 try:
--> 457 result = method(self, *args, **kwargs)
458 finally:
459 self._self_setattr_tracking = previous_value # pylint: disable=protected-access

/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/network.py in _init_graph_network(self, inputs, outputs, name, **kwargs)
268
269 if any(not hasattr(tensor, '_keras_history') for tensor in self.outputs):
--> 270 base_layer_utils.create_keras_history(self._nested_outputs)
271
272 self._base_init(name=name, **kwargs)

/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/base_layer_utils.py in create_keras_history(tensors)
182 keras_tensors: The Tensors found that came from a Keras Layer.
183 """
--> 184 _, created_layers = _create_keras_history_helper(tensors, set(), [])
185 return created_layers
186

/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
208 if getattr(tensor, '_keras_history', None) is not None:
209 continue
--> 210 op = tensor.op # The Op that created this Tensor.
211 if op not in processed_ops:
212 # Recursively set `_keras_history`.

/lib/python3.6/site-packages/tensorflow_core/python/framework/ops.py in op(self)
1078 def op(self):
1079 raise AttributeError(
-> 1080 "Tensor.op is meaningless when eager execution is enabled.")
1081
1082 @property

AttributeError: Tensor.op is meaningless when eager execution is enabled.

Using Python 3.6 and TensorFlow 2.0.

Best Answer

Put the constant inside a Lambda layer. Keras does some extra bookkeeping, so plain tf operations are not enough for this to work; wrapping the constant in a Lambda layer takes care of that for you.

Edit, to give an example of how this works: your last example translates to the following code

import tensorflow as tf
import tensorflow.keras as keras

inputs = keras.layers.Input(shape=(2,))
hidden = keras.layers.Dense(10)(inputs)
output1 = keras.layers.Dense(3, activation='sigmoid')(hidden)

@tf.function
def const(tensor):
    # Broadcast the string constant across the batch dimension of the input.
    batch_size = tf.shape(tensor)[0]
    constant = tf.constant(['out1', 'out2', 'out3'], dtype=tf.string)
    constant = tf.expand_dims(constant, axis=0)
    return tf.broadcast_to(constant, shape=(batch_size, 3))

output2 = keras.layers.Lambda(const)(inputs)
model = keras.models.Model(inputs=inputs, outputs=[output1, output2])
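
As a quick sanity check (a minimal sketch assuming the model built above; the batch values are arbitrary), calling the model on a batch returns the sigmoid predictions plus the string constant repeated once per sample:

import numpy as np

# A hypothetical batch of 4 samples with 2 features each.
x = tf.constant(np.random.rand(4, 2), dtype=tf.float32)

preds, labels = model(x)
print(preds.shape)   # (4, 3) -> sigmoid outputs
print(labels.shape)  # (4, 3) -> the string constant broadcast across the batch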

Edit: this reminds me of a project a while back where I had to use a lot of constants in a Keras model. Back then I wrote a layer for it:

import copy

from tensorflow.keras import backend as K


class ConstantOnBatch(keras.layers.Layer):
    def __init__(self, constant, *args, **kwargs):
        # Keep the raw value so get_config() can serialize it later.
        self._initial_constant = copy.deepcopy(constant)
        self.constant = K.constant(constant)
        self.out_shape = self.constant.shape.as_list()
        self.constant = tf.reshape(self.constant, [1] + self.out_shape)
        super().__init__(*args, **kwargs)

    def build(self, input_shape):
        super().build(input_shape)

    def call(self, inputs):
        # Tile the constant along the batch dimension of the incoming tensor.
        batch_size = tf.shape(inputs)[0]
        output_shape = [batch_size] + self.out_shape
        return tf.broadcast_to(self.constant, output_shape)

    def compute_output_shape(self, input_shape):
        input_shape = input_shape.as_list()
        return [input_shape[0]] + self.out_shape

    def get_config(self):
        base_config = super().get_config()
        base_config['constant'] = self._initial_constant
        return base_config

    @classmethod
    def from_config(cls, config):
        return cls(**config)

It probably needs some updates for TF2, and the code could certainly be written more nicely, but if you need a lot of constants it might provide the basis for a slightly more elegant solution than using a large number of Lambda layers.
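
Assuming the class above, with whatever TF2 adjustments turn out to be needed, a minimal sketch of how it would slot into the earlier example in place of the Lambda layer (layer and variable names as defined above):

labels = ConstantOnBatch(['out1', 'out2', 'out3'])(inputs)
model = keras.models.Model(inputs=inputs, outputs=[output1, labels])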

Regarding python - How to add a tf.constant to the output of a Keras model, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/59458332/
