
python - How to implement clip_gradients_by_norm in TensorFlow 2.0?

Reposted · Author: 行者123 · Updated: 2023-12-01 00:53:07

I want to use tf.contrib.estimator.clip_gradients_by_norm in TF 2.0 as I did under TF 1.3, but now that contrib is gone I need a workaround, or even just some basic intuition about how it works.

I know this has already been raised as an issue on GitHub ( https://github.com/tensorflow/tensorflow/issues/28707 ), but I am hoping to find a solution as soon as possible.

# Use gradient descent as the optimizer for training the model.
my_optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.0000001)
my_optimizer = tf.contrib.estimator.clip_gradients_by_norm(my_optimizer, 5.0)

# Configure the linear regression model with our feature columns and optimizer.
# Set a learning rate of 0.0000001 for Gradient Descent.
linear_regressor = tf.estimator.LinearRegressor(
    feature_columns=feature_columns,
    optimizer=my_optimizer
)

For more information, see:

https://colab.research.google.com/notebooks/mlcc/first_steps_with_tensor_flow.ipynb?utm_source=mlcc&utm_campaign=colab-external&utm_medium=referral&utm_content=firststeps-colab&hl=en#scrollTo=ubhtW-NGU802

I have already tried using a custom gradient, as described here: https://www.tensorflow.org/guide/eager

@tf.custom_gradient
def clip_gradient_by_norm(x, norm):
    y = tf.identity(x)
    def grad_fn(dresult):
        return [tf.clip_by_norm(dresult, norm), None]
    return y, grad_fn

without success.
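For basic intuition about what norm clipping does, here is a minimal NumPy sketch of the operation tf.clip_by_norm performs on a single tensor: if the tensor's L2 norm exceeds the threshold, the tensor is rescaled so its norm equals the threshold; otherwise it is left unchanged.

```python
import numpy as np

def clip_by_norm(g, clip_norm):
    """Rescale g so its L2 norm is at most clip_norm (mirrors tf.clip_by_norm)."""
    norm = np.linalg.norm(g)
    if norm > clip_norm:
        g = g * (clip_norm / norm)
    return g

g = np.array([3.0, 4.0])             # L2 norm = 5.0
small = clip_by_norm(g, 5.0)         # unchanged: [3. 4.]
large = clip_by_norm(g * 2.0, 5.0)   # norm 10 rescaled back to 5: [3. 4.]
```

Note that the gradient's direction is preserved; only its magnitude is capped, which is what keeps a single huge gradient step from destabilizing training.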

Best Answer

Looking at the comments on this issue ( https://github.com/tensorflow/tensorflow/issues/28707#issuecomment-502336827 ),

I found that you can modify your code to look like this:

# Use gradient descent as the optimizer for training the model.
from tensorflow.keras import optimizers
# Note: newer TF releases use learning_rate= instead of the deprecated lr= alias.
my_optimizer = optimizers.SGD(lr=0.0000001, clipnorm=5.0)

# Configure the linear regression model with our feature columns and optimizer.
# Set a learning rate of 0.0000001 for Gradient Descent.
linear_regressor = tf.estimator.LinearRegressor(
    feature_columns=feature_columns,
    optimizer=my_optimizer
)

instead of:

# Use gradient descent as the optimizer for training the model.
my_optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.0000001)
my_optimizer = tf.contrib.estimator.clip_gradients_by_norm(my_optimizer, 5.0)

# Configure the linear regression model with our feature columns and optimizer.
# Set a learning rate of 0.0000001 for Gradient Descent.
linear_regressor = tf.estimator.LinearRegressor(
    feature_columns=feature_columns,
    optimizer=my_optimizer
)
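One subtlety worth knowing: the Keras clipnorm argument clips each gradient tensor independently by its own L2 norm, rather than clipping the global norm of all gradients at once (as tf.clip_by_global_norm does). A NumPy sketch of the per-tensor behavior:

```python
import numpy as np

def apply_clipnorm(grads, clip_norm):
    """Keras-style clipnorm: clip each gradient tensor independently so its
    own L2 norm is at most clip_norm. A large gradient for one variable is
    rescaled without affecting the gradients of other variables."""
    clipped = []
    for g in grads:
        norm = np.linalg.norm(g)
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        clipped.append(g * scale)
    return clipped

# Two gradient tensors for two variables: one huge, one small.
grads = [np.array([6.0, 8.0]), np.array([0.3, 0.4])]
clipped = apply_clipnorm(grads, 5.0)
# The first tensor is rescaled to norm 5; the second is left untouched.
```

This per-tensor semantics is usually close enough to the old contrib helper for the first-steps exercise, but if you specifically need global-norm clipping you would clip with tf.clip_by_global_norm in a custom training loop instead.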

Regarding "python - How to implement clip_gradients_by_norm in TensorFlow 2.0?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/56428659/
