
python - How to decrease the learning rate of the SGD optimizer in TensorFlow 2.0?


I want to decrease the learning rate of the SGD optimizer in TensorFlow 2.0. I used this line of code: tf.keras.optimizers.SGD(learning_rate, decay=lr_decay, momentum=0.9), but I don't know whether my learning rate is actually decaying. How can I get the current learning rate?

Best Answer

print(model.optimizer._decayed_lr('float32').numpy())

will do it. _decayed_lr() computes the decayed learning rate as a function of the iteration count and the decay setting. A complete example follows.

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import SGD
import numpy as np

# Minimal single-layer model compiled with time-based lr decay
ipt = Input((12,))
out = Dense(12)(ipt)
model = Model(ipt, out)
model.compile(SGD(1e-4, decay=1e-2), loss='mse')

x = y = np.random.randn(32, 12)  # dummy data
for iteration in range(10):
    model.train_on_batch(x, y)
    print("lr at iteration {}: {}".format(
        iteration + 1, model.optimizer._decayed_lr('float32').numpy()))
# OUTPUTS
lr at iteration 1: 9.900989971356466e-05
lr at iteration 2: 9.803921420825645e-05
lr at iteration 3: 9.708738070912659e-05
lr at iteration 4: 9.61538462433964e-05
lr at iteration 5: 9.523809421807528e-05
lr at iteration 6: 9.433962259208784e-05
lr at iteration 7: 9.345793660031632e-05
lr at iteration 8: 9.259258513338864e-05
lr at iteration 9: 9.174311708193272e-05
lr at iteration 10: 9.09090886125341e-05
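
The printed values follow Keras' time-based decay formula, lr = initial_lr / (1 + decay * iterations). As a sanity check, this short sketch recomputes the expected values from that formula, with initial_lr and decay mirroring the example above:

initial_lr, decay = 1e-4, 1e-2
for iteration in range(1, 11):
    # Time-based decay: lr = initial_lr / (1 + decay * iterations)
    print("expected lr at iteration {}: {}".format(
        iteration, initial_lr / (1 + decay * iteration)))

For instance, iteration 1 gives 1e-4 / 1.01 ≈ 9.90099e-05, which matches the first printed line above up to float32 precision.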

Regarding "python - How to decrease the learning rate of the SGD optimizer in TensorFlow 2.0?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58740496/
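
A follow-up note: the decay argument and the private _decayed_lr() method belong to the legacy Keras optimizer interface. In newer TF 2.x releases, the usual replacement is a public learning-rate schedule. Below is a sketch, not the answer's original code, using tf.keras.optimizers.schedules.InverseTimeDecay with parameter values chosen to mirror the example above:

import tensorflow as tf

# InverseTimeDecay computes
#   initial_learning_rate / (1 + decay_rate * step / decay_steps),
# which with decay_steps=1 matches the decay=1e-2 behavior above.
schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=1e-4,
    decay_steps=1,
    decay_rate=1e-2)
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule, momentum=0.9)

# Schedules are callable with a step, so the current learning rate can be
# read without any private API:
print(float(schedule(optimizer.iterations)))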
