
tensorflow2.0 - TensorFlow v2 gradients not showing in TensorBoard histograms


I have a simple neural network, and I am trying to plot the gradients in TensorBoard by using a callback like the following:

class GradientCallback(tf.keras.callbacks.Callback):
    console = False
    count = 0
    run_count = 0

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        self.run_count += 1
        run_dir = logdir + "/gradients/run-" + str(self.run_count)
        with tf.summary.create_file_writer(run_dir).as_default(), tf.GradientTape() as g:
            # use test data to calculate the gradients
            _x_batch = test_images_scaled_reshaped[:100]
            _y_batch = test_labels_enc[:100]
            g.watch(_x_batch)
            _y_pred = self.model(_x_batch)  # forward-propagation
            per_sample_losses = tf.keras.losses.categorical_crossentropy(_y_batch, _y_pred)
            average_loss = tf.reduce_mean(per_sample_losses)  # Compute the loss value
            gradients = g.gradient(average_loss, self.model.weights)  # Compute the gradient

        for t in gradients:
            tf.summary.histogram(str(self.count), data=t)
            self.count += 1
            if self.console:
                print('Tensor: {}'.format(t.name))
                print('{}\n'.format(K.get_value(t)[:10]))

# Set up logging
!rm -rf ./logs/  # clear old logs
from datetime import datetime
import os
root_logdir = "logs"
run_id = datetime.now().strftime("%Y%m%d-%H%M%S")
logdir = os.path.join(root_logdir, run_id)


# register callbacks; these will be used for TensorBoard later
callbacks = [
    tf.keras.callbacks.TensorBoard(log_dir=logdir, histogram_freq=1,
                                   write_images=True, write_grads=True),
    GradientCallback()
]
Then I pass the callbacks during fit:
network.fit(train_pipe, epochs=epochs, batch_size=batch_size, validation_data=val_pipe, callbacks=callbacks)
Now, when I check TensorBoard, I can see the gradient runs in the left-hand filter, but nothing shows up in the Histograms tab:
[Screenshot: TensorBoard Histograms tab showing no gradient histograms]
What am I missing here? Am I logging the gradients correctly?

Best answer

It looks like the problem is that you write the histograms outside of the context of the tf summary writer.
I changed your code accordingly, but I haven't tried it.

class GradientCallback(tf.keras.callbacks.Callback):
    console = False
    count = 0
    run_count = 0

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        self.run_count += 1
        run_dir = logdir + "/gradients/run-" + str(self.run_count)
        with tf.summary.create_file_writer(run_dir).as_default():
            with tf.GradientTape() as g:
                # use test data to calculate the gradients
                _x_batch = test_images_scaled_reshaped[:100]
                _y_batch = test_labels_enc[:100]
                g.watch(_x_batch)
                _y_pred = self.model(_x_batch)  # forward-propagation
                per_sample_losses = tf.keras.losses.categorical_crossentropy(_y_batch, _y_pred)
                average_loss = tf.reduce_mean(per_sample_losses)  # Compute the loss value
                gradients = g.gradient(average_loss, self.model.weights)  # Compute the gradient

            # the histograms are now written inside the writer context;
            # TF2 summary ops also need an explicit step
            for nr, grad in enumerate(gradients):
                tf.summary.histogram(str(nr), data=grad, step=epoch)
                if self.console:
                    print('Tensor: {}'.format(grad.name))
                    print('{}\n'.format(K.get_value(grad)[:10]))
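
As a quick way to double-check the idea, here is a minimal, self-contained sketch (not part of the original answer; the toy model, data, and log path are made up for illustration). It shows the two things TF 2.x requires before histograms appear in TensorBoard: the tf.summary.histogram call must run inside an active writer context, and it must receive a step (or a default step set via tf.summary.experimental.set_step).

import numpy as np
import tensorflow as tf

# Hypothetical toy model and data, only to have some gradients to log.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
x = np.random.rand(32, 8).astype("float32")
y = tf.one_hot(np.random.randint(0, 3, size=32), depth=3)

writer = tf.summary.create_file_writer("logs/gradients_demo")

for epoch in range(3):
    # Compute gradients of the mean loss w.r.t. the trainable weights.
    with tf.GradientTape() as tape:
        preds = model(x, training=False)
        loss = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(y, preds))
    grads = tape.gradient(loss, model.trainable_weights)

    # Both conditions matter: the writer context AND an explicit step.
    with writer.as_default():
        for var, grad in zip(model.trainable_weights, grads):
            tf.summary.histogram(var.name.replace(":", "_") + "/grad", data=grad, step=epoch)
    writer.flush()

After running this, pointing TensorBoard at the log directory (tensorboard --logdir logs) should show the gradient histograms under the Histograms tab. The callback above behaves the same way once the histogram loop sits inside the writer context and passes a step; without a step, tf.summary.histogram raises an error in TF 2.x unless a default step has been set.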

Regarding "tensorflow2.0 - TensorFlow v2 gradients not showing in TensorBoard histograms", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/63514062/
