tensorflow2.0 - TensorFlow v2 gradients not showing on TensorBoard histograms

Tags: tensorflow2.0 tensorboard gradienttape

I have a simple neural network, and I'm trying to plot the gradients on TensorBoard by using a callback like the following:

class GradientCallback(tf.keras.callbacks.Callback):
    console = False
    count = 0
    run_count = 0

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        self.run_count += 1
        run_dir = logdir+"/gradients/run-" + str(self.run_count)
        with tf.summary.create_file_writer(run_dir).as_default(),tf.GradientTape() as g:
          # use test data to calculate the gradients
          _x_batch = test_images_scaled_reshaped[:100]
          _y_batch = test_labels_enc[:100]
          g.watch(_x_batch)
          _y_pred = self.model(_x_batch)  # forward-propagation
          per_sample_losses = tf.keras.losses.categorical_crossentropy(_y_batch, _y_pred) 
          average_loss = tf.reduce_mean(per_sample_losses) # Compute the loss value
          gradients = g.gradient(average_loss, self.model.weights) # Compute the gradient

        for t in gradients:
          tf.summary.histogram(str(self.count), data=t)
          self.count+=1
          if self.console:
                print('Tensor: {}'.format(t.name))
                print('{}\n'.format(K.get_value(t)[:10]))

# Set up logging
!rm -rf ./logs/ # clear old logs
from datetime import datetime
import os
root_logdir = "logs"
run_id = datetime.now().strftime("%Y%m%d-%H%M%S")
logdir = os.path.join(root_logdir, run_id)


# register callbacks; these will be used for TensorBoard later
callbacks = [
    tf.keras.callbacks.TensorBoard( log_dir=logdir, histogram_freq=1, 
                                   write_images=True, write_grads = True ),
    GradientCallback()
]
Then I use the callbacks during fit:
network.fit(train_pipe, epochs=epochs, batch_size=batch_size, validation_data=val_pipe, callbacks=callbacks)
Now, when I check TensorBoard, I can see the gradients in the filter on the left, but nothing shows up in the Histograms tab:
[Screenshot: TensorBoard Histograms tab with the gradients filter visible but no histograms displayed]
What am I missing here? Am I logging the gradients correctly?

Best answer

It looks like the problem is that you write the histograms outside the context of the tf summary writer. Note also that in TF2, `tf.summary.histogram` needs an explicit `step` argument. I have changed your code accordingly, but I haven't tried it.

class GradientCallback(tf.keras.callbacks.Callback):
    console = False
    count = 0
    run_count = 0

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        self.run_count += 1
        run_dir = logdir+"/gradients/run-" + str(self.run_count)
        with tf.summary.create_file_writer(run_dir).as_default():
          with tf.GradientTape() as g:
            # use test data to calculate the gradients
            _x_batch = test_images_scaled_reshaped[:100]
            _y_batch = test_labels_enc[:100]
            g.watch(_x_batch)
            _y_pred = self.model(_x_batch)  # forward-propagation
            per_sample_losses = tf.keras.losses.categorical_crossentropy(_y_batch, _y_pred) 
            average_loss = tf.reduce_mean(per_sample_losses) # Compute the loss value
            gradients = g.gradient(average_loss, self.model.weights) # Compute the gradient

          # still inside the writer context, so the histograms are recorded
          for nr, grad in enumerate(gradients):
            tf.summary.histogram(str(nr), data=grad, step=epoch)
            if self.console:
                  print('Gradient {}'.format(nr))
                  print('{}\n'.format(grad.numpy()[:10]))
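As a minimal, self-contained sketch of the same pattern (the model, data, and `demo_logdir` below are illustrative, not from the question; assumes TensorFlow 2.x), the two key points are that `tf.summary.histogram` must be called inside the writer's `as_default()` scope and must be given a `step`:

```python
import tempfile
import tensorflow as tf

# hypothetical log directory for this sketch
demo_logdir = tempfile.mkdtemp(prefix="tb_demo_")

# toy data: 32 samples, 4 features, 3 one-hot classes
x = tf.random.normal((32, 4))
y = tf.one_hot(tf.random.uniform((32,), maxval=3, dtype=tf.int32), depth=3)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# compute gradients of the mean loss w.r.t. the trainable weights
with tf.GradientTape() as tape:
    y_pred = model(x)
    loss = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(y, y_pred))
grads = tape.gradient(loss, model.trainable_weights)

# histograms are written inside the writer context, with an explicit step
writer = tf.summary.create_file_writer(demo_logdir)
with writer.as_default():
    for i, g in enumerate(grads):
        tf.summary.histogram("grad_{}".format(i), data=g, step=0)
writer.flush()
```

After running this, the event files under `demo_logdir` can be inspected with `tensorboard --logdir <dir>`, and each gradient tensor appears in the Histograms tab.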

A similar question on this topic, tensorflow2.0 - TensorFlow v2 gradients not shown on TensorBoard histograms, can be found on Stack Overflow: https://stackoverflow.com/questions/63514062/
