How to get loss gradients with respect to activations in TensorFlow

In the CIFAR-10 example, the gradients of the loss with respect to the parameters can be computed as follows:

grads_and_vars = opt.compute_gradients(loss)  # `opt` is a tf.train.Optimizer
for grad, var in grads_and_vars:
    # ... inspect or transform each (gradient, variable) pair ...
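
For context, here is a self-contained sketch of that setup (the tiny linear model is a stand-in for the CIFAR-10 graph, and it assumes the TensorFlow 1.x graph-mode API):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 10])
w = tf.Variable(tf.zeros([10, 1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w)))  # stand-in for the real loss

opt = tf.train.GradientDescentOptimizer(learning_rate=0.1)
grads_and_vars = opt.compute_gradients(loss)  # [(d(loss)/d(var), var), ...]
for grad, var in grads_and_vars:
    print(var.name, grad)  # one gradient tensor per trainable variable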

Is there a way to get the loss gradients with respect to the activations (rather than the parameters) and visualize them in TensorBoard?

1 answer

You can use the tf.gradients() function to compute the gradient of any scalar tensor with respect to any other tensor (provided that gradients are defined for all of the ops between the two tensors):

activations = ...
loss = f(..., activations)  # `loss` is some function of `activations`.

grad_wrt_activations, = tf.gradients(loss, [activations])
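
For instance, here is a minimal runnable sketch of this pattern (the two-layer network and squared-error loss below are made up for illustration, assuming the TF 1.x graph-mode API):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 4])
y = tf.placeholder(tf.float32, [None, 2])
w1 = tf.Variable(tf.random_normal([4, 8]))
hidden = tf.nn.relu(tf.matmul(x, w1))    # the activations of interest
w2 = tf.Variable(tf.random_normal([8, 2]))
loss = tf.reduce_mean(tf.square(tf.matmul(hidden, w2) - y))

# Gradient of the scalar loss with respect to the activation tensor.
grad_wrt_hidden, = tf.gradients(loss, [hidden])
print(grad_wrt_hidden.get_shape())       # same shape as `hidden`: (?, 8)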

Visualizing this in TensorBoard is tricky in general, since grad_wrt_activations is (usually) a tensor with the same shape as activations. Adding a tf.histogram_summary() op is probably the easiest way to visualize it:

# Adds a histogram of `grad_wrt_activations` to the graph, which will be logged
# with the other summaries, and shown in TensorBoard.
tf.histogram_summary("Activation gradient", grad_wrt_activations)
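
To actually get that histogram into TensorBoard, the summary op has to be evaluated in a session and its output written to a log directory. Here is a minimal end-to-end sketch, assuming the same pre-1.0 summary API as above (in TensorFlow 1.0+ these ops became tf.summary.histogram, tf.summary.merge_all, and tf.summary.FileWriter); the toy graph and the /tmp/grad_logs path are just for illustration:

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 3])
w = tf.Variable(tf.ones([3, 1]))
act = tf.nn.relu(tf.matmul(x, w))                  # some activation tensor
loss = tf.reduce_mean(act)
grad_wrt_act, = tf.gradients(loss, [act])
tf.histogram_summary("Activation gradient", grad_wrt_act)

merged = tf.merge_all_summaries()                  # tf.summary.merge_all() in 1.0+
writer = tf.train.SummaryWriter("/tmp/grad_logs")  # tf.summary.FileWriter in 1.0+
with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    summary_str = sess.run(merged, feed_dict={x: np.random.rand(8, 3).astype(np.float32)})
    writer.add_summary(summary_str, 0)

Then run tensorboard --logdir /tmp/grad_logs and the histogram appears under the tag given above.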

Source: https://habr.com/ru/post/1624386/

