You can use the tf.gradients() function to compute the gradient of any scalar tensor with respect to any other tensor (provided that gradients are defined for all ops on the path between the two tensors):
activations = ...
loss = f(..., activations)
grad_wrt_activations, = tf.gradients(loss, [activations])
Visualizing this directly in TensorBoard is tricky, since grad_wrt_activations is (usually) a tensor with the same shape as activations. Adding a tf.histogram_summary() op is probably the easiest way to visualize it:
tf.histogram_summary("Activation gradient", grad_wrt_activations)
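Putting the pieces together, here is a minimal end-to-end sketch. The model (a one-layer ReLU network with a made-up squared-error loss) is purely illustrative, and it uses the v1 compat API so it runs on modern TensorFlow installs; note that tf.histogram_summary() was renamed to tf.summary.histogram in TF 1.0.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Illustrative model: the layer and loss here are arbitrary placeholders.
x = tf.placeholder(tf.float32, shape=[None, 3])
w = tf.Variable(tf.ones([3, 1]))
activations = tf.nn.relu(tf.matmul(x, w))
loss = tf.reduce_mean(tf.square(activations - 1.0))

# Gradient of the scalar loss with respect to the activations tensor.
# tf.gradients returns a list, one gradient per tensor in the second argument.
grad_wrt_activations, = tf.gradients(loss, [activations])

# Histogram summary for TensorBoard (tf.summary.histogram in TF >= 1.0).
tf.summary.histogram("activation_gradient", grad_wrt_activations)
summary_op = tf.summary.merge_all()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    grad_val, summary = sess.run(
        [grad_wrt_activations, summary_op],
        feed_dict={x: np.random.rand(4, 3).astype(np.float32)})
    # The gradient has the same shape as activations.
    print(grad_val.shape)
```

The serialized `summary` would normally be passed to a tf.summary.FileWriter so TensorBoard can render the histogram over training steps.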