List of differentiable operations in TensorFlow

Is there a master list of TensorFlow operations that are differentiable (i.e. that automatic differentiation will flow through)?

Two other ways to express this:

  • A list of ops that do not have ops.NoGradient applied to them.
  • A list of ops that will not trigger a LookupError .

For example, I would assume that all control flow ops are not differentiable (e.g. tf.where ). How would I find this out other than by manually running each op through tf.gradients and seeing whether it throws a LookupError ?
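For concreteness, a minimal sketch of that manual check (assuming TF1-style graph mode, since tf.gradients is not usable eagerly in TF2; the helper name is made up for illustration, and exact results depend on your TensorFlow version):

    import tensorflow as tf  # under TF2, use tensorflow.compat.v1 and disable eager execution

    def appears_differentiable(op_fn, *inputs):
        """Rough check: build a tiny graph around op_fn and ask tf.gradients for
        gradients w.r.t. the inputs. Ops with no registered gradient at all raise
        LookupError; ops registered via ops.NotDifferentiable come back as None."""
        graph = tf.Graph()
        with graph.as_default():
            xs = [tf.constant(x) for x in inputs]
            y = op_fn(*xs)
            try:
                grads = tf.gradients(y, xs)
            except LookupError:
                return False
            return any(g is not None for g in grads)

    # Usage:
    print(appears_differentiable(tf.square, [1.0, 2.0]))  # expected: True
    print(appears_differentiable(tf.size, [1.0, 2.0]))    # expected: False (tf.size has no gradient; see the answer below)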

"Commonsense" is not a valid answer.

Thanks.

EDIT:

tf.where is differentiable, so my intuition was wrong. Perhaps the right question is which TensorFlow ops are not differentiable.

Thanks.

1 answer

No, there is no such list (you could be the first to create one). Also, as far as I know, the documentation of individual ops does not mention it either ( tf.size is not differentiable, but its docs do not say so).

Besides the method you proposed, you can also extract this information from the TensorFlow source code. All ops that have a gradient implemented have a gradient function decorated with @ops.RegisterGradient just before its definition. Ops that do not have a gradient are registered with ops.NotDifferentiable(...) (formerly ops.NoGradient ).
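For illustration, here is roughly what those two registration patterns look like. The op names below ( MySquare , MyHash ) are made up for the example and are not real TensorFlow ops, but the decorator and the NotDifferentiable call mirror what you will find in files like math_grad.py and array_grad.py:

    from tensorflow.python.framework import ops

    # Pattern 1: an op with a gradient. Near the op's gradient code there is a
    # function decorated with @ops.RegisterGradient("<OpName>").
    @ops.RegisterGradient("MySquare")      # "MySquare" is a hypothetical op name
    def _my_square_grad(op, grad):
        x = op.inputs[0]
        return grad * 2.0 * x              # d(x^2)/dx = 2x, chained with the upstream grad

    # Pattern 2: an op explicitly declared non-differentiable.
    ops.NotDifferentiable("MyHash")        # hypothetical op name; formerly ops.NoGradient

So one way to assemble the list you are asking for is to scan the *_grad.py files under tensorflow/python/ops for these two markers: everything registered with RegisterGradient is differentiable, and everything registered with NotDifferentiable (or not registered at all) is not.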

Not relevant, but possibly useful.


Source: https://habr.com/ru/post/1268880/

