Is there a main list of TensorFlow operations that are differentiable (i.e. that support automatic differentiation)?
Two other ways to express this:
- A list of ops that ops.NoGradient has not been applied to.
- A list of ops that will not trigger a LookupError.
For example, I would have guessed that all control flow ops (e.g. tf.where) are not differentiable. How would I find this out other than by manually running each op through tf.gradients and seeing whether it throws a LookupError?
"Commonsense" is not a valid answer.
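One way to avoid building a graph and catching exceptions is to query TensorFlow's gradient registry directly. This is only a sketch: _gradient_registry is a private, version-dependent implementation detail rather than a stable public API, and the op type names ("Select", "Size") are the raw op names underlying the Python wrappers.

```python
import tensorflow as tf
from tensorflow.python.framework import ops


def gradient_status(op_type):
    """Classify an op type (e.g. "Select") by its gradient registration.

    Note: ops._gradient_registry is a private attribute and may change
    between TensorFlow releases.
    """
    try:
        grad_fn = ops._gradient_registry.lookup(op_type)
    except LookupError:
        # Nothing registered: tf.gradients raises LookupError for such ops.
        return "no gradient registered"
    # ops.NotDifferentiable (formerly ops.NoGradient) registers None,
    # so a successful lookup can still mean "explicitly non-differentiable".
    return "marked non-differentiable" if grad_fn is None else "differentiable"


# "Select" is the op behind tf.where; "Size" is marked non-differentiable.
print(gradient_status("Select"))
print(gradient_status("Size"))
```

Iterating this check over every registered op type would give the full list the question asks for, without triggering any exceptions in a real graph.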
Thanks.
EDIT:
tf.where is differentiable, so my intuition was wrong. Perhaps the right question is: which operators in TensorFlow are not differentiable?