This is one of the benefits of using Theano / TensorFlow and the libraries built on top of them: they give you automatic gradient computation for mathematical functions and operations.
Keras obtains the gradients by calling a backend function. Theano backend:
    def gradients(loss, variables):
        return T.grad(loss, variables)
TensorFlow backend:

    def gradients(loss, variables):
        '''Returns the gradients of `variables` (list of tensor variables)
        with regard to `loss`.
        '''
        return tf.gradients(loss, variables, colocate_gradients_with_ops=True)
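What these one-liners return is the symbolic gradient of the loss with respect to each variable. For intuition, here is a quick numerical sanity check in pure NumPy (a hypothetical sum-of-squares loss, not Keras code) showing what such a gradient should equal:

```python
import numpy as np

def loss(w):
    # hypothetical loss: sum of squares, whose analytic gradient is 2*w
    return np.sum(w ** 2)

def numerical_grad(f, w, eps=1e-6):
    # central finite differences: the values a symbolic gradient should match
    g = np.zeros_like(w)
    for i in range(w.size):
        step = np.zeros_like(w)
        step.flat[i] = eps
        g.flat[i] = (f(w + step) - f(w - step)) / (2 * eps)
    return g

w = np.array([1.0, -2.0, 3.0])
g = numerical_grad(loss, w)   # close to 2*w = [2, -4, 6]
```

Symbolic differentiation gives you these values exactly (up to float precision) without the 2*n extra loss evaluations finite differences need.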
In keras/optimizers.py each optimizer then calls grads = self.get_gradients(loss, params), where params are the model's trainable weights. The optimizer uses these gradients to build the update rules that adjust params at each training step.
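The optimizer's use of those gradients can be sketched as plain gradient descent (a minimal sketch with a hypothetical quadratic loss whose gradient is known analytically, standing in for the symbolic grads returned by get_gradients):

```python
import numpy as np

def grad(p):
    # gradient of the hypothetical loss (p - 3)^2, playing the role of
    # grads = self.get_gradients(loss, params)
    return 2 * (p - 3.0)

p, lr = np.array([0.0]), 0.1
for _ in range(100):
    p = p - lr * grad(p)   # the basic SGD update an optimizer builds from grads
# p has converged toward the minimum at 3.0
```

Real Keras optimizers (SGD with momentum, Adam, RMSprop, ...) build more elaborate update rules from the same grads list, but the structure is the same: gradients in, parameter updates out.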
Under the hood, every operation (op) in the computation graph registers its own gradient. When you compose ops (softmax, for example, is built from exp, sum and div), automatic differentiation applies the chain rule through the graph, so Theano/TensorFlow can compute the gradient of the whole expression from the per-op gradients.
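The per-op chain rule can be sketched with a tiny reverse-mode autodiff in pure Python (a hypothetical Var class, not the actual Theano/TensorFlow machinery), building softmax from exp, add and div alone:

```python
import math

class Var:
    """A graph node: value, accumulated gradient, parents, and a backward rule."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents
        self._backward = lambda: None

def v_exp(x):
    out = Var(math.exp(x.value), (x,))
    def backward():
        x.grad += out.value * out.grad        # d(exp x)/dx = exp x
    out._backward = backward
    return out

def v_add(x, y):
    out = Var(x.value + y.value, (x, y))
    def backward():
        x.grad += out.grad                    # d(x+y)/dx = 1
        y.grad += out.grad                    # d(x+y)/dy = 1
    out._backward = backward
    return out

def v_div(x, y):
    out = Var(x.value / y.value, (x, y))
    def backward():
        x.grad += out.grad / y.value                        # d(x/y)/dx = 1/y
        y.grad += -out.grad * x.value / (y.value ** 2)      # d(x/y)/dy = -x/y^2
    out._backward = backward
    return out

def backprop(out):
    # topological sort, then run each op's backward rule in reverse order
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for p in node.parents:
                visit(p)
            order.append(node)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):
        node._backward()

# softmax probability of class 0 over two logits, composed from exp/add/div
x0, x1 = Var(1.0), Var(2.0)
e0, e1 = v_exp(x0), v_exp(x1)
s = v_add(e0, e1)
p0 = v_div(e0, s)
backprop(p0)
# chain rule recovers the analytic softmax gradient: d p0/d x0 = p0 * (1 - p0)
```

Nothing here knows about softmax as a whole; the correct gradient falls out of composing the three per-op backward rules, which is exactly what the frameworks do at scale.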
If you want to define your own ops with their own gradients, see:
http://deeplearning.net/software/theano/extending/extending_theano.html
https://www.tensorflow.org/versions/r0.12/how_tos/adding_an_op/index.html