How does Keras (or any other ML framework) calculate the gradient of a Lambda layer's function for backpropagation?

Keras lets you add a Lambda layer that computes an arbitrary user-defined function. What I am not getting is how Keras knows how to calculate the gradient of this custom function for backpropagation.
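For concreteness, a minimal sketch of such a layer (the squaring function is just an arbitrary example):

from keras.layers import Lambda
from keras import backend as K

layer = Lambda(lambda x: K.square(x) + 1.0)  # user-defined function; no gradient supplied anywhere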

1 answer

This is one of the perks of using Theano/TensorFlow and the libraries built on top of them: they provide automatic differentiation of mathematical functions and operations.

Keras gets the gradients by calling its backend:

# keras/theano_backend.py
def gradients(loss, variables):
    return T.grad(loss, variables)

# keras/tensorflow_backend.py
def gradients(loss, variables):
    '''Returns the gradients of `variables` (list of tensor variables)
    with regard to `loss`.
    '''
    return tf.gradients(loss, variables, colocate_gradients_with_ops=True)
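As a standalone illustration of what tf.gradients gives you, here is a sketch in the graph-mode TF 1.x style matching the snippet above; the expression is arbitrary:

import tensorflow as tf

x = tf.constant(3.0)
y = tf.square(x) + tf.sin(x)     # arbitrary composite expression
dy_dx = tf.gradients(y, [x])[0]  # symbolic gradient graph: 2*x + cos(x)

with tf.Session() as sess:
    print(sess.run(dy_dx))       # 2*3 + cos(3), roughly 5.01

No derivative was written by hand: the framework walks the expression graph and composes the known gradients of square, sin and add via the chain rule.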

These backend functions are in turn called by the optimizers (keras/optimizers.py) via grads = self.get_gradients(loss, params), where params are the trainable weights of the model's layers. A Lambda layer contributes no weights of its own, so nothing in params belongs to it. Its function still appears in the computation graph, though, so the gradients of the loss with respect to the other layers' weights are propagated through it automatically.
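You can see this directly; a sketch using the standalone keras API, with arbitrary layer shapes:

from keras.models import Sequential
from keras.layers import Dense, Lambda
from keras import backend as K

model = Sequential([
    Dense(8, input_shape=(4,)),
    Lambda(lambda t: K.square(t)),  # no trainable weights of its own
    Dense(1),
])
print(model.layers[1].trainable_weights)  # [] -- never shows up in `params`

# The first Dense layer's kernel still gets a gradient *through* the Lambda op:
loss = K.mean(K.square(model.output))
grads = K.gradients(loss, model.layers[0].trainable_weights)
print(grads)  # valid gradient tensors, built by autodiff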

The only time you would need to write gradient computations yourself is when you define a new basic mathematical operation/op. Likewise, when you write a custom loss function, autograd almost always takes care of the gradient computation. Optionally, you can sometimes speed up training by implementing the analytical gradient of a composite function yourself: softmax, for instance, can be expressed in terms of exp, sum and div, and autograd can differentiate that composition, but its analytical/symbolic gradient is usually implemented directly in Theano/TensorFlow.
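For example, here is softmax written only in terms of primitives autograd understands; a sketch using the keras backend API:

from keras import backend as K

def my_softmax(x):
    # numerically stabilized softmax built from exp, sum and div
    e = K.exp(x - K.max(x, axis=-1, keepdims=True))
    return e / K.sum(e, axis=-1, keepdims=True)

# usable directly in a Lambda layer: Lambda(my_softmax)

Autograd differentiates this composition on its own; the built-in softmax just happens to ship with a hand-written symbolic gradient as an optimization.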

For implementing new ops with their own gradients, see:
http://deeplearning.net/software/theano/extending/extending_theano.html
https://www.tensorflow.org/versions/r0.12/how_tos/adding_an_op/index.html
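Both links show how to register an op together with its gradient. As an aside, newer TensorFlow versions (1.7+) also offer a lighter-weight decorator, tf.custom_gradient, for overriding the gradient of a Python-level function; a small sketch:

import tensorflow as tf

@tf.custom_gradient
def hard_clip(x):
    y = tf.clip_by_value(x, 0.0, 1.0)
    def grad(dy):
        # straight-through estimator: pass the incoming gradient unchanged
        return dy
    return y, grad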


Source: https://habr.com/ru/post/1664914/

