Getting the gradient of a Theano expression to visualize filters in Keras

For a ConvNet, it can be interesting to find a normalized input that maximizes the activity of a single convolutional filter, as a way to visualize that filter. I would like to do this in the Keras deep learning package. It can be done with a black-box optimization algorithm, using the code from the Keras FAQ:

```python
# with a Sequential model
get_3rd_layer_output = theano.function([model.layers[0].input],
                                       model.layers[3].get_output(train=False))
layer_output = get_3rd_layer_output(X)
```

However, if I had the gradient, the optimization task would be greatly simplified. How can I extract the gradient from a Theano expression and feed it into a Python optimization library such as SciPy?

1 answer

You can compute the gradient as described here and pass it to SciPy. You could also do the optimization in Theano itself; see this question.

However, perhaps the most straightforward approach is to create a get_gradients() function that uses theano.grad() to return the gradients of the filter activation with respect to the input, and then call scipy.optimize.minimize with jac=get_gradients. According to the documentation:

jac : bool or callable, optional
Jacobian (gradient) of objective function. [...] jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.
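Putting the two pieces together, the pattern looks like the sketch below. To keep it self-contained, a simple quadratic with its hand-written gradient stands in for the compiled Theano functions; in the real setting, fun and get_gradients would both be theano.function's compiled from the same symbolic expression (with the activation negated, since minimize minimizes).

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the (negated) filter activation: a quadratic
# whose minimum is at x = [1, 1].
def fun(x):
    return np.sum((x - 1.0) ** 2)

# Stand-in for a get_gradients() built with theano.grad().
def get_gradients(x):
    return 2.0 * (x - 1.0)

x0 = np.zeros(2)
res = minimize(fun, x0, jac=get_gradients, method='L-BFGS-B')
print(res.x)  # converges close to [1., 1.]
```

Because jac is a callable here, SciPy never falls back to finite-difference gradient estimates, which is where the speedup over black-box optimization comes from.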


Source: https://habr.com/ru/post/1239098/

