Keras - using activation function with parameter

How can leaky ReLUs be used in the latest version of Keras? The relu() function takes an optional alpha parameter, which controls the negative slope, but I cannot figure out how to pass this parameter when building the layer.

Here is the line where I tried to do this:

model.add(Activation(relu(alpha=0.1)))

but then I get an error:

TypeError: relu() missing 1 required positional argument: 'x'

How can I use a leaky ReLU, or any other activation function that takes a parameter?

3 answers

relu is a function, not a class, and it takes the input to the activation as its parameter x. The Activation layer accepts a function as an argument, so you can initialize it with a lambda function over the input x, for example:

model.add(Activation(lambda x: relu(x, alpha=0.1)))
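
As a rough self-contained sketch of this approach (the layer sizes, import paths, and compile settings here are illustrative assumptions, not part of the original answer):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.activations import relu

model = Sequential()
model.add(Dense(64, input_dim=20))                    # linear layer, no activation yet
model.add(Activation(lambda x: relu(x, alpha=0.1)))   # leaky ReLU with negative slope 0.1
model.compile(optimizer='sgd', loss='mse')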

Leaky ReLU is provided as an advanced activation layer (see the keras doc and this github question), not as a parameter you can pass to relu, so you add it as its own layer:

from keras.layers import Dense
from keras.layers.advanced_activations import LeakyReLU

model.add(Dense(512, input_dim=512, activation='linear')) # add any layer, with the default identity/linear activation (no squashing)
model.add(LeakyReLU(alpha=.001))   # add an advanced activation
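
For newer tf.keras versions the same pattern would look roughly like this (import paths assumed for tf.keras 2.x; layer sizes kept from the snippet above):

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

model = Sequential()
model.add(Dense(512, input_dim=512))   # plain linear layer
model.add(LeakyReLU(alpha=0.001))      # leaky slope added as its own layer
model.compile(optimizer='sgd', loss='mse')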



Another approach is to wrap the activation function in a small class that binds its extra parameters ahead of time:

class activation_wrapper(object):
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        def _func(x):
            return self.func(x, *args, **kwargs)
        return _func

Wrap relu (or any other parameterized activation) with it:

wrapped_relu = activation_wrapper(relu)

and then use the wrapped function either in an Activation layer or directly as a layer's activation argument:

model.add(Activation(wrapped_relu(alpha=0.1)))

model.add(Dense(64, activation=wrapped_relu(alpha=0.1)))
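
A quick sanity check of the wrapper outside of a model might look like this (a sketch assuming the Keras 2 backend API and the activation_wrapper class defined above; the test values are arbitrary):

import numpy as np
from keras import backend as K
from keras.activations import relu

wrapped_relu = activation_wrapper(relu)   # wrapper class from above
leaky = wrapped_relu(alpha=0.1)           # returns a one-argument function
x = K.constant(np.array([-1.0, 2.0]))
print(K.eval(leaky(x)))                   # roughly [-0.1, 2.0]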

As @Thomas Jungblut's answer shows, the dedicated LeakyReLU layer also covers this case; the wrapper above is just a more general way to bind parameters for any activation function.


Source: https://habr.com/ru/post/1678420/

