Keras: How to use max_value in the ReLU activation function

The relu function, as defined in keras/activations.py, is:

def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)

It has a max_value argument that can be used to clip the output. Now, how can this be used/called in code? I tried the following:

(a)

  model.add(Dense(512, input_dim=1))
  model.add(Activation('relu', max_value=250))

  assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
  AssertionError: Keyword argument not understood: max_value

(b)

  Rel = Activation('relu',max_value=250) 

Same error.

(c)

  from keras.layers import activations
  uu = activations.relu(??, max_value=250)

The problem is that it expects the input as the first argument. Error: 'relu() takes at least 1 argument (1 given)'

So how do I make a layer?

  model.add(activations.relu(max_value=250)) 

has the same problem: 'relu() takes at least 1 argument (1 given)'

If this function cannot be used as a layer, then there seems to be no way to specify the ReLU clip value. That would mean the comment closing the proposed change at https://github.com/fchollet/keras/issues/2119 is wrong... Any suggestions? Thanks!

4 answers

You can use the ReLU function of the Keras backend. First, import the backend:

 from keras import backend as K 

You can then pass your own function as the activation, using the backend function. It would look like:

 def relu_advanced(x):
     return K.relu(x, max_value=250)

Then you can use it as

 model.add(Dense(512, input_dim=1, activation=relu_advanced)) 

or

 model.add(Activation(relu_advanced)) 

Unfortunately, you have to hard-code the additional arguments this way. It is therefore better to use a factory function that returns your activation with the custom value baked in:

 def create_relu_advanced(max_value=1.):
     def relu_advanced(x):
         return K.relu(x, max_value=K.cast_to_floatx(max_value))
     return relu_advanced

Then you can pass your arguments with

 model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250))) 

or

 model.add(Activation(create_relu_advanced(max_value=250))) 
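
Putting this together, here is a minimal end-to-end sketch of the factory approach; the second Dense layer, the optimizer/loss choice, and the dummy data are illustrative assumptions, not part of the original answer:

 import numpy as np
 from keras.models import Sequential
 from keras.layers import Dense
 from keras import backend as K

 def create_relu_advanced(max_value=1.):
     # Factory: captures max_value in a closure so the returned function
     # has the single-argument signature Keras expects for activations.
     def relu_advanced(x):
         return K.relu(x, max_value=K.cast_to_floatx(max_value))
     return relu_advanced

 model = Sequential()
 model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))
 model.add(Dense(1))
 model.compile(optimizer='adam', loss='mse')

 # Dummy data just to confirm the model builds and trains.
 x = np.random.rand(32, 1)
 y = np.random.rand(32, 1)
 model.fit(x, y, epochs=1, verbose=0)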

This is what I used with a Lambda layer to implement a clipped ReLU.

Step 1: define the reluclip function:

 def reluclip(x, max_value=20):
     return K.relu(x, max_value=max_value)

Step 2: add the Lambda layer to the model:

 y = Lambda(function=reluclip)(y)
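
For context, here is a minimal functional-API sketch of where such a Lambda layer could sit in a model; the input shape and the surrounding Dense layers are assumptions for illustration:

 from keras.models import Model
 from keras.layers import Input, Dense, Lambda
 from keras import backend as K

 def reluclip(x, max_value=20):
     return K.relu(x, max_value=max_value)

 inp = Input(shape=(10,))
 y = Dense(64)(inp)                 # linear layer, no built-in activation
 y = Lambda(function=reluclip)(y)   # clipped ReLU applied via Lambda
 out = Dense(1)(y)

 model = Model(inputs=inp, outputs=out)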


It is as simple as one lambda:

 from keras.activations import relu
 clipped_relu = lambda x: relu(x, max_value=3.14)

Then use it like this:

 model.add(Conv2D(64, (3, 3)))
 model.add(Activation(clipped_relu))

When loading a model stored in HDF5, use the custom_objects dictionary:

 model = load_model(model_file, custom_objects={'<lambda>': clipped_relu}) 
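
As a sketch of the full save/load round trip, assuming a small model like the one above (the input shape, loss, and file name here are placeholders):

 from keras.models import Sequential, load_model
 from keras.layers import Conv2D, Activation
 from keras.activations import relu

 clipped_relu = lambda x: relu(x, max_value=3.14)

 model = Sequential()
 model.add(Conv2D(64, (3, 3), input_shape=(28, 28, 1)))  # input shape is illustrative
 model.add(Activation(clipped_relu))
 model.compile(optimizer='adam', loss='mse')

 model.save('clipped_relu_model.h5')  # placeholder file name

 # The lambda serializes under the name '<lambda>', so it must be
 # supplied explicitly when loading.
 model = load_model('clipped_relu_model.h5',
                    custom_objects={'<lambda>': clipped_relu})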

I checked the code below; it works:

 import keras

 def clip_relu(x):
     return keras.activations.relu(x, max_value=1.)

 predictions = Dense(num_classes, activation=clip_relu, name='output')
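
Newer Keras releases (roughly 2.2 and later) also ship a built-in ReLU layer that accepts max_value directly, which avoids the custom-activation workaround entirely; a minimal sketch, assuming such a version is installed:

 from keras.models import Sequential
 from keras.layers import Dense, ReLU

 model = Sequential()
 model.add(Dense(512, input_dim=1))
 model.add(ReLU(max_value=250))  # clipped ReLU as a built-in layer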
