The ReLU function, as defined in keras/activations.py, is:
def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)
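As a quick sanity check (my own experiment, not from the docs), calling the backend function directly seems to confirm that max_value just clips the activation from above:

from keras import backend as K
import numpy as np

# my own check of what max_value does: values below 0 go to 0,
# values above max_value are clipped to max_value
x = K.variable(np.array([[-1.0, 0.5, 100.0, 500.0]]))
y = K.relu(x, alpha=0., max_value=250.)
print(K.eval(y))   # expect [[  0.    0.5  100.  250.]]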
So it has a max_value argument that can be used to clip the output. Now, how can this be used/called in code? I tried the following: (a)
model.add(Dense(512, input_dim=1))
model.add(Activation('relu', max_value=250))

which raises:

assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
AssertionError: Keyword argument not understood: max_value
(b)
Rel = Activation('relu',max_value=250)
gives the same error.
(c)
from keras.layers import activations
uu = activations.relu(??, max_value=250)
The problem here is that relu expects the input tensor as its first argument. Error: 'relu() takes at least 1 argument (1 given)'
So how do I make a layer?
model.add(activations.relu(max_value=250))
fails with the same problem: 'relu() takes at least 1 argument (1 given)'
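The only remaining idea I have, which I have not actually verified, is to wrap the backend call in my own one-argument function and hope that Activation accepts an arbitrary callable instead of a name string (clipped_relu below is my own wrapper, not part of Keras):

from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import backend as K

# untested sketch: the layer supplies the input tensor x,
# so max_value is fixed inside the wrapper
def clipped_relu(x):
    return K.relu(x, max_value=250)

model = Sequential()
model.add(Dense(512, input_dim=1))
model.add(Activation(clipped_relu))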
If this function cannot be used as a layer, then there seems to be no way to specify the clip value for ReLU at all. That would mean the comment here https://github.com/fchollet/keras/issues/2119 closing the proposed change is wrong ... Any suggestions? Thanks!