Global Weight Decay in Keras

Is there a way to set a global weight decay in Keras?

I know about the per-layer regularizers ( https://keras.io/regularizers/ ), but I could not find any information on how to set a global weight decay.

+9
3 answers

According to the github repo ( https://github.com/fchollet/keras/issues/2717 ) there is no way to set a global weight decay. I am answering this here so that others who have the same problem do not need to keep searching.

To get global weight decay in Keras, regularizers have to be added to every layer of the model. In my models, these layers are the batch normalization layers (beta / gamma regularizers) and the dense / convolution layers (W_regularizer / b_regularizer).

Per-layer regularization is described here: ( https://keras.io/regularizers/ ).
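
For illustration, here is a minimal sketch (not from the original answer) of what per-layer regularization looks like when the layers are defined. The architecture and the decay value are made up, and kernel_regularizer / bias_regularizer are the Keras 2 names of the older W_regularizer / b_regularizer arguments:

from keras import layers, models, regularizers

decay = 1e-4  # assumed weight decay coefficient

model = models.Sequential([
    layers.Conv2D(32, 3, activation='relu', input_shape=(32, 32, 3),
                  kernel_regularizer=regularizers.l2(decay),
                  bias_regularizer=regularizers.l2(decay)),
    layers.BatchNormalization(beta_regularizer=regularizers.l2(decay),
                              gamma_regularizer=regularizers.l2(decay)),
    layers.Flatten(),
    layers.Dense(10, activation='softmax',
                 kernel_regularizer=regularizers.l2(decay),
                 bias_regularizer=regularizers.l2(decay)),
])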

+9

It is not possible to apply a "global" weight decay directly to an entire Keras model.

However, as I describe here, you can apply weight decay to a model by iterating over its layers and manually adding the regularization terms to the appropriate layers. Here is the relevant code snippet:

import keras

model = keras.applications.ResNet50(include_top=True, weights='imagenet')
alpha = 0.00002  # weight decay coefficient

for layer in model.layers:
    if isinstance(layer, keras.layers.Conv2D) or isinstance(layer, keras.layers.Dense):
        # L2 penalty on the kernel of every convolutional / dense layer
        layer.add_loss(keras.regularizers.l2(alpha)(layer.kernel))
        if hasattr(layer, 'bias_regularizer') and layer.use_bias:
            layer.add_loss(keras.regularizers.l2(alpha)(layer.bias))
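
A short follow-up note (not part of the original answer): the terms added with add_loss end up in model.losses and are folded into the training objective when the model is compiled and fit as usual, for example:

model.compile(optimizer='sgd', loss='categorical_crossentropy')
model.fit(x_train, y_train)  # x_train / y_train are placeholders for your data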
+5

Here is the complete code for applying weight decay to a Keras model (inspired by the post above):

import tensorflow as tf

# a utility function to add weight decay after the model is defined.
def add_weight_decay(model, weight_decay):
    if (weight_decay is None) or (weight_decay == 0.0):
        return

    # recursion inside the model
    def add_decay_loss(m, factor):
        if isinstance(m, tf.keras.Model):
            for layer in m.layers:
                add_decay_loss(layer, factor)
        else:
            for param in m.trainable_weights:
                with tf.keras.backend.name_scope('weight_regularizer'):
                    # bind the current weight via a default argument so each
                    # loss closure keeps its own parameter
                    regularizer = lambda param=param: tf.keras.regularizers.l2(factor)(param)
                    m.add_loss(regularizer)

    # weight decay and L2 regularization differ by a factor of 2
    add_decay_loss(model, weight_decay / 2.0)
    return
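
A hypothetical usage sketch (not from the original answer); the model and hyperparameters are only examples:

model = tf.keras.applications.MobileNetV2(weights=None, classes=10)
add_weight_decay(model, weight_decay=1e-4)  # attach decay terms before compiling
model.compile(optimizer='sgd', loss='sparse_categorical_crossentropy')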
0

Source: https://habr.com/ru/post/1261636/