Is this the right way to add regularization to a TensorFlow layer?

I added the built-in regularizer tf.contrib.layers.l2_regularizer like this:

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
layer1 = tf.layers.dense(tf_x, 50, tf.nn.relu, kernel_regularizer=regularizer)
layer2 = tf.layers.dense(layer1, 50, tf.nn.relu, kernel_regularizer=regularizer) 
output = tf.layers.dense(layer2, 5, tf.nn.relu)

I tried different values for scale (0.1 to 1), but it doesn't seem to make any difference. I was wondering whether I need to apply the regularizer somewhere else (e.g. in the optimizer or the training op), or whether it could just be my data.

I did not configure the regularizer anywhere else in my code, only in tf.layers as shown above.
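Update: for context on what the scale parameter actually does, here is a small NumPy sketch of the L2 penalty itself (all names and shapes here are my own illustration, not from the graph above). My understanding, which may be wrong, is that in TF 1.x the kernel_regularizer only collects these penalty terms into a collection, so they have no effect unless they are added to the loss being minimized.

```python
import numpy as np

def l2_penalty(weights, scale=0.1):
    # L2 term: scale * sum(w^2) / 2 over all weight matrices
    # (the factor 1/2 mirrors tf.nn.l2_loss; this is a sketch, not TF's code)
    return scale * sum(np.sum(w ** 2) for w in weights) / 2.0

# hypothetical kernels standing in for the dense layers' weights
w1 = np.ones((3, 50))
w2 = np.ones((50, 50))

data_loss = 1.0  # stand-in for the unregularized training loss
total_loss = data_loss + l2_penalty([w1, w2], scale=0.1)
```

The point of the sketch: the penalty is a separate scalar that must be summed into total_loss; if only data_loss is passed to the optimizer, changing scale changes nothing.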


Source: https://habr.com/ru/post/1683836/
