TensorFlow reuse variable using tf.layers.conv2d

I am trying to make two conv layers that share the same weights, however it appears the API does not work.

    import tensorflow as tf

    x = tf.random_normal(shape=[10, 32, 32, 3])
    with tf.variable_scope('foo') as scope:
        conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
        print(conv1.name)
        conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
        print(conv2.name)

It prints

    foo/foo/Relu:0
    foo/foo_1/Relu:0

Switching from tf.contrib.layers.conv2d to tf.layers.conv2d does not solve the problem; tf.layers.conv2d behaves the same way:

    import tensorflow as tf

    x = tf.random_normal(shape=[10, 32, 32, 3])
    conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
    print(conv1.name)
    conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
    print(conv2.name)

gives

    conv/BiasAdd:0
    conv_2/BiasAdd:0
1 answer

In the code you wrote, variables do get reused between the two convolution layers. Try this:

    import tensorflow as tf

    x = tf.random_normal(shape=[10, 32, 32, 3])
    conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
    conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
    print([v.name for v in tf.global_variables()])
    # prints
    # [u'conv/kernel:0', u'conv/bias:0']

Note that only one kernel and one bias tensor were created. Even though the two layers share the weights, they do not share the actual computation, which is why you see two different operation names.
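
To see this concretely, here is a quick check (my addition, not part of the original answer; it assumes TF 1.x and the conv1/conv2 ops defined above). Because the two ops read the same input tensor and the same shared kernel and bias, evaluating them in a single run must give identical values:

    import numpy as np

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # one session.run evaluates the random input once, so both ops
        # see the same input and the same (shared) kernel and bias
        out1, out2 = sess.run([conv1, conv2])
        print(np.array_equal(out1, out2))  # expected: True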

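A side note beyond the answer above: on TF 1.4 or newer, the same sharing can be expressed with tf.AUTO_REUSE, which creates the variables on the first call and reuses them on every later call, so you do not need to switch reuse from None to True by hand. A minimal sketch (the scope and function names here are illustrative, not from the original post):

    import tensorflow as tf

    x = tf.random_normal(shape=[10, 32, 32, 3])

    def shared_conv(inputs):
        # AUTO_REUSE: create the 'shared/conv/*' variables on the first call,
        # reuse them on all subsequent calls
        with tf.variable_scope('shared', reuse=tf.AUTO_REUSE):
            return tf.layers.conv2d(inputs, 3, [2, 2], padding='SAME', name='conv')

    conv1 = shared_conv(x)
    conv2 = shared_conv(x)
    print([v.name for v in tf.global_variables()])
    # expected: [u'shared/conv/kernel:0', u'shared/conv/bias:0']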
