How can you reuse a variable scope in TensorFlow without creating a new scope by default?

I created a variable scope in one part of my graph, and later, in another part of the graph, I want to add ops to that existing scope. This boils down to the following distilled example:

    import tensorflow as tf

    with tf.variable_scope('myscope'):
        tf.Variable(1.0, name='var1')

    with tf.variable_scope('myscope', reuse=True):
        tf.Variable(2.0, name='var2')

    print([n.name for n in tf.get_default_graph().as_graph_def().node])

Which gives:

    ['myscope/var1/initial_value', 'myscope/var1', 'myscope/var1/Assign', 'myscope/var1/read',
     'myscope_1/var2/initial_value', 'myscope_1/var2', 'myscope_1/var2/Assign', 'myscope_1/var2/read']

My desired result:

    ['myscope/var1/initial_value', 'myscope/var1', 'myscope/var1/Assign', 'myscope/var1/read',
     'myscope/var2/initial_value', 'myscope/var2', 'myscope/var2/Assign', 'myscope/var2/read']

I saw this question, which did not seem to have an answer directly related to mine: TensorFlow, how to reuse the variable scope name

1 answer

Here is one easy way to do this: capture the scope with "as somename" in the first context manager. The property somename.original_name_scope then lets you re-enter that exact scope later and add more variables to it. The following is an illustration:

    In [6]: with tf.variable_scope('myscope') as ms1:
       ...:     tf.Variable(1.0, name='var1')
       ...:
       ...: with tf.variable_scope(ms1.original_name_scope) as ms2:
       ...:     tf.Variable(2.0, name='var2')
       ...:
       ...: print([n.name for n in tf.get_default_graph().as_graph_def().node])
       ...:
    ['myscope/var1/initial_value', 'myscope/var1', 'myscope/var1/Assign', 'myscope/var1/read',
     'myscope/var2/initial_value', 'myscope/var2', 'myscope/var2/Assign', 'myscope/var2/read']

Note
Setting reuse=True is optional here; you get the same result whether or not you pass it.
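For context, here is a minimal sketch (assuming the TensorFlow 1.x API) of why the flag makes no difference above: reuse=True only affects variables created through tf.get_variable inside a variable scope, while plain tf.Variable calls ignore it. The variable name 'shared' below is just an illustrative choice.

    import tensorflow as tf  # assumes TensorFlow 1.x

    with tf.variable_scope('myscope') as ms1:
        v1 = tf.get_variable('shared', shape=[], initializer=tf.zeros_initializer())

    # reuse=True is what lets get_variable return the existing variable
    # instead of raising a "variable already exists" error.
    with tf.variable_scope(ms1, reuse=True):
        v2 = tf.get_variable('shared', shape=[])

    print(v1 is v2)  # True: both names refer to the same variable object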


Another way (thanks to the OP!) is to simply append a / to the end of the variable scope name when reusing it, as in the following example:

    In [13]: with tf.variable_scope('myscope'):
        ...:     tf.Variable(1.0, name='var1')
        ...:
        ...: # reuse the variable scope by appending `/` to the target scope name
        ...: with tf.variable_scope('myscope/', reuse=True):
        ...:     tf.Variable(2.0, name='var2')
        ...:
        ...: print([n.name for n in tf.get_default_graph().as_graph_def().node])
        ...:
    ['myscope/var1/initial_value', 'myscope/var1', 'myscope/var1/Assign', 'myscope/var1/read',
     'myscope/var2/initial_value', 'myscope/var2', 'myscope/var2/Assign', 'myscope/var2/read']

Note
Again, setting reuse=True is optional; you get the same result whether or not you pass it.
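As an aside (not part of the original answer), the trailing-/ trick mirrors how TensorFlow 1.x name scopes behave: a name ending with / is entered verbatim instead of being uniquified to something like myscope_1. A minimal sketch with tf.name_scope:

    import tensorflow as tf  # assumes TensorFlow 1.x

    with tf.name_scope('myscope'):
        a = tf.constant(1.0, name='a')

    # A name ending with '/' re-enters the existing scope exactly,
    # rather than creating 'myscope_1'.
    with tf.name_scope('myscope/'):
        b = tf.constant(2.0, name='b')

    print(a.op.name, b.op.name)  # myscope/a myscope/b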


Source: https://habr.com/ru/post/1275894/

