How can I list all the TensorFlow variables / constants / placeholders a node depends on?
Example 1 (adding constants):
import tensorflow as tf
a = tf.constant(1, name='a')
b = tf.constant(3, name='b')
c = tf.constant(9, name='c')
d = tf.add(a, b, name='d')
e = tf.add(d, c, name='e')
sess = tf.Session()
print(sess.run([d, e]))
I would like to have a function list_dependencies() like:
list_dependencies(d) returns ['a', 'b']
list_dependencies(e) returns ['a', 'b', 'c']
Example 2 (matrix multiplication between the input and the weight matrix, followed by adding a bias vector):
import tensorflow as tf

tf.set_random_seed(1)
input_size = 5
output_size = 3
input = tf.placeholder(tf.float32, shape=[1, input_size], name='input')
W = tf.get_variable(
    "W",
    shape=[input_size, output_size],
    initializer=tf.contrib.layers.xavier_initializer())
b = tf.get_variable(
    "b",
    shape=[output_size],
    initializer=tf.constant_initializer(2))
output = tf.matmul(input, W, name="output")
output_bias = tf.nn.xw_plus_b(input, W, b, name="output_bias")
sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run([output, output_bias], feed_dict={input: [[2]*input_size]}))
I would like to have a function list_dependencies() like:
list_dependencies(output) returns ['W', 'input']
list_dependencies(output_bias) returns ['W', 'b', 'input']
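For reference, here is a minimal sketch of what I imagine such a function could do, assuming the TF1 graph API (each tensor has an .op, and each op exposes .inputs, .type and .name); the set of "leaf" op types is my own assumption and may need extending (e.g. resource variables use VarHandleOp instead of VariableV2):

def list_dependencies(tensor):
    # Op types treated as dependencies / graph leaves (assumption).
    leaf_types = {'Const', 'Placeholder', 'Variable', 'VariableV2'}
    visited = set()
    deps = []

    def visit(op):
        # Avoid revisiting ops in graphs with shared inputs.
        if op.name in visited:
            return
        visited.add(op.name)
        if op.type in leaf_types:
            deps.append(op.name)
        # Walk backwards through the inputs of this op.
        for input_tensor in op.inputs:
            visit(input_tensor.op)

    visit(tensor.op)
    return deps

# Expected usage (order of the returned names may differ):
# list_dependencies(e)           -> ['a', 'b', 'c']
# list_dependencies(output_bias) -> ['input', 'W', 'b']

Is there a built-in or more robust way to do this, or is manually traversing op.inputs like this the intended approach?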