How to get summary information about a TensorFlow RNN

I implemented a simple RNN using TensorFlow, shown below:

cell = tf.contrib.rnn.BasicRNNCell(state_size)
cell = tf.contrib.rnn.DropoutWrapper(cell, output_keep_prob=keep_prob)

rnn_outputs, final_state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)  # x: inputs of shape [batch_size, n_steps, n_inputs]

It works great. But I would also like to log the weight variables to the summary writer. Is there any way to do this?

By the way, should we use tf.nn.rnn_cell.BasicRNNCell or tf.contrib.rnn.BasicRNNCell? Or are they identical?

+4
2 answers

But I would also like to log the weight variables to the summary writer. Is there any way to do this?

You can get the variable with tf.get_variable(). tf.summary.histogram accepts a tensor instance, so it is easier to use Graph.get_tensor_by_name():

n_steps = 2
n_inputs = 3
n_neurons = 5

X = tf.placeholder(dtype=tf.float32, shape=[None, n_steps, n_inputs])
basic_cell = tf.nn.rnn_cell.BasicRNNCell(num_units=n_neurons)
outputs, states = tf.nn.dynamic_rnn(basic_cell, X, dtype=tf.float32)

with tf.variable_scope('rnn', reuse=True):
  print(tf.get_variable('basic_rnn_cell/kernel'))

kernel = tf.get_default_graph().get_tensor_by_name('rnn/basic_rnn_cell/kernel:0')
tf.summary.histogram('kernel', kernel)
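For reference, here is a minimal numpy sketch (not TensorFlow code) of what that `rnn/basic_rnn_cell/kernel` tensor holds. BasicRNNCell fuses the input weights and the recurrent weights into one kernel of shape `[n_inputs + n_neurons, n_neurons]`, and the cell update is `tanh([x, h] @ kernel + bias)`; the histogram above therefore summarizes both weight matrices at once. The variable names here are illustrative.

```python
import numpy as np

n_inputs, n_neurons = 3, 5

# BasicRNNCell stores one fused weight matrix: the input weights and the
# recurrent weights are concatenated row-wise, so the logged kernel has
# shape (n_inputs + n_neurons, n_neurons).
kernel = np.random.randn(n_inputs + n_neurons, n_neurons)
bias = np.zeros(n_neurons)

def rnn_step(x, h):
    # The BasicRNNCell update: h_next = tanh([x, h] @ kernel + bias)
    return np.tanh(np.concatenate([x, h]) @ kernel + bias)

x = np.random.randn(n_inputs)   # one input vector
h = np.zeros(n_neurons)         # initial hidden state
h = rnn_step(x, h)
print(h.shape)  # (5,)
```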

By the way, should we use tf.nn.rnn_cell.BasicRNNCell or tf.contrib.rnn.BasicRNNCell? Or are they identical?

They point to the same class. Prefer tf.nn.rnn_cell, because code in tf.contrib is experimental and can change between 1.x releases.

+4

Here is another way to do it, which does not require knowing the variable names: get the (gradient, variable) pairs from compute_gradients.

The names of the "internal" weight variables (kernel and bias) created by dynamic_rnn are generated automatically and are easy to get wrong. Instead, you can let the optimizer collect them for you. For example:

optzr = tf.train.AdamOptimizer(learning_rate)
grads_and_vars = optzr.compute_gradients(loss) 

"grads_and_vars" is a list of (gradient, variable) pairs. By iterating over "grads_and_vars", you can inspect each gradient/variable and add it to the summary. For example:

for grad, var in grads_and_vars:
    print(var, var.name)
    tf.summary.histogram(var.name, var)

Ref: https://www.tensorflow.org/api_docs/python/tf/train/AdamOptimizer#compute_gradients
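To make the structure of grads_and_vars concrete, here is a numpy stand-in that needs no TensorFlow: a hand-built list of (gradient, variable) pairs iterated the same way, with np.histogram playing the role of tf.summary.histogram. The shapes and names are illustrative only, not what compute_gradients actually returns in a given graph.

```python
import numpy as np

# numpy stand-in for optzr.compute_gradients(loss): a list of
# (gradient, variable) pairs, one pair per trainable variable.
# Shapes mimic a BasicRNNCell with 3 inputs and 5 units.
grads_and_vars = [
    (np.random.randn(8, 5), np.random.randn(8, 5)),  # kernel gradient, kernel
    (np.random.randn(5), np.zeros(5)),               # bias gradient, bias
]

histograms = {}
for i, (grad, var) in enumerate(grads_and_vars):
    # tf.summary.histogram bins the values of a tensor; np.histogram
    # produces the equivalent counts for these stand-in arrays.
    counts, edges = np.histogram(var, bins=10)
    histograms[f"var_{i}"] = counts
```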

+2

Source: https://habr.com/ru/post/1693988/

