Saving and restoring a trained LSTM in TensorFlow

I trained an LSTM classifier using BasicLSTMCell. How do I save the model and restore it for use in subsequent classifications?

+4
6 answers

This was interesting to me as well. As pointed out, the usual way to save a model in TensorFlow is to use tf.train.Saver(), but I believe that saves only the values of tf.Variables. I'm not quite sure whether the tf.Variables inside BasicLSTMCell are automatically saved along with everything else, or whether there is another step that needs to be taken, but if all else fails, the cell's weights can easily be saved to and loaded from a pickle file.
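If you do want the pickle fallback, note that you generally can't pickle the cell object itself, since it holds references into the graph. Instead you can fetch the weight values as plain numpy arrays and pickle those. A minimal sketch under that assumption (shapes and the file path are illustrative; uses the TF 1.x-style API via tf.compat.v1):

```python
import os
import pickle
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Build a tiny LSTM graph; dynamic_rnn is what actually creates the
# cell's kernel/bias variables.
x = tf.placeholder(tf.float32, [None, 5, 3])  # (batch, time, features)
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=4)
outputs, state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Fetch every variable's current value as a numpy array.
    weights = {v.name: sess.run(v) for v in tf.global_variables()}

# Pickle the name -> array dict, not the live TF objects.
path = os.path.join(tempfile.mkdtemp(), "lstm_weights.pkl")
with open(path, "wb") as f:
    pickle.dump(weights, f)
```

Restoring then means loading the dict and assigning the arrays back into freshly built variables, which is more manual than a Saver checkpoint but has no TensorFlow-specific file format.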

+2

We ran into the same problem. We were not sure whether the internal variables were saved. We figured out that you should create the Saver after creating/defining the BasicLSTMCell. Otherwise, its variables are not saved.
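A sketch of why that ordering matters (TF 1.x-style API via tf.compat.v1; shapes are illustrative): a Saver built with no arguments snapshots the variables that exist at construction time, and the cell's kernel and bias variables only appear in the graph once the RNN has been built.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

x = tf.placeholder(tf.float32, [None, 5, 3])      # (batch, time, features)
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=4)
outputs, state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

# Construct the Saver only AFTER the cell's variables exist in the graph;
# a Saver built before dynamic_rnn would not know about them.
saver = tf.train.Saver()

# The graph now contains the LSTM's kernel and bias variables,
# and the Saver above will checkpoint them.
print([v.name for v in tf.global_variables()])
```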

+3

The standard mechanism is tf.train.Saver. Its constructor adds save and restore ops to the graph for all, or a specified list of, the variables in the graph, and the saver object provides methods to run those ops.

See:

https://www.tensorflow.org/versions/r0.11/how_tos/variables/index.html

Variables are saved in binary checkpoint files that, roughly speaking, contain a map from variable names to tensor values.

When you create a Saver, you can optionally choose names for the variables in the checkpoint files. By default, it uses the value of the Variable.name property for each variable.

To find out what variables are in a checkpoint, you can use the inspect_checkpoint library, and in particular the print_tensors_in_checkpoint_file function.
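A sketch of using that helper (TF 1.x-style API via tf.compat.v1; the checkpoint here is created on the fly just so there is something to inspect):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf
from tensorflow.python.tools.inspect_checkpoint import (
    print_tensors_in_checkpoint_file,
)

tf.disable_eager_execution()

v1 = tf.get_variable("v1", initializer=[1.0, 2.0])
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    ckpt = saver.save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))

# Print every tensor stored in the checkpoint (name, shape, value).
print_tensors_in_checkpoint_file(ckpt, tensor_name="", all_tensors=True)
```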

Create a Saver with tf.train.Saver() to manage all the variables in the model. For example, the following snippet saves variables to a checkpoint file:

# Create some variables.
v1 = tf.Variable(..., name="v1")
v2 = tf.Variable(..., name="v2")
...
# Add an op to initialize the variables.
init_op = tf.initialize_all_variables()

# Add ops to save and restore all the variables.
saver = tf.train.Saver()

# Later, launch the model, initialize the variables, do some work, save the
# variables to disk.
with tf.Session() as sess:
  sess.run(init_op)
  # Do some work with the model.
  ...
  # Save the variables to disk.
  save_path = saver.save(sess, "/tmp/model.ckpt")
  print("Model saved in file: %s" % save_path)

The same Saver object is used to restore variables. Note that when you restore variables from a file, you do not have to initialize them beforehand.

# Create some variables.
v1 = tf.Variable(..., name="v1")
v2 = tf.Variable(..., name="v2")
...
# Add ops to save and restore all the variables.
saver = tf.train.Saver()

# Later, launch the model, use the saver to restore variables from disk, and
# do some work with the model.
with tf.Session() as sess:
  # Restore variables from disk.
  saver.restore(sess, "/tmp/model.ckpt")
  print("Model restored.")
  # Do some work with the model
  ...
+2

This works for an LSTM just like for any other model (its weights are ordinary tf.Variables). Note that when you call save, the Saver writes not only the checkpoint with the tf.Variables values but also a metagraph (.meta) file describing the graph structure itself, so you can later re-import the graph with tf.train.import_meta_graph instead of rebuilding it in Python.

" ?" / " ?" / " LSTM, - python var?" / .. :

for i in tf.global_variables():
    print(i)

to list the variables, and

for i in my_graph.get_operations():
    print (i)

to list the ops. Then, to get a handle on a tensor when you no longer have the Python var, use

my_graph.get_tensor_by_name('name_of_op:N')

where 'name_of_op' is the name of the op that produces the tensor, and N is the index (usually 0) of the op's output that you want.

Once you have handles to the right tensors, you can feed and fetch them with session.run as usual.
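A small sketch of that lookup (the op names here are illustrative): get_tensor_by_name is called on a graph instance, and the ':0' suffix selects the op's first output.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant(3.0, name="a")
total = tf.add(a, a, name="my_sum")   # creates an op named 'my_sum'

graph = tf.get_default_graph()
# 'my_sum:0' means: output 0 of the op named 'my_sum'.
t = graph.get_tensor_by_name("my_sum:0")

with tf.Session() as sess:
    print(sess.run(t))   # 6.0
```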

+1

Use tf.train.Saver's save method to write the model's variables to a checkpoint (*.ckpt) file. Call save once you are done training (i.e., once you are happy with the model's weights):

# Create some variables.
v1 = tf.Variable(..., name="v1")
v2 = tf.Variable(..., name="v2")
...
# Add an op to initialize the variables.
init_op = tf.initialize_all_variables()

# Add ops to save and restore all the variables.
saver = tf.train.Saver()

# Later, launch the model, initialize the variables, do some work, save the
# variables to disk.
with tf.Session() as sess:
  sess.run(init_op)
  # Do some work with the model.
  ...
  # Save the variables to disk.
  save_path = saver.save(sess, "/tmp/model.ckpt")
  print("Model saved in file: %s" % save_path)

During classification/inference, you create another tf.train.Saver instance and call restore, passing it the current session and the checkpoint file to restore from. You can call restore just before using your model for classification via session.run:

# Create some variables.
v1 = tf.Variable(..., name="v1")
v2 = tf.Variable(..., name="v2")
...
# Add ops to save and restore all the variables.
saver = tf.train.Saver()

# Later, launch the model, use the saver to restore variables from disk, and
# do some work with the model.
with tf.Session() as sess:
  # Restore variables from disk.
  saver.restore(sess, "/tmp/model.ckpt")
  print("Model restored.")
  # Do some work with the model
  ...
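Putting both halves together, a minimal round trip (TF 1.x-style API via tf.compat.v1; the variable and paths are illustrative) shows that the restored session sees the saved value without ever running the init op:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

w = tf.get_variable("w", initializer=41.0)
saver = tf.train.Saver()
ckpt_prefix = os.path.join(tempfile.mkdtemp(), "model.ckpt")

# "Train", then save.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(w.assign(42.0))               # stand-in for a training update
    ckpt = saver.save(sess, ckpt_prefix)

# Restore into a fresh session; no initializer is needed before restore.
with tf.Session() as sess:
    saver.restore(sess, ckpt)
    print(sess.run(w))   # 42.0
```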

Link: https://www.tensorflow.org/versions/r0.11/how_tos/variables/index.html#saving-and-restoring

-1

Source: https://habr.com/ru/post/1659947/

