Transferring parameters from the training graph to the inference graph

To avoid carrying the optimizer and gradient nodes over into the inference environment, I am trying to create two versions of the graph: one with the training nodes, and one without them.

The idea was to use tf.train.Saver to transfer the variables from the training version of the graph to the inference version.

So, I tried the following:

# Create training graph
trainingGraph = tf.Graph()
with trainingGraph.as_default():
  trainOp, lossOp = self.CreateTrainingGraph()
  trainInitOp = tf.initialize_variables(tf.all_variables(), "init_variables")

  # Add saver op
  self.saverOp = tf.train.Saver()

# Create inference graph
inferenceGraph = tf.Graph()
with inferenceGraph.as_default():
  self.CreateInferenceGraph()

  # Add saver op, compatible with training saver
  tf.train.Saver(saver_def=self.saverOp.as_saver_def())

Here, CreateTrainingGraph() calls CreateInferenceGraph() and adds the optimizer and loss on top of it.

For some reason, the tf.train.Saver constructor does not add a save/restore_all node to the inference graph (or I simply don't understand what saver_def is doing). I also tried an empty constructor and

sess.run([model.saverOp._restore_op_name],
         { model.saverOp._filename_tensor_name : "Params/data.pb" })

which fails with the error

<built-in function delete_Status> returned a result with an error set

What is the right way to achieve this?


The simplest approach is to construct a plain tf.train.Saver() (with no arguments) in the inference graph as well, and then load the checkpoint written from the training graph with saver.restore(sess, filename).

N.B. For this to work, (i) the variables must already exist (i.e. appear in tf.all_variables()) at the point where the Saver is constructed, and (ii) the variables must have the same names in the two graphs; otherwise the restore will fail. (For example, if self.CreateTrainingGraph() wraps the call to self.CreateInferenceGraph() in a tf.name_scope(), the variable names in the two graphs will not match.)
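The name-matching requirement can be illustrated without TensorFlow at all: a checkpoint is conceptually a map from variable name to value. The sketch below uses hypothetical variable names (W, b, train/W), not ones from the question:

```python
# Sketch: a checkpoint behaves like a map from variable name to value.

def save(variables):
    """'Write a checkpoint': record each variable's value under its name."""
    return dict(variables)

def restore(checkpoint, variables):
    """'Restore': look up each variable by name; unknown names fail."""
    restored = {}
    for name in variables:
        if name not in checkpoint:
            raise KeyError("variable %s not found in checkpoint" % name)
        restored[name] = checkpoint[name]
    return restored

# Training and inference graphs that name their variables identically:
training_vars = {"W": 1.5, "b": -0.5}
inference_vars = {"W": 0.0, "b": 0.0}

ckpt = save(training_vars)
print(restore(ckpt, inference_vars))   # {'W': 1.5, 'b': -0.5}

# If the training code had wrapped the model in a name scope, the names
# would no longer match and the restore would fail:
scoped_vars = {"train/W": 0.0, "train/b": 0.0}
try:
    restore(ckpt, scoped_vars)
except KeyError as e:
    print("restore failed:", e)
```

This is exactly why condition (ii) matters: the restore is purely name-based, so any extra scope prefix on one side breaks the lookup.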

(The reason passing a saver_def did not work is that this form is intended for use with tf.import_graph_def(): when you import a graph that already contains the save/restore ops created by a Saver, constructing a Saver from the original saver_def wraps those existing ops in a Python Saver object. It does not add any new nodes to the graph, which is why no save/restore_all op appeared in your inference graph.)
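The saver_def behaviour can be sketched the same way. All class and op names below are hypothetical mock-ups, not the real TensorFlow API; the point is only that a Saver built from a saver_def looks up ops by name instead of creating them (in real TensorFlow the missing ops surface as an error at run time rather than at construction):

```python
# Sketch: a "Saver" built from a saver_def only references existing ops
# by name; it never creates them in the current graph.

class FakeSaverDef:
    def __init__(self, restore_op_name, filename_tensor_name):
        self.restore_op_name = restore_op_name
        self.filename_tensor_name = filename_tensor_name

class FakeSaver:
    def __init__(self, graph_ops, saver_def=None):
        if saver_def is None:
            # No saver_def: create fresh save/restore ops in this graph.
            graph_ops["save/restore_all"] = "restore-op"
            graph_ops["save/Const"] = "filename-tensor"
            self.saver_def = FakeSaverDef("save/restore_all", "save/Const")
        else:
            # With a saver_def: the named ops must already be present,
            # e.g. brought in by importing the original graph definition.
            for name in (saver_def.restore_op_name,
                         saver_def.filename_tensor_name):
                if name not in graph_ops:
                    raise KeyError("op %s does not exist in this graph" % name)
            self.saver_def = saver_def

training_ops = {}
train_saver = FakeSaver(training_ops)   # adds save/restore ops to the graph

inference_ops = {}                      # fresh graph: no save ops imported
try:
    FakeSaver(inference_ops, saver_def=train_saver.saver_def)
except KeyError as e:
    print("saver_def lookup failed:", e)
```

Since the question's inference graph is built from scratch rather than imported, the ops named in the training saver_def simply never exist there, which matches the observed behaviour.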


Source: https://habr.com/ru/post/1621632/

