To avoid shipping the optimizer and gradient nodes to the production environment, I am trying to create two versions of the graph: one with the training nodes and one without them. The idea was to use tf.train.Saver to transfer the variables from the training graph to the inference graph.
So, I tried the following:
trainingGraph = tf.Graph()
with trainingGraph.as_default():
    trainOp, lossOp = self.CreateTrainingGraph()
    trainInitOp = tf.initialize_variables(tf.all_variables(), "init_variables")
    self.saverOp = tf.train.Saver()

inferenceGraph = tf.Graph()
with inferenceGraph.as_default():
    self.CreateInferenceGraph()
    tf.train.Saver(saver_def=self.saverOp.as_saver_def())
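For context, here is a minimal, self-contained sketch of the two-graph pattern I am aiming for, written against the TF 1.x-style API (via tf.compat.v1). All names here (build_inference, the variable "w", the checkpoint path) are hypothetical stand-ins, not my real code; the point is that each graph builds its own Saver, and the restore in the inference graph matches variables by name:

```python
# Sketch, assuming the TF 1.x graph-mode API (tf.compat.v1).
# Variables are moved between graphs via a checkpoint: save from a session
# on the training graph, restore into a session on the inference graph.
# Name matching is what links them, so both graphs must create the shared
# variables under identical names.
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.gettempdir(), "two_graph_demo.ckpt")

def build_inference():
    # Shared part of the model; same variable names in both graphs.
    return tf.get_variable("w", initializer=tf.constant([1.0, 2.0]))

# --- training graph: inference part + optimizer/loss on top ---
train_graph = tf.Graph()
with train_graph.as_default():
    w = build_inference()
    loss = tf.reduce_sum(w * w)                      # training-only nodes
    train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
    train_saver = tf.train.Saver()                   # Saver built in THIS graph
    with tf.Session(graph=train_graph) as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train_op)                           # one step: w becomes 0.8 * w
        train_saver.save(sess, ckpt_path)

# --- inference graph: same variables, no optimizer nodes ---
infer_graph = tf.Graph()
with infer_graph.as_default():
    w_inf = build_inference()                        # same name "w"
    infer_saver = tf.train.Saver()                   # a NEW Saver for this graph
    with tf.Session(graph=infer_graph) as sess:
        infer_saver.restore(sess, ckpt_path)         # restore replaces init
        restored = sess.run(w_inf)
```

The key point of the sketch is that a Saver only adds save/restore ops to the graph that is the default at construction time, which is why each graph needs its own.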
Here, CreateTrainingGraph() calls CreateInferenceGraph() and adds the optimizer and loss on top of it.
For some reason, the tf.train.Saver constructor does not add the save/restore_all node to the inference graph (or I just don't understand what saver_def does). I also tried an empty constructor and
sess.run([model.saverOp._restore_op_name],
         {model.saverOp._filename_tensor_name: "Params/data.pb"})
which failed with the error
<built-in function delete_Status> returned a result with an error set
What is the right way to achieve this?