What your Saver creates is called a "V2 checkpoint" and was introduced in TF 0.12.
It worked out pretty well for me (although the C++ docs are terrible, so it took me a whole day to figure it out). Some people suggest converting all variables to constants or freezing the graph, but neither of those is actually needed.
Python part (save)
with tf.Session() as sess:
    tf.train.Saver(tf.trainable_variables()).save(sess, 'models/my-model')
Creating the Saver with tf.trainable_variables() can save you some headaches and storage space. But perhaps some more complex models need all of their data saved; in that case, drop this argument to Saver and just make sure you create the Saver after the graph is built. It is also very wise to give all variables/layers unique names, otherwise you may run into various problems, as sketched below.
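For illustration, here is a minimal sketch of that advice: a tiny, purely hypothetical graph (the variable names and shapes are made up) with explicitly named tensors, and the Saver created only after the graph is complete:

import tensorflow as tf

# Hypothetical toy graph; names and shapes are only for illustration.
x = tf.placeholder(tf.float32, shape=[None, 4], name='input')
w = tf.get_variable('dense/weights', shape=[4, 2])
b = tf.get_variable('dense/bias', shape=[2])
y = tf.add(tf.matmul(x, w), b, name='output')

# Create the Saver after the graph is fully built,
# restricted to trainable variables as suggested above.
saver = tf.train.Saver(tf.trainable_variables())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, 'models/my-model')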
C++ part (inference)
Please note that checkpointPath is not a path to any existing file, just their common prefix. If you mistakenly put the path to the .index file there, TF will not tell you that this is wrong, but it will die during inference due to uninitialized variables.
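As a quick sanity check (assuming the save call from the Python part above), the correct prefix can also be recovered programmatically:

import tensorflow as tf

# After the save call above, 'models/' typically contains files like
#   my-model.meta, my-model.index, my-model.data-00000-of-00001, checkpoint
# The prefix shared by these files is what the C++ code below needs.
checkpoint_prefix = tf.train.latest_checkpoint('models/')  # e.g. 'models/my-model'
print(checkpoint_prefix)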
#include <tensorflow/core/public/session.h>
#include <tensorflow/core/protobuf/meta_graph.pb.h>

using namespace std;
using namespace tensorflow;

...
// set up your input paths
const string pathToGraph = "models/my-model.meta";
const string checkpointPath = "models/my-model";
...

auto session = NewSession(SessionOptions());
if (session == nullptr) {
    throw runtime_error("Could not create Tensorflow session.");
}

Status status;

// Read in the protobuf graph we exported
MetaGraphDef graph_def;
status = ReadBinaryProto(Env::Default(), pathToGraph, &graph_def);
if (!status.ok()) {
    throw runtime_error("Error reading graph definition from " + pathToGraph + ": " + status.ToString());
}

// Add the graph to the session
status = session->Create(graph_def.graph_def());
if (!status.ok()) {
    throw runtime_error("Error creating graph: " + status.ToString());
}

// Read weights from the saved checkpoint
Tensor checkpointPathTensor(DT_STRING, TensorShape());
checkpointPathTensor.scalar<std::string>()() = checkpointPath;
status = session->Run(
        {{ graph_def.saver_def().filename_tensor_name(), checkpointPathTensor },},
        {},
        {graph_def.saver_def().restore_op_name()},
        nullptr);
if (!status.ok()) {
    throw runtime_error("Error loading checkpoint from " + checkpointPath + ": " + status.ToString());
}

// and run the inference to your liking
auto feedDict = ...
auto outputOps = ...
std::vector<tensorflow::Tensor> outputTensors;
status = session->Run(feedDict, outputOps, {}, &outputTensors);
For completeness, here is the Python equivalent:
Python inference
with tf.Session() as sess:
    saver = tf.train.import_meta_graph('models/my-model.meta')
    saver.restore(sess, tf.train.latest_checkpoint('models/'))
    outputTensors = sess.run(outputOps, feed_dict=feedDict)
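What feedDict and outputOps look like depends entirely on your graph. As a purely hypothetical example, if the graph had a placeholder named 'input' and an op named 'output' (as in the toy sketch earlier), they might be:

import numpy as np

# Hypothetical tensor names; replace them with the ones from your own graph.
feedDict = {'input:0': np.zeros((1, 4), dtype=np.float32)}
outputOps = ['output:0']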