Deploying a model to ML Engine, exported with tf.train.Saver()

I want to deploy my model to the new version of Google ML Engine. Previously, with Google Cloud ML, I exported my trained model by creating a tf.train.Saver() and saving it with saver.save(session, output).
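
In simplified form, the export looks something like this (a minimal sketch; the graph is just a placeholder and "output/model.ckpt" stands in for my real output path):

import tensorflow as tf

# Placeholder graph standing in for the real model.
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
w = tf.Variable(tf.zeros([1, 1]), name="w")
y = tf.matmul(x, w, name="y")

saver = tf.train.Saver()
with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    # ... training would go here ...
    saver.save(session, "output/model.ckpt")  # writes checkpoint files only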

So far, I have not been able to figure out whether a model exported this way can still be deployed to ML Engine, or whether I have to follow the training procedure described here, create a new trainer package, and train my model with ML Engine.

Can I use tf.train.Saver() to produce a model that I can deploy to ML Engine?

1 answer

tf.train.Saver() only creates a checkpoint.

Cloud ML Engine expects a SavedModel, created with these APIs: https://www.tensorflow.org/versions/master/api_docs/python/tf/saved_model?hl=bn

A SavedModel is a checkpoint plus a serialized protobuf containing one or more graph definitions, a set of signatures declaring the inputs and outputs of the graph/model, and any additional asset files, if applicable, so that all of it can be used at serving time.
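
For illustration, here is a minimal sketch of exporting a SavedModel with the 1.x-style APIs linked above; the tiny graph and the export directory "export_dir" are placeholders for your own model:

import tensorflow as tf

# Placeholder graph standing in for your real model.
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
w = tf.Variable(tf.zeros([1, 1]), name="w")
y = tf.matmul(x, w, name="y")

with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    # ... restore a checkpoint or train your variables here ...

    builder = tf.saved_model.builder.SavedModelBuilder("export_dir")
    # Declare the serving signature: which tensors are inputs and outputs.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"x": x}, outputs={"y": y})
    builder.add_meta_graph_and_variables(
        session,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                signature})
    builder.save()  # writes saved_model.pb plus a variables/ directory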

For more complete examples of building and exporting a SavedModel, see, for instance, this question: SavedModel google cloud ml

