How do I deploy a locally prepared TensorFlow graph file to Google Cloud Platform?

I followed the TensorFlow for Poets tutorial and replaced flower_photos with a few classes of my own. Now I have my labels.txt and graph.pb files saved on my local machine.

Is there any way to host this pre-trained model on Google Cloud Platform? I read the documentation, and all I can find are instructions on how to create, train, and deploy models with Cloud ML Engine. But I don't want to spend money training my model on Google's servers; I just need them to host my model so I can call it for predictions.

Does anyone else face the same issue?

2 answers

Deploying a locally trained model is a supported use case; the instructions are pretty much the same no matter where you trained it:

To deploy a model version, you need:

A TensorFlow SavedModel stored in Google Cloud Storage. You can get one by:

  • Following the Cloud ML Engine training instructions to train in the cloud.

  • Training elsewhere and exporting a SavedModel.

The TensorFlow for Poets tutorial does not produce a SavedModel, only a "frozen" graph. You can convert it with a small script like the following (assuming the frozen graph is graph.pb):

import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

input_graph = 'graph.pb'
saved_model_dir = 'my_model'

with tf.Graph().as_default() as graph:
  # Read in the exported graph.
  with tf.gfile.FastGFile(input_graph, 'rb') as f:
      graph_def = tf.GraphDef()
      graph_def.ParseFromString(f.read())
      tf.import_graph_def(graph_def, name='')

  # CloudML Engine and early versions of TensorFlow Serving do
  # not currently support graphs without variables. Add a
  # prosthetic variable.
  dummy_var = tf.Variable(0)

  # Define the SavedModel signature (inputs and outputs).
  in_image = graph.get_tensor_by_name('DecodeJpeg/contents:0')
  inputs = {'image_bytes': tf.saved_model.utils.build_tensor_info(in_image)}

  out_classes = graph.get_tensor_by_name('final_result:0')
  outputs = {'prediction': tf.saved_model.utils.build_tensor_info(out_classes)}

  signature = tf.saved_model.signature_def_utils.build_signature_def(
      inputs=inputs,
      outputs=outputs,
      method_name='tensorflow/serving/predict'
  )

  # Save out the SavedModel. A session is needed so the builder can
  # snapshot the (dummy) variable values.
  with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    b = saved_model_builder.SavedModelBuilder(saved_model_dir)
    b.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict_images': signature})
    b.save()

(Adapted from this codelab and a related SO answer.)

If you want the model to return label strings instead of class indices, map the output through the labels file before building the outputs dict:

  # Load the label file, stripping off carriage returns.
  label_lines = [line.rstrip() for line
                 in tf.gfile.GFile("retrained_labels.txt")]
  out_classes = graph.get_tensor_by_name('final_result:0')
  out_labels = tf.gather(label_lines, out_classes)
  outputs = {'prediction': tf.saved_model.utils.build_tensor_info(out_labels)}
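Once the model is deployed, online prediction requests have to match the `image_bytes` input alias from the signature above. A minimal sketch of building the request body (the function name is my own; the `{"b64": ...}` wrapping is how the prediction service expects base64-encoded binary inputs whose alias ends in `_bytes`):

```python
import base64
import json

def make_predict_request(image_path):
    # Build the JSON body for an online prediction call. The key
    # 'image_bytes' matches the input alias in the SavedModel
    # signature; the '_bytes' suffix tells the service the value is
    # binary data, base64-encoded inside a {"b64": ...} object.
    with open(image_path, 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('utf-8')
    return json.dumps({'instances': [{'image_bytes': {'b64': encoded}}]})
```

You can send this body with the REST API or adapt it for `gcloud ml-engine predict`.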

To add to the answer above: you don't have to use Cloud ML Engine at all if you only need predictions. You can keep the pb and txt files, load them in your own TensorFlow process, and expose it over HTTP. Then host that service wherever you like and pay only for the hosting.

For example, wrap the TensorFlow inference code in a small Python web application and run it behind gunicorn or mod_wsgi.

Hope that helps.
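A minimal self-hosting sketch using only the Python standard library (no gunicorn needed for a quick test). Here `classify` is a hypothetical stand-in for the real TensorFlow inference code, and the `image_b64` request field is my own convention, not anything the tutorial defines:

```python
import base64
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def classify(image_bytes):
    # Placeholder for the real inference call. In a real server you
    # would load graph.pb once at startup and run something like:
    #   sess.run('final_result:0',
    #            {'DecodeJpeg/contents:0': image_bytes})
    return {'prediction': 'placeholder'}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"image_b64": "<base64 jpeg>"}.
        length = int(self.headers.get('Content-Length', 0))
        request = json.loads(self.rfile.read(length))
        image_bytes = base64.b64decode(request['image_b64'])
        payload = json.dumps(classify(image_bytes)).encode('utf-8')
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(payload)

if __name__ == '__main__':
    HTTPServer(('', 8080), PredictHandler).serve_forever()
```

For production you would swap the built-in server for gunicorn or mod_wsgi as suggested above.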


Source: https://habr.com/ru/post/1679673/

