How to convert CloudML alpha model to SavedModel?

In the alpha version of the CloudML Prediction Service, the format for exporting a model was:

inputs = {"x": x, "y_bytes": y}
g.add_to_collection("inputs", json.dumps(inputs))
outputs = {"a": a, "b_bytes": b}
g.add_to_collection("outputs", json.dumps(outputs))

I would like to convert this to a SavedModel without retraining my model. How can I do this?

1 answer

You can convert this to a SavedModel by importing the old model, building signatures for it, and re-exporting it. This code has not been verified, but something like the following should work:

import json
import tensorflow as tf
from tensorflow.contrib.session_bundle import session_bundle

# Import the "old" model.
session, _ = session_bundle.load_session_bundle_from_path(export_dir)

# Each collection holds a single JSON string mapping an alias to a
# tensor name, so deserialize it and look the tensors up in the graph.
old_inputs = json.loads(session.graph.get_collection('inputs')[0])
inputs = {name: tf.saved_model.utils.build_tensor_info(
              session.graph.get_tensor_by_name(tensor_name))
          for name, tensor_name in old_inputs.items()}

old_outputs = json.loads(session.graph.get_collection('outputs')[0])
outputs = {name: tf.saved_model.utils.build_tensor_info(
               session.graph.get_tensor_by_name(tensor_name))
           for name, tensor_name in old_outputs.items()}

signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs=inputs,
    outputs=outputs,
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
)

# Save out the converted model.
b = tf.saved_model.builder.SavedModelBuilder(new_export_dir)
b.add_meta_graph_and_variables(
    session,
    [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            signature})
b.save()
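For reference, the strings stored in the 'inputs' and 'outputs' collections are plain JSON objects mapping each alias to a tensor name (json.dumps cannot serialize Tensor objects directly, so what is stored is the name, not the tensor), which is why the conversion has to resolve each name via the graph. A minimal sketch of that round trip, with made-up tensor names:

```python
import json

# What the alpha export wrote into the collection: alias -> tensor name.
inputs = {"x": "Placeholder:0", "y_bytes": "DecodeBase64:0"}
serialized = json.dumps(inputs)

# What the conversion script reads back before resolving each name
# to a Tensor with graph.get_tensor_by_name().
restored = json.loads(serialized)
assert restored == inputs
```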
