How to convert jpeg image to json file in google machine learning

I am working on Google Cloud ML and I want to get a prediction for a JPEG image. For this, I would like to use:

gcloud beta ml predict --instances=INSTANCES --model=MODEL [--version=VERSION]

( https://cloud.google.com/ml/reference/commandline/predict )

INSTANCES is the path to a JSON file with the image data. How can I create such a JSON file from my JPEG image?

Many thanks!

3 answers

First, your graph needs to accept JPEG data. Note that the CloudML prediction service always sends batches of images, while decode_jpeg only processes a single scalar string at a time, so you have to use tf.map_fn to decode and resize each image in the batch. Depending on the details of your model, something like:

import tensorflow as tf

# Number of channels in the input image
CHANNELS = 3

# Dimensions of resized images (input to the neural net)
HEIGHT = 200
WIDTH = 200

# A placeholder for a batch of images
images_placeholder = tf.placeholder(dtype=tf.string, shape=(None,))

# The CloudML Prediction API always "feeds" the Tensorflow graph with
# dynamic batch sizes e.g. (?,).  decode_jpeg only processes scalar
# strings because it cannot guarantee a batch of images would have
# the same output size.  We use tf.map_fn to give decode_jpeg a scalar
# string from dynamic batches.
def decode_and_resize(image_str_tensor):
  """Decodes jpeg string, resizes it and returns a uint8 tensor."""

  image = tf.image.decode_jpeg(image_str_tensor, channels=CHANNELS)

  # Note resize expects a batch_size, but tf.map_fn suppresses that index,
  # thus we have to expand then squeeze.  Resize returns float32 in the
  # range [0, uint8_max]
  image = tf.expand_dims(image, 0)
  image = tf.image.resize_bilinear(
      image, [HEIGHT, WIDTH], align_corners=False)
  image = tf.squeeze(image, squeeze_dims=[0])
  image = tf.cast(image, dtype=tf.uint8)
  return image

decoded_images = tf.map_fn(
    decode_and_resize, images_placeholder, back_prop=False, dtype=tf.uint8)

# convert_image_dtype also scales [0, uint8_max] -> [0, 1).
images = tf.image.convert_image_dtype(decoded_images, dtype=tf.float32)

# Then shift images to [-1, 1) (useful for some models such as Inception)
images = tf.sub(images, 0.5)
images = tf.mul(images, 2.0)

# ...

Also, note that the name of the input (the outer key) must end in _bytes. When a name ends in _bytes, CloudML assumes the value is base64-encoded and decodes it before feeding it to the graph:

import json

inputs = {"image_bytes": images_placeholder.name}
tf.add_to_collection("inputs", json.dumps(inputs))

The data you send to gcloud will then look like:

{"image_bytes": {"b64": "dGVzdAo="}}

(Note that the key image_bytes maps to an object of the form {"b64": "dGVzdAo="}, where the value is the base64-encoded image data.)
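As a concrete illustration of how one such instance is built (a minimal sketch using only the Python standard library; the short byte string stands in for real JPEG data):

```python
import base64
import json

# Raw JPEG bytes would normally come from a file; a short stand-in here.
image_data = b"test\n"

# Wrap the base64-encoded bytes in the {"b64": ...} object CloudML expects.
instance = {"image_bytes": {"b64": base64.b64encode(image_data).decode("ascii")}}

print(json.dumps(instance))
# {"image_bytes": {"b64": "dGVzdAo="}}
```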

If your model takes only a single image input, you can generate the instances file with a convenient shell one-liner:

# -w 0 (GNU base64) disables line wrapping; wrapped output would break the JSON
echo "{\"image_bytes\": {\"b64\": \"`base64 -w 0 image.jpg`\"}}" > instances

Then run the prediction:

gcloud beta ml predict --instances=instances --model=my_model

Note that gcloud wraps your instances in the outer "instances" list for you. If you call the CloudML HTTP API directly, the request body looks like:

{"instances" : [{"image_bytes": {"b64": "dGVzdAo="}}]}
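Assembling that request body for one or more images can be sketched in Python (standard library only; the helper name and the dummy payload are illustrative, not part of any official API):

```python
import base64
import json

def make_request_body(image_bytes_list):
    """Builds the JSON body for a direct HTTP call to the prediction API."""
    instances = [
        {"image_bytes": {"b64": base64.b64encode(data).decode("ascii")}}
        for data in image_bytes_list
    ]
    return json.dumps({"instances": instances})

# Example with a dummy payload instead of real JPEG data:
body = make_request_body([b"test\n"])
print(body)
# {"instances": [{"image_bytes": {"b64": "dGVzdAo="}}]}
```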


Google provides sample code that handles this; see the script images_to_json.py, which creates the JSON file from images.


The following Python code creates a JSON file with the base64-encoded image, in the format that "gcloud ml-engine predict" expects:

import json
import base64
with open('path_to_img.jpg', 'rb') as f:
    img_bytes = base64.b64encode(f.read())
json_data = {'image_bytes': {'b64': img_bytes.decode('ascii')}}
with open('path_to_json_file.json', 'w+') as f:
    json.dump(json_data, f)

I spent a lot of time getting all of this working for TensorFlow Keras models on Google Cloud ML. Once it finally worked, I put together sample code in the hope that it helps others facing the same problems when deploying TF models to Google Cloud ML. It can be found here: https://github.com/mhwilder/tf-keras-gcloud-deployment .


Source: https://habr.com/ru/post/1662164/

