Load a numpy array in a google-cloud-ml job

In the model I want to run, I have some variables that need to be initialized with specific values.

I currently store these variables in numpy arrays, but I don’t know how to adapt my code so that it works on the google-cloud-ml job.

I am currently initializing my variable as follows:

my_variable = variables.model_variable('my_variable', shape=None, dtype=tf.float32, initializer=np.load('datasets/real/my_variable.npy'))

Can someone help me?

2 answers

First, you'll need to copy/store the data on GCS (using, for example, gsutil) and make sure your training job can read from that bucket. The simplest approach is to copy the array into the same bucket as your training data, since you will most likely already have configured read access to it. If the bucket is in the same project as your training job and you completed the project setup (in particular, gcloud beta ml init-project), you should be set. If the data lives in a different bucket, you will need to grant the training service account read access to it.
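If you would rather stage the file from Python instead of gsutil, the same file_io module used below can also copy a local file to GCS. This is only a minimal sketch: the bucket name and destination path are placeholders, and it assumes your environment already has credentials for the bucket.

from tensorflow.python.lib.io import file_io

# Copy the locally saved array to the bucket your training job reads from.
# 'gs://my-bucket/...' is a placeholder destination path.
file_io.copy('datasets/real/my_variable.npy',
             'gs://my-bucket/datasets/real/my_variable.npy',
             overwrite=True)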

Then you need a library that can load data from GCS. TensorFlow ships with a module that can do this, although you are free to use any client library able to read from GCS. Here is an example using TensorFlow's file_io module:

from StringIO import StringIO
import tensorflow as tf
import numpy as np
from tensorflow.python.lib.io import file_io

# Create a variable initialized to the value of a serialized numpy array
f = StringIO(file_io.read_file_to_string('gs://my-bucket/123.npy'))
my_variable = tf.Variable(initial_value=np.load(f), name='my_variable')

Note that we have to read the file into a string and wrap it in StringIO, because file_io.FileIO does not fully implement the seek interface that numpy.load requires.
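To tie this back to the snippet in the question, the array read from GCS can be passed as the initializer exactly like the local np.load result was. A sketch, assuming the question's variables module comes from tf.contrib.framework and that the .npy file has already been copied to the placeholder GCS path below:

from StringIO import StringIO
import numpy as np
import tensorflow as tf
from tensorflow.contrib.framework.python.ops import variables
from tensorflow.python.lib.io import file_io

# Read the serialized array from GCS, then initialize the model variable
# the same way as in the question, just with the GCS-backed array.
f = StringIO(file_io.read_file_to_string('gs://my-bucket/datasets/real/my_variable.npy'))
my_variable = variables.model_variable('my_variable', shape=None, dtype=tf.float32,
                                        initializer=np.load(f))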

Bonus: in case it is useful, you can also store a numpy array directly to GCS with the same file_io module, for example:

np.save(file_io.FileIO('gs://my-bucket/123', 'w'), np.array([[1,2,3], [4,5,6]]))

On Python 3, use from io import StringIO instead of from StringIO import StringIO (the second answer below shows a Python 3 variant that reads the file in binary mode with io.BytesIO, which is what numpy.load expects there).


This worked for me, with the code above modified slightly for Python 3:

from io import BytesIO
import numpy as np
from tensorflow.python.lib.io import file_io

To save an array to GCS:

dest = 'gs://[BUCKET-NAME]/my_array.npy'  # Destination in GCS; the object name is illustrative
np.save(file_io.FileIO(dest, 'w'), np.ones((100,)))

To load it back:

src = 'gs://[BUCKET-NAME]/my_array.npy'  # GCS path of the array saved above
f = BytesIO(file_io.read_file_to_string(src, binary_mode=True))
arr = np.load(f)
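Putting the two snippets together, a quick round trip is a handy sanity check; the bucket path is still a placeholder:

from io import BytesIO
import numpy as np
from tensorflow.python.lib.io import file_io

path = 'gs://[BUCKET-NAME]/my_array.npy'  # placeholder object path
original = np.ones((100,))
np.save(file_io.FileIO(path, 'w'), original)  # write to GCS

f = BytesIO(file_io.read_file_to_string(path, binary_mode=True))
restored = np.load(f)
assert np.array_equal(original, restored)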
