I'm working with Google's machine learning platform, Cloud ML.
The big picture: I'm trying to figure out the cleanest way to deliver my Docker environment, run it on Google Compute Engine instances, and access the Cloud ML API and my Cloud Storage buckets.
Running locally, I have my service account configured:
```
C:\Program Files (x86)\Google\Cloud SDK>gcloud config list
Your active configuration is: [service]
[compute]
region = us-central1
zone = us-central1-a
[core]
account = 773889352370-compute@developer.gserviceaccount.com
disable_usage_reporting = False
project = api-project-773889352370
```
I spin up a Compute Engine instance using the Google container image family:
```
gcloud compute instances create gci \
  --image-family gci-stable \
  --image-project google-containers \
  --scopes 773889352370-compute@developer.gserviceaccount.com="https://www.googleapis.com/auth/cloud-platform"
```
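One sanity check worth doing is confirming that the scopes actually landed on the instance. Something like the following should show the attached service account and its scopes (the `--format` projection is just one way to pull them out):

```shell
# Inspect the service accounts and scopes attached to the instance.
gcloud compute instances describe gci \
    --format="yaml(serviceAccounts)"
```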
EDIT: You must explicitly set the scopes for the instance to be able to talk to the cloud APIs.
Then I can ssh into that instance (for debugging):
```
gcloud compute ssh benweinstein2010@gci
```
On the compute instance, I can pull the Cloud ML docker image from GCR and run it:
```
docker pull gcr.io/cloud-datalab/datalab:local
docker run -it --rm -p "127.0.0.1:8080:8080" \
  --entrypoint=/bin/bash \
  gcr.io/cloud-datalab/datalab:local
```
I can confirm that I have access to my desired bucket from inside the container, with no credential issues:
```
root@cd6cc28a1c8a:/# gsutil ls gs://api-project-773889352370-ml
gs://api-project-773889352370-ml/Ben/
gs://api-project-773889352370-ml/Cameras/
gs://api-project-773889352370-ml/MeerkatReader/
gs://api-project-773889352370-ml/Prediction/
gs://api-project-773889352370-ml/TrainingData/
gs://api-project-773889352370-ml/cloudmldist/
```
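My understanding is that this works because the container can still reach the instance's metadata server, which hands out access tokens for the attached service account. That can be checked from inside the container:

```shell
# Ask the GCE metadata server for an access token for the
# default service account (works from inside the container too).
curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"
```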
But when I try to mount the bucket it fails:
```
root@139e775fcf6b:~
```
Do I have to activate my service account from inside the docker container? I've seen similar (unresolved) issues elsewhere:
```
gcloud auth activate-service-account
```
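If that is the right approach, the full invocation would presumably need the service account's key file, which would first have to exist inside the container (`key.json` here is a placeholder for a downloaded key file):

```shell
# Hypothetical invocation: key.json stands in for a service-account
# key file downloaded from the Cloud Console and copied into the container.
gcloud auth activate-service-account \
    773889352370-compute@developer.gserviceaccount.com \
    --key-file=key.json
```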
I could pass a credentials .json file into the docker container, but I'm not sure where (or whether) gcloud compute ssh transfers such files to my instance.
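A minimal sketch of that hand-off, assuming a key file named `key.json` on my local machine (the file name and paths are placeholders, not something gcloud does automatically):

```shell
# Copy a (hypothetical) key file from the local machine to the instance.
gcloud compute copy-files key.json benweinstein2010@gci:~/key.json

# On the instance: mount the key into the container and point
# GOOGLE_APPLICATION_CREDENTIALS at it so client tools can find it.
docker run -it --rm \
  -v ~/key.json:/tmp/key.json \
  -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/key.json \
  --entrypoint=/bin/bash \
  gcr.io/cloud-datalab/datalab:local
```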
I do have broader access to the cloud platform; for example, I can send a prediction request to the Cloud ML API:
```
gcloud beta ml predict --model ${MODEL_NAME} --json-instances images/request.json > images/${outfile}
```
which succeeds, so some credentials are clearly getting through. I suppose I could copy the credentials to the Compute Engine instance, and then from the instance into the docker container, but it doesn't feel like I'm using these tools as intended. I thought gcloud would handle this once I authenticated locally.
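My working theory for why things "just work" on the VM without any explicit activation: when no key file is configured, Google tools and client libraries fall back to the instance's metadata server for credentials, so nothing needs to be copied at all. A sketch of the lookup order as I understand it:

```shell
# Rough Application Default Credentials lookup order (my understanding):
# 1. A key file pointed at by $GOOGLE_APPLICATION_CREDENTIALS, if set.
# 2. The well-known file written by `gcloud auth application-default login`.
# 3. On GCE, the metadata server -- no key file needed at all, e.g.:
curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email"
```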