Cloud Storage: setting up service account credentials for boto python library

I am following this guide to upload a file to a bucket that I created manually: https://cloud.google.com/storage/docs/xml-api/gspythonlibrary

I seem to have problems setting up credentials for both the service account and the user account. I want to use this on a web server, so ideally it should be configured with a service account.

I created credentials using the API Manager in the console and downloaded the JSON key file. Meanwhile, my gcloud auth is configured with my OAuth login. I tried gsutil config -e and got an error:

CommandException: OAuth2 is the preferred authentication mechanism with the Cloud SDK. Run "gcloud auth login" to configure authentication, unless you want to authenticate with an HMAC access key and secret, in which case run "gsutil config -a".
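The error mentions gsutil config -a as the HMAC alternative. If I went that route, my understanding (an assumption on my part, with placeholder values) is that ~/.boto would end up with an interoperability key pair like this:

[Credentials]
gs_access_key_id = <interoperability access key>
gs_secret_access_key = <interoperability secret>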

I also tried to authenticate the service account using: gcloud auth activate-service-account --key-file <json file>

but boto still fails to get access. I also copied the client id and secret from ~/.config/gcloud/ into ~/.boto, but that didn't work either. I'm not sure how to configure authentication so that the Python server can access Cloud Storage. I am not using App Engine; the web server runs on Compute Engine.
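For the service-account route, my understanding is that gsutil config -e would normally write the key location into ~/.boto for gcs_oauth2_boto_plugin to pick up. A minimal sketch of what I would expect that section to look like (gs_service_key_file is my assumption, and the path is a placeholder):

[Credentials]
gs_service_key_file = /path/to/service-account-key.json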

Here is my source code:

import boto
import gcs_oauth2_boto_plugin
import os
import shutil
import StringIO
import tempfile
import time

CLIENT_ID = 'my client id from ~/.config/gcloud/credentials'
CLIENT_SECRET = 'my client secret from ~/.config/gcloud/credentials'
gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)

# URI scoped to the whole Google Storage service (no bucket yet).
uri = boto.storage_uri('', 'gs')
project_id = 'my-test-project-id'
# The XML API guide passes the project via the "x-goog-project-id" header.
header_values = {"x-goog-project-id": project_id}
# If the default project is defined, call get_all_buckets() without arguments.
for bucket in uri.get_all_buckets(headers=header_values):
    print bucket.name

The most recent error:

Traceback (most recent call last):
  File "upload/uploader.py", line 14, in <module>
    for bucket in uri.get_all_buckets(headers=header_values):
  File "/Users/ankitjain/dev/metax/venv/lib/python2.7/site-packages/boto/storage_uri.py", line 574, in get_all_buckets
    return conn.get_all_buckets(headers)
  File "/Users/ankitjain/dev/metax/venv/lib/python2.7/site-packages/boto/s3/connection.py", line 444, in get_all_buckets
    response.status, response.reason, body)
boto.exception.GSResponseError: GSResponseError: 403 Forbidden
<?xml version='1.0' encoding='UTF-8'?><Error><Code>InvalidSecurity</Code><Message>The provided security credentials are not valid.</Message><Details>Incorrect Authorization header</Details></Error>
1 answer

In the end I gave up on the Google-specific setup on GCE and fell back to the plain AWS boto S3 API. It turns out boto can talk to Google Storage in S3 mode:

# HMAC (interoperability) credentials over boto's S3 API, pointed at Google's endpoint.
conn = boto.connect_s3(app.config['S3_KEY'], app.config['S3_SECRET'],
                       host="c.storage.googleapis.com")
bucket = conn.get_bucket(app.config['S3_BUCKET'], validate=False)

The HMAC credentials are the interoperability keys generated in the Google console. This is described here: https://cloud.google.com/storage/docs/migrating
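With that connection in place, uploading a file (what the question was originally about) goes through the regular boto S3 key API. A minimal sketch, reusing the bucket from above with placeholder object and file names:

from boto.s3.key import Key

# Create an object in the bucket and upload a local file into it.
key = Key(bucket)
key.key = 'uploads/example.txt'
key.set_contents_from_filename('/tmp/example.txt')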

