boto S3: copy() on a Key object loses the Content-Type metadata

Here is an example of S3 key-copy code. There are many reasons to do this; one is updating a key's metadata. Although this seems to be a widespread solution, there is a big problem: when I run the example below, I lose my Content-Type, which falls back to the default "application/octet-stream" (not very useful if you are trying to serve web images).

    from boto.s3.connection import S3Connection
    from boto.s3.key import Key

    # Get the bucket
    conn = S3Connection(self._aws_key, self._aws_secret)
    bucket = conn.get_bucket(self._aws_bucket)

    # Create the key
    k = Key(bucket)
    k.key = key

    # Copy the old key onto itself, updating its metadata
    k.metadata.update({meta_key: meta_value})
    k2 = k.copy(k.bucket.name, k.name, k.metadata, preserve_acl=True)
    k = k2

Any ideas? Thanks.

2 answers

The following GitHub Gist worked for me:

    import boto

    s3 = boto.connect_s3()
    bucket = s3.lookup('mybucket')
    key = bucket.lookup('mykey')

    # Copy the key onto itself, preserving the ACL but changing the content-type
    key.copy(key.bucket, key.name, preserve_acl=True,
             metadata={'Content-Type': 'text/plain'})

    key = bucket.lookup('mykey')
    print key.content_type

It took a while to get this working, though!


Take a look at this post.

You need to do

 key = bucket.get_key(key.name) 

then

    metadata['Content-Type'] = key.content_type

will work. Otherwise, key.content_type will return application/octet-stream.
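Putting the two answers together: a metadata-only copy replaces all of the object's headers, so the original Content-Type has to be carried over explicitly in the metadata dict you pass to copy(). The merge step itself is plain dict handling and can be sketched as below (merge_metadata is a hypothetical helper name, not part of boto):

```python
def merge_metadata(original, updates, content_type):
    """Merge metadata updates while explicitly carrying over Content-Type.

    A copy that replaces metadata drops any header not re-sent, so the
    original Content-Type must be put back into the dict by hand.
    """
    merged = dict(original)                 # don't mutate the caller's dict
    merged.update(updates)                  # apply the new/changed entries
    merged['Content-Type'] = content_type   # re-send the original type
    return merged

# Example: keep an image's type while adding a custom metadata entry
meta = merge_metadata({'x-amz-meta-owner': 'me'},
                      {'x-amz-meta-rev': '2'},
                      'image/png')
```

The resulting dict would then be passed as the metadata= argument to key.copy(..., preserve_acl=True), as in the first answer.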


Source: https://habr.com/ru/post/907616/

