How can I get the Amazon S3 bucket size using Python (Boto lib)?

I would like to get the size (in bytes and number of keys) of an Amazon S3 bucket.
I am looking for an efficient way to get the size of a bucket.

One possible way (which is NOT efficient): I can get a list of keys from the bucket and sum the size of each key. This is inefficient when there are thousands of keys, because every key has to be listed just to read its size.

Is there an efficient solution?

UPDATE:
The following code is not what I'm looking for (because it is inefficient):

bucket = conn.get_bucket("bucket_name") total_size = 0 for key in bucket.list(): total_size += key.size 
2 answers

There seems to be no direct call for this. You can iterate over the keys and add up the sizes.

    bucket = conn.get_bucket(self.container)
    size = 0
    for key in bucket.list():
        size += key.size

This should only be used if the bucket contains a small number of keys and the calculation is not performed very often.

Check this (not Boto) for a more useful option.
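In case that link is unavailable: one common alternative that avoids listing keys at all is to read the bucket's size from CloudWatch's AWS/S3 storage metrics. The sketch below only illustrates that idea (it is not necessarily what the link describes); it assumes the daily BucketSizeBytes metric is already being published for the bucket, and the region name, bucket name, and 'StandardStorage' storage class are placeholders to adjust:

    from datetime import datetime, timedelta
    import boto.ec2.cloudwatch

    # Assumption: CloudWatch publishes the AWS/S3 storage metrics for this
    # bucket (it does so roughly once per day, so the value is not live).
    cw = boto.ec2.cloudwatch.connect_to_region('us-east-1')
    stats = cw.get_metric_statistics(
        period=86400,
        start_time=datetime.utcnow() - timedelta(days=2),
        end_time=datetime.utcnow(),
        metric_name='BucketSizeBytes',
        namespace='AWS/S3',
        statistics=['Average'],
        dimensions={'BucketName': 'bucket_name',
                    'StorageType': 'StandardStorage'},
        unit='Bytes')
    if stats:
        print('Bucket size in bytes: %d' % stats[-1]['Average'])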


I found something: you can get the number of keys in the bucket with this code:

    from boto.s3.connection import S3Connection

    conn = S3Connection('<aws access key>', '<aws secret key>')
    bucket = conn.get_bucket("bucket_name")
    # Note: get_all_keys() returns at most one page of results
    # (up to 1000 keys) per call.
    number_of_keys = len(bucket.get_all_keys())

I still need the bucket size (in bytes).
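Both numbers can be collected in the same pass over the listing (still O(number of keys), as the other answer notes). A minimal sketch along those lines, reusing the same connection setup:

    from boto.s3.connection import S3Connection

    conn = S3Connection('<aws access key>', '<aws secret key>')
    bucket = conn.get_bucket("bucket_name")

    total_size = 0
    number_of_keys = 0
    # bucket.list() pages through the listing automatically, so this also
    # works for buckets with more than 1000 keys.
    for key in bucket.list():
        total_size += key.size
        number_of_keys += 1

    print('%d keys, %d bytes' % (number_of_keys, total_size))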


