How do I get the file / key size in boto S3?

Python, Amazon S3, Boto

Python Problem Overview


There must be an easy way to get the file size (key size) without pulling over a whole file. I can see it in the Properties of the AWS S3 browser. And I think I can get it off the "Content-length" header of a "HEAD" request. But I'm not connecting the dots about how to do this with boto. Extra kudos if you post a link to some more comprehensive examples than are in the standard boto docs.

EDIT: So the following seems to do the trick (though from looking at the source code I'm not completely sure):

import boto
import boto.s3.key

conn = boto.connect_s3()
bk = conn.get_bucket('my_bucket_name')
ky = boto.s3.key.Key(bk, 'my_key_name')
ky.open_read()  # sends a GET request; headers (including the size) are populated
print(ky.size)

For now I'll leave the question open for comments, better solutions, or pointers to examples.

Python Solutions


Solution 1 - Python

This would work:

bk = conn.get_bucket('my_bucket_name')
key = bk.lookup('my_key_name')  # returns None if the key does not exist
print(key.size)  # size in bytes

The lookup method simply does a HEAD request on the bucket for the key name, so it returns all of the headers (including Content-Length) for the key but does not transfer any of the key's actual content.
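
As a side note, the key object returned by lookup exposes the other HEAD-derived headers too; a minimal sketch using boto 2's Key attributes (bucket and key names are placeholders):

key = bk.lookup('my_key_name')
if key is not None:
    print(key.size)           # Content-Length, in bytes
    print(key.last_modified)  # Last-Modified header
    print(key.content_type)   # Content-Type header
    print(key.etag)           # ETag header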

The S3 tutorial mentions this but not very explicitly and not in this exact context. I'll add a section on this to help make it easier to find.

Note: for every old link like http://boto.cloudhackers.com/s3_tut.html that returns a 404, insert "/en/latest" right after ".com": http://boto.cloudhackers.com/en/latest/s3_tut.html. (Someone needs to explore mod_rewrite...)

Solution 2 - Python

In boto3:

The client's head_object call also performs a HEAD request to retrieve the object's metadata:

import boto3

s3 = boto3.client('s3')
response = s3.head_object(Bucket='bucketname', Key='keyname')
size = response['ContentLength']  # size in bytes
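
The same head_object response carries other useful fields as well; a minimal sketch, assuming the standard boto3 head_object response keys (bucket and key names are placeholders):

import boto3

s3 = boto3.client('s3')
response = s3.head_object(Bucket='bucketname', Key='keyname')
print(response['ContentLength'])  # size in bytes
print(response['LastModified'])   # datetime of last modification
print(response['ContentType'])    # MIME type
print(response['ETag'])           # entity tag (the MD5 for non-multipart uploads)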

Solution 3 - Python

In boto3, using an S3 resource:

boto3.resource('s3').Bucket(bucketname).Object(keyname).content_length  # lazily issues a HEAD request on first access

For me, the head_object call on the S3 client returned an HTTP "403 Forbidden", while this resource-based call did not.
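
If you run into errors like that, one option is to wrap the lookup in error handling; a minimal sketch using botocore's ClientError (the 403/404 interpretation reflects common S3 behavior and is not from the original answer):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')
obj = s3.Object('bucketname', 'keyname')
try:
    print(obj.content_length)  # first attribute access issues the underlying HEAD request
except ClientError as err:
    code = err.response['Error']['Code']
    if code == '404':
        print('Object does not exist')
    elif code == '403':
        # S3 commonly returns 403 instead of 404 when the caller lacks s3:ListBucket
        print('Access denied - check IAM permissions')
    else:
        raise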

Solution 4 - Python

In Boto 3:

Using an S3 Object resource you can fetch the file (a.k.a. object) size in bytes; it is a resource representing the Amazon S3 object.

In fact, you can get all metadata related to the object, such as content_length (the object size), content_language (the language the content is in), content_encoding, last_modified, and so on.

import boto3

s3 = boto3.resource('s3')
obj = s3.Object('bucket_name', 'key')
file_size = obj.content_length  # size in bytes

Reference: the boto3 documentation.
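
To illustrate the other metadata attributes mentioned above, here is a minimal sketch; the attribute names (last_modified, content_encoding, content_language, e_tag) are standard boto3 Object resource attributes, and the bucket/key names are placeholders:

import boto3

s3 = boto3.resource('s3')
obj = s3.Object('bucket_name', 'key')
print(obj.content_length)    # size in bytes
print(obj.last_modified)     # datetime of last modification
print(obj.content_encoding)  # e.g. 'gzip', if it was set on upload
print(obj.content_language)  # if it was set on upload
print(obj.e_tag)             # ETag of the object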

Solution 5 - Python

You can also get a list of all objects if multiple files need to be checked. For a given bucket, run list_objects_v2 and then iterate through the response's 'Contents'. For example:

import boto3

s3_client = boto3.client('s3')
response_contents = s3_client.list_objects_v2(
    Bucket='name_of_bucket'
).get('Contents')

You'll get a list of dictionaries like this:

[{'Key': 'path/to/object1', 'LastModified': datetime(...), 'ETag': '"some etag"', 'Size': 2600, 'StorageClass': 'STANDARD'},
 {'Key': 'path/to/object2', 'LastModified': datetime(...), 'ETag': '"some etag"', 'Size': 454, 'StorageClass': 'STANDARD'},
 ...]

Notice that each dictionary in the list contains a 'Size' key, which is the size of that particular object in bytes. The list is iterable:

for rc in response_contents:
    print(f"Size: {rc.get('Size')}")

You get sizes for all files you might be interested in:

Size: 2600
Size: 454
Size: 2600
...
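
Note that list_objects_v2 returns at most 1,000 objects per call. A minimal sketch using boto3's built-in paginator to walk a larger bucket and total up the sizes (the bucket name is a placeholder):

import boto3

s3_client = boto3.client('s3')
paginator = s3_client.get_paginator('list_objects_v2')

total_size = 0
for page in paginator.paginate(Bucket='name_of_bucket'):
    for obj in page.get('Contents', []):
        print(f"{obj['Key']}: {obj['Size']} bytes")
        total_size += obj['Size']

print(f"Total: {total_size} bytes")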

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | mjhm | View Question on Stackoverflow
Solution 1 - Python | garnaat | View Answer on Stackoverflow
Solution 2 - Python | Kristian | View Answer on Stackoverflow
Solution 3 - Python | oyophant | View Answer on Stackoverflow
Solution 4 - Python | satznova | View Answer on Stackoverflow
Solution 5 - Python | Leo Skhrnkv | View Answer on Stackoverflow