AWS S3: how do I see how much disk space is in use?

Amazon S3, Amazon Web Services

Amazon S3 Problem Overview


I have an AWS account, and I'm using S3 to store backups from different servers. Is there any information in the AWS console about how much disk space is in use in my S3 buckets?

Amazon S3 Solutions


Solution 1 - Amazon S3

The command line tool gives a nice summary by running:

aws s3 ls s3://mybucket --recursive --human-readable --summarize
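If you only care about one prefix (say, one server's backups), the same command can be pointed at that prefix; the backups/server1/ path below is just a hypothetical example:

aws s3 ls s3://mybucket/backups/server1/ --recursive --human-readable --summarize

The --summarize flag prints "Total Objects" and "Total Size" lines at the end of the listing.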

Solution 2 - Amazon S3

Yippee - an update to the AWS CLI allows you to recursively ls through buckets. The third column of that listing is the object size in bytes, so the awk below converts the running total to MB:

aws s3 ls s3://<bucketname> --recursive  | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'

Solution 3 - Amazon S3

To find out size of S3 bucket using AWS Console:

  1. Click the S3 bucket name
  2. Select the "Metrics" tab
  3. You should see "Bucket metrics", which by default include "Total bucket size"

Solution 4 - Amazon S3

s3cmd can show you this by running s3cmd du, optionally passing the bucket name as an argument.
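For example, assuming a recent s3cmd where the -H (human-readable sizes) option is available, and with mybucket as a placeholder:

s3cmd du -H s3://mybucket

As noted above, omitting the bucket argument totals across all of your buckets.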

Solution 5 - Amazon S3

The AWS CLI now supports the --query parameter, which takes a JMESPath expression.

This means you can sum the size values given by list-objects using sum(Contents[].Size) and count the objects with length(Contents[]).

This can be run using the official AWS CLI as shown below; the feature was introduced in February 2014:

 aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
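If you want something friendlier than a raw byte count, one option (assuming jq is installed; BUCKETNAME remains a placeholder) is to post-process the JSON:

aws s3api list-objects --bucket BUCKETNAME --output json \
    --query "[sum(Contents[].Size), length(Contents[])]" | \
    jq -r '"\(.[0] / 1024 / 1024 / 1024) GiB in \(.[1]) objects"'

Note that list-objects still has to enumerate every object, so this can be slow on very large buckets; the CloudWatch metric in Solution 7 avoids that.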

Solution 6 - Amazon S3

On a Linux box that has Python (with the pip installer), grep and awk, install the AWS CLI (command line tools for EC2, S3 and many other services):

sudo pip install awscli

Then create a .awssecret file in your home folder with the content below (adjust the key, secret and region as needed):

[default]
aws_access_key_id=<YOUR_KEY_HERE>
aws_secret_access_key=<YOUR_SECRET_KEY_HERE>
region=<AWS_REGION>

Make this file readable and writable by your user only:

sudo chmod 600 .awssecret

and export it to your environment

 export AWS_CONFIG_FILE=/home/<your_name>/.awssecret

Then run the following in the terminal (this is a single command, split with \ for readability):

aws s3 ls s3://<bucket_name>/foo/bar | \
grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | \
awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
  • the aws part lists the bucket (or optionally a 'sub-folder')
  • the grep part removes (using -v) the lines that match the regular expression (using -E); ^$ matches blank lines and -- matches the separator lines in the output of aws s3 ls
  • the last awk simply adds the 3rd column of the resulting output (the object size in bytes) to total, then prints it in MB at the end

NOTE: this command works on the current bucket or 'folder' only, not recursively.
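As a side note, on current versions of the AWS CLI you usually don't need the hand-written .awssecret file or the AWS_CONFIG_FILE export at all; running

aws configure

interactively writes the standard ~/.aws/credentials and ~/.aws/config files, and the aws s3 commands above pick them up automatically.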

Solution 7 - Amazon S3

CloudWatch also provides metrics for your S3 buckets. It shows you metrics for bucket size and object count. Go to Services > Management Tools > CloudWatch, pick the region where your S3 bucket is, and the size and object count metrics will be among those available.
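If you prefer the command line to the console, the same storage metric can be pulled with get-metric-statistics. A rough sketch (mybucket is a placeholder, the date arithmetic assumes GNU date, and StandardStorage only covers the Standard storage class):

aws cloudwatch get-metric-statistics \
    --namespace AWS/S3 \
    --metric-name BucketSizeBytes \
    --dimensions Name=BucketName,Value=mybucket Name=StorageType,Value=StandardStorage \
    --start-time "$(date -u -d '2 days ago' +%Y-%m-%dT%H:%M:%S)" \
    --end-time "$(date -u +%Y-%m-%dT%H:%M:%S)" \
    --period 86400 \
    --statistics Average

The metric is only published about once a day, which is why the query window spans two days.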

Solution 8 - Amazon S3

See https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket

Answered by Vic...

<?php
if (!class_exists('S3')) require_once 'S3.php';

// Instantiate the class
$s3 = new S3('accessKeyId', 'secretAccessKey');
S3::$useSSL = false;

// List your buckets:
echo "S3::listBuckets(): ";
echo '<pre>' . print_r($s3->listBuckets(), 1). '</pre>';

$totalSize = 0;
$objects = $s3->getBucket('name-of-your-bucket');
foreach ($objects as $name => $val) {
    // If you want to get the size of a particular directory, you can do
    // only that.
    // if (strpos($name, 'directory/sub-directory') !== false)
    $totalSize += $val['size'];
}

echo ($totalSize / 1024 / 1024 / 1024) . ' GB';
?>

Solution 9 - Amazon S3

In addition to Christopher's answer:

If you need to count the total size of a versioned bucket, use:

aws s3api list-object-versions --bucket BUCKETNAME --output json --query "[sum(Versions[].Size)]"

It counts both the latest and archived (noncurrent) versions.
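In the same spirit as Solution 5, you can also get the version count alongside the total size (BUCKETNAME is a placeholder):

aws s3api list-object-versions --bucket BUCKETNAME --output json \
    --query "[sum(Versions[].Size), length(Versions[])]"

Delete markers are listed separately (under DeleteMarkers) and have no Size field, so they don't contribute to this sum.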

Solution 10 - Amazon S3

Getting the size of large buckets via the API (either the AWS CLI or s4cmd) is quite slow. Here's how to parse the S3 Usage Report with a bash one-liner (the division by 24 converts the report's byte-hour storage values into an average daily byte count):

cat report.csv | awk -F, '{printf "%.2f GB %s %s \n", $7/(1024**3 )/24, $4, $2}' | sort -n

Solution 11 - Amazon S3

The AWS console won't show you this, but you can use Bucket Explorer or CloudBerry Explorer to get the total size of a bucket. Both have free versions available.

Note: these products still have to get the size of each individual object, so it could take a long time for buckets with lots of objects.

Solution 12 - Amazon S3

Based on @cudds's answer:

function s3size()
{
    # For each bucket/prefix given, sum the 3rd column of `aws s3 ls` (object size in bytes)
    for path in "$@"; do
        size=$(aws s3 ls "s3://$path" --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{printf "%.2fGb\n", (total/1024/1024/1024)}')
        echo "[s3://$path]=[$size]"
    done
}

...

$ s3size bucket-a bucket-b/dir
[s3://bucket-a]=[24.04Gb]
[s3://bucket-b/dir]=[26.69Gb]

Also, Cyberduck conveniently lets you calculate the size of a bucket or a folder.

Solution 13 - Amazon S3

This is an old question, but since I was looking for the answer I ran across it. Some of the answers reminded me that I use S3 Browser to manage data. You can click on a bucket, open Properties, and it shows you the total. Pretty simple. I highly recommend the browser: https://s3browser.com/default.aspx?v=6-1-1&fam=x64

Solution 14 - Amazon S3

You asked for information in the AWS console about how much disk space is in use in your S3 cloud.

I go to the Billing Dashboard and check the S3 usage in the current bill.

They give you the information month to date (MTD), in GB to 6 decimal places, in other words down to the KB level.

It's broken down by region, but adding them up (assuming you use more than one region) is easy enough.

BTW: You may need specific IAM permissions to get to the Billing information.

Solution 15 - Amazon S3

Mini John's answer totally worked for me! Awesome... I had to add

--region eu-west-1

since I'm in Europe, though.
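For reference, assuming the command in question was the s3api one-liner from Solution 5 (that attribution is a guess, since the referenced answer is not on this page), the full call would look something like:

aws s3api list-objects --bucket BUCKETNAME --output json \
    --query "[sum(Contents[].Size), length(Contents[])]" --region eu-west-1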

Solution 16 - Amazon S3

Well, you can also do it through an S3 client if you prefer a human-friendly UI.

I use CrossFTP, which is free and cross-platform. There you can right-click on the folder/directory -> select "Properties..." -> click the "Calculate" button next to Size, and voila.

Solution 17 - Amazon S3

s3admin is an open-source app (UI) that lets you browse buckets, calculate total size, and show the largest/smallest files. It's tailored to give a quick overview of your buckets and their usage.

Solution 18 - Amazon S3

I am going to add AWS S3 Storage Lens here, with its default dashboard.

https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage-lens-optimize-storage.html?icmpid=docs_s3_hp_storage_lens_dashboards

It is really useful for identifying hidden storage costs like incomplete multipart uploads.

It should probably now be the first port of call for answering this question, before you reach for code.

Solution 19 - Amazon S3

I use Cloud Turtle to get the size of individual buckets. If the bucket size exceeds 100 GB, it can take some time to display the size. Cloud Turtle is freeware.

Attributions

All content for this solution is sourced from the original question on Stack Overflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type: Original Author
Question: KennyPowers
Solution 1 - Amazon S3: thaavik
Solution 2 - Amazon S3: cudds
Solution 3 - Amazon S3: endriju
Solution 4 - Amazon S3: markusk
Solution 5 - Amazon S3: Christopher Hackett
Solution 6 - Amazon S3: JScoobyCed
Solution 7 - Amazon S3: rowelee
Solution 8 - Amazon S3: JonLovett
Solution 9 - Amazon S3: ruletkin
Solution 10 - Amazon S3: Jakub Głazik
Solution 11 - Amazon S3: Geoff Appleford
Solution 12 - Amazon S3: Evgeny Goldin
Solution 13 - Amazon S3: user7191982
Solution 14 - Amazon S3: Danny Schoemann
Solution 15 - Amazon S3: pitxon_net
Solution 16 - Amazon S3: Yiannis Tsimalis
Solution 17 - Amazon S3: maksion
Solution 18 - Amazon S3: JamesKn
Solution 19 - Amazon S3: Sangram Anand