How can I find out why my storage space on Amazon EC2 is full?

Linux, Amazon Web-Services, Amazon EC2

Linux Problem Overview


When I run df -h on my Amazon EC2 server, this is the output:

[ec2-user@ip-XXXX ~]$ df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/xvda1             25G   25G     0 100% /
tmpfs                 4.0G     0  4.0G   0% /dev/shm

For some reason, something is eating up my storage space.

I am trying to find all of the big files/folders and this is what I get back:

[ec2-user@ip-XXXX ~]$ sudo du -a / | sort -n -r | head -n 10
993580  /
639296  /usr
237284  /usr/share
217908  /usr/lib
206884  /opt
150236  /opt/app
150232  /opt/app/current
150224  /opt/app/current/[deleted].com
113432  /usr/lib64

How can I find out what's eating my storage space?

Linux Solutions


Solution 1 - Linux

Well, I think it's one (or more) log files that have grown too large and need to be removed or backed up. I would suggest going after the big files first. So, find all files greater than 10 MB (10 MB is a reasonably large size; you can similarly use +1M for 1 MB):

sudo find / -type f -size +10M -exec ls -lh {} \;

and now you can identify which ones are causing the trouble and deal with them accordingly.

As for your original du -a / | sort -n -r | head -n 10 command, that won't work well: since it sorts by size, every ancestor directory of a large file rises to the top of the list, while the individual file itself will most probably be pushed out of the first ten lines.
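One way around that pitfall is to sort du's human-readable output with sort -h (a GNU coreutils option), staying on one filesystem with -x so pseudo-filesystems like /proc don't pollute the list. A minimal sketch, run against a throwaway directory here so it needs no sudo; point it at / on a real system:

```shell
# Demo on a temporary directory: create a large and a small file,
# then list entries largest-first. du -ah prints every file and
# directory with a human-readable size; sort -rh orders those sizes.
demo=$(mktemp -d)
dd if=/dev/zero of="$demo/big.log" bs=1M count=5 2>/dev/null
dd if=/dev/zero of="$demo/small.log" bs=1K count=5 2>/dev/null
du -ah "$demo" | sort -rh | head -5
rm -rf "$demo"
```

On the full system the equivalent would be sudo du -xah / 2>/dev/null | sort -rh | head -20, which keeps individual large files visible next to their parent directories.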

Note: Other log files/binaries of the same kind often sit alongside the files you find this way, so as a suggestion, cd into the directory containing a large file to clean up more files of the same kind. You can then iterate, lowering the threshold to files greater than 1 MB, and so on.

Solution 2 - Linux

If you are not able to find any gigantic file, killing some processes might solve the issue (it worked for me; read the full answer to see why).

Earlier:

/dev/xvda1       8256952 7837552         0 100% /

Now

/dev/xvda1       8256952 1062780   6774744  14% /

Reason: If you do rm <filename> on a file which is currently held open by a process, the file is not really deleted, and the process can still be writing to it. These ghost files can't be found by the find command and can't be removed with rm, yet they keep consuming disk space. Use this command to find out which processes are using deleted files:

lsof +L1

Kill those processes to release the files. Sometimes it's difficult to kill every process using a file; restarting the system also works (not elegant, but it's a quick solution that guarantees no process still holds a deleted file).

Read This: https://serverfault.com/questions/232525/df-in-linux-not-showing-correct-free-space-after-file-removal/232526

Solution 3 - Linux

At /, type du -hs * as root:

sudo su -
cd /; du -hs *

You will see the full size of each top-level folder and can identify the bigger ones.
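To get that list already ordered largest-first, pipe it through sort -rh (the -h flag for human-readable sizes is a GNU coreutils extension):

```shell
# Summarize each top-level directory and sort by size, largest first.
# '--' guards against directory names that start with a dash.
cd /
sudo du -hs -- * 2>/dev/null | sort -rh
```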

Solution 4 - Linux

This space can be consumed by mail notifications.

You can check it by typing

sudo find / -type f -size +1000M -exec ls -lh {} \;

It will list files larger than 1000 MB.

The results may include the mailbox file

/var/mail/username

You can free that space by running the following command

> /var/mail/username

Note that the greater-than (>) symbol is not a prompt; you have to run the command with it.
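The bare > works because the shell opens the file with O_TRUNC, emptying it in place without deleting it (so the mail system keeps a valid mailbox file). If that syntax feels too cryptic, the same effect can be had more explicitly; the path is the one from the answer above:

```shell
# All three empty the file in place rather than deleting it:
> /var/mail/username            # redirection with no command
: > /var/mail/username          # same, with the no-op ':' builtin
truncate -s 0 /var/mail/username  # GNU coreutils, clearest in scripts
```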

Now check your free space with

df -h

Now you have enough free space. Enjoy... :)

Solution 5 - Linux

ansh0l's answer is the way to go to find large files. But, if you want to see how much space each directory in your file system is consuming, cd to the root directory, then do du -k --max-depth=1. This will show you how much space is being consumed by each subdirectory within the root directory. When you spot the culprit, cd into that directory, run the same command again, and repeat until you find the files that are consuming all of the space.
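The drill-down loop described above looks like this in practice; /var in the second step is just an example of where the first step might point:

```shell
# Step 1: one level of subdirectories under /, biggest first.
cd /
sudo du -k --max-depth=1 2>/dev/null | sort -nr | head -15
# Step 2: suppose /var dominated; descend and repeat.
cd /var
sudo du -k --max-depth=1 2>/dev/null | sort -nr | head -15
# ...repeat until the offending files appear.
```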

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | D_R | View Question on Stackoverflow
Solution 1 - Linux | Anshul Goyal | View Answer on Stackoverflow
Solution 2 - Linux | user18853 | View Answer on Stackoverflow
Solution 3 - Linux | Ricardo Martins | View Answer on Stackoverflow
Solution 4 - Linux | Krishan Kumar Mourya | View Answer on Stackoverflow
Solution 5 - Linux | mti2935 | View Answer on Stackoverflow