Finding the space hogs on a Linux server

I was trying to cut the fat out of a disk-challenged server recently and wanted to see what was taking up all the space. This link was a good start, but after some cleanup I ended up with a more useful version of the command, which exports the output to a tab-delimited flat file you can open in Excel:

# find /var/www/ -type f -size +10000k -exec ls -lh {} \; | awk '{ print $9 "\t" $5 }' > bigfiles.txt

This spits out the name and size of every file under /var/www/ that is larger than roughly 10MB.
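If your server has GNU find (most Linux boxes do), a variant worth trying skips the per-file ls invocation entirely and prints the size in bytes followed by the path, which also makes it easy to sort biggest-first before writing the file. This is just a sketch of the same idea, not the original command:

# find /var/www/ -type f -size +10000k -printf '%s\t%p\n' | sort -rn > bigfiles.txt

The sizes come out as raw byte counts rather than ls's human-readable form, but that sorts more reliably and Excel has no trouble with it.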

To come at it from the other direction and find every directory over 1GB in size, sorted, do

# du -h / | grep '^[0-9.]*G' | sort -rn

See this link for several other variants of this.
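One more variant, assuming a reasonably recent GNU coreutils (whose sort understands human-readable sizes via -h): let sort handle the unit suffixes itself and just take the top of the list, limiting du to a couple of directory levels so the output stays readable. A sketch, with the depth and count picked arbitrarily:

# du -h --max-depth=2 / 2>/dev/null | sort -rh | head -n 20

This drops the grep filter, so it also surfaces the directories sitting just under 1GB that the pattern match would skip.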
