Terminal Tip: Find files by size

While helping a client track down a disk-usage overage on their hosting account, I found a quick command useful for locating the large-file culprits.

I connected to their server over SSH and ran the following one-liner, which surfaces files larger than roughly 100 MB:

find / -mount -noleaf -type f -size +100000k -print0 | xargs -0 ls -lhSr | perl -ne '/(\S+\s+){4}(\S+)\s+(\S+\s+){3}(.*)/ and printf("%*s %s\n",7,$2.":",$4);'
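
Piece by piece: find walks the filesystem from / without crossing into other mounts and prints the NUL-separated paths of regular files over the threshold (+100000k means more than 100,000 KiB, roughly 100 MB); xargs -0 hands those paths to ls -lhSr, which lists them with human-readable sizes sorted smallest to largest; and the perl one-liner trims each line down to just the size and path columns.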

Voilà, this exposed:
100M: /var/lib/mysql/asdf-123.accessdomain.com.err
100M: /home/domain1/public_html/wp-content/uploads/2015/09/100MB.zip
102M: /usr/lib/locale/locale-archive
108M: /home/domain12/.trash/public_html.zip
119M: /home/domain13/public_html/wp-admin/core.20977
178M: /home/domain1/public_html/core.21078
201M: /home/domain14/public_html/wp-content/uploads/2015/09/210MB.zip
270M: /home/domain/logs/domain.ca-ssl_log-Jun-2017.gz
438M: /home/domain1/tmp/analog/ssl/domain.ca/cache
464M: /home/domain1/logs/domain.php.error.log
1.3G: /home/domain15/public_html/wp-content/tarfile.tar.gz
1.3G: /home/domain1/.trash/tarfile.tar.gz
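
If your server has GNU findutils (the -noleaf flag above already implies it), find's own -printf can produce a similar report without the ls/perl stage. A minimal sketch, assuming GNU find and coreutils; the formatting differs slightly from the output above:

# Size in bytes and path for files over 100 MiB, sorted ascending.
# Note: paths containing newlines would break this line-based pipeline.
find / -mount -type f -size +100M -printf '%s %p\n' 2>/dev/null \
  | sort -n \
  | awk '{ sz = $1; sub(/^[0-9]+ /, ""); printf "%6.0fM: %s\n", sz / 1048576, $0 }'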

I was able to easily clean up these archive files (I didn't touch the wp-admin/core or /usr/lib files because I don't know what those are) and reduce the disk usage from 84% to 67%.
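
To confirm the reclaimed space, or to see which accounts are heaviest before hunting for individual files, the standard df/du pair is handy (a quick sketch; the /home path is illustrative):

# Overall usage of the filesystem holding /home
df -h /home

# Per-account totals, smallest to largest (GNU sort -h pairs with du -h)
du -sh /home/* 2>/dev/null | sort -h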

With the underlying cause of the daily alerts resolved, I have re-enabled the warning alerts at the 82% threshold.
