List Directories and Files recursively on thousands of FTP Servers

Only a few lines of code under Linux

honze
5 min read · Apr 13, 2018

Motivation

Recently I came across the interesting task of checking thousands of machines for sensitive data. Nobody wants to leak data through a misconfigured server. A classic culprit is a forgotten FTP server holding an old backup of some machine or service. Such backups often contain credentials, private keys, database dumps, or whatever an administrator felt was worth backing up. Backups themselves are not a bad idea, but putting them on an anonymously accessible FTP server is a big no-no. It is worth noting that I do not blame any administrator for this; I show them the issue.

So I had a list of servers to check. A few would have been easy: just find the FTP port with nmap (if not 21) and start browsing. But my list had over one thousand servers, and some had deep directory structures. I like point-and-click adventures, but this one could get boring very quickly.

At first I looked for a ready-to-use tool that would solve my task, but to my surprise I could not find one that I liked. I am not saying there is none; I am probably just a bit too picky. Then I thought: this is the ideal situation to script one myself. Of course it would not be a polished piece of art. It only had to get the job done.

I divided the task into several parts.

  • The first part is to list the contents of an FTP server recursively,
  • the second part is to do that for thousands of machines and
  • the final part is to analyze the data.

List recursively

There are a lot of ways to list the file system of an FTP server. You can use ftp, wget, or any other tool that can handle FTP and knows what recursion is. But I found ncftpls from the ncftp package. It is an FTP client meant to be used in scripts, a perfect match for my task.

The man page has an example for recursive listing:

ncftpls -R ftp://ftp.ncftp.com/

This produces the following output:

drwxr-xr-x 3 ftpuser ftpusers 3 Sep 6 2006 gnu
drwxr-xr-x 3 ftpuser ftpusers 6 Nov 27 2016 libncftp
drwxr-xr-x 5 ftpuser ftpusers 10 Dec 4 2016 ncftp
drwxr-xr-x 8 ftpuser ftpusers 9 Jan 30 18:56 ncftpd
drwxr-xr-x 3 ftpuser ftpusers 5 Jan 11 2001 unixstuff
drwxr-xr-x 5 ftpuser ftpusers 6 Sep 6 2006 winstuff
./gnu:
drwxr-xr-x 3 ftpuser ftpusers 4 Sep 6 2006 gnuplot
./gnu/gnuplot:
drwxr-xr-x 2 ftpuser ftpusers 16 Sep 6 2006 binaries
-rw-r--r-- 1 ftpuser ftpusers 4367613 Jul 14 2006 gnuplot+png-4.0.0.tar.gz
### [A LOT MORE DIRECTORIES] ###
./winstuff:
drwxr-xr-x 2 ftpuser ftpusers 3 Mar 2 1999 cpse
lrwxr-xr-x 1 ftpuser ftpusers 18 Aug 9 2007 ncdbm -> ../unixstuff/ncdbm
drwxr-xr-x 2 ftpuser ftpusers 4 May 15 1999 trayping
drwxr-xr-x 2 ftpuser ftpusers 4 Nov 12 1998 trayspy
./winstuff/cpse:
-rw-r--r-- 1 ftpuser ftpusers 569071 May 11 1998 cpse.zip
./winstuff/trayping:
-rw-r--r-- 1 ftpuser ftpusers 605546 Mar 2 1999 trayping-1.0.zip
-rw-r--r-- 1 ftpuser ftpusers 610346 May 15 1999 trayping.zip
./winstuff/trayspy:
-rw-r--r-- 1 ftpuser ftpusers 639479 Nov 12 1998 trayspy.zip
-rw-r--r-- 1 ftpuser ftpusers 176497 Nov 12 1998 trayspysrc.zip

193 lines in just 1.9 seconds, so roughly 100 lines per second. That is reasonably fast for a remote recursive listing. Note that some FTP servers do not allow recursive listing this way; there you will only get the listing of /. In my case I did not encounter this problem.

Parallel execution

If you want to check thousands of servers, you do not want to check them one after the other; you want parallel execution. This is a job for GNU parallel. If you know xargs a bit, this will be easy for you. If not: no problem, the man page of parallel is a big help, pun intended!

Thinking a bit ahead, I knew that I wanted to accumulate all results into a single file. To make things easier and more readable, I often write a small shell script that handles a single server and then hand this script to parallel.

This is the content of my ftp-ls-R.sh:

#!/bin/sh
# Print the hostname, then its recursive listing.
echo "$1"
ncftpls -R "ftp://$1"

It is just a simple script that takes the hostname as its first argument. It prints the hostname first so that we can see which listing belongs to which server (ncftpls itself will not write the hostname to the console). Then ncftpls is called with -R for a recursive listing and the mandatory ftp:// prefix before the hostname.
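For longer runs, a slightly hardened variant of the script can help; this is my own sketch, not part of the original post. It assumes ncftpls accepts a -t timeout option like its sibling tools ncftpget and ncftpput, so that one dead server cannot stall a worker slot indefinitely:

```shell
# Hypothetical hardened ftp-ls-R.sh (assumption: ncftpls supports -t
# for a timeout; the 30-second value is an illustrative choice).
cat > ftp-ls-R.sh <<'EOF'
#!/bin/sh
# Print the hostname first so each listing in the merged output
# can be attributed to its server.
echo "$1"
# -R: recursive listing; -t 30: give up on a stalled connection.
# The hostname is quoted so unusual input cannot split the argument.
ncftpls -t 30 -R "ftp://$1"
EOF
chmod +x ftp-ls-R.sh
```

The quoting costs nothing, and a timeout keeps a single unreachable host from blocking one of the parallel slots for the whole run.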

All hosts to be checked are in hosts.txt. Now it is time to call parallel:

cat hosts.txt | parallel -k --bar -j16 ./ftp-ls-R.sh | tee result.txt

parallel reads all hostnames from standard input, so just pipe them in with cat. The option -k preserves the order of the output, so nothing gets mixed up just because one FTP server is faster than another. I really like a progress bar, so I use --bar, and I want 16 parallel executions, hence -j16. Finally the output is piped to tee, which lets me watch the execution and keeps a copy on disk.
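If GNU parallel is not available, plain xargs can approximate the same fan-out; this is my own alternative sketch, not part of the original workflow, and it loses the -k ordering guarantee and the progress bar. The stand-in files below exist only so the snippet runs on its own:

```shell
# Stand-in inputs so this snippet is self-contained; in the real run,
# hosts.txt holds your targets and ftp-ls-R.sh is the script above.
printf 'ftp1.example.com\nftp2.example.com\n' > hosts.txt
printf '#!/bin/sh\necho "listing for $1"\n' > ftp-ls-R.sh
chmod +x ftp-ls-R.sh

# -P 16: up to 16 concurrent jobs; -n 1: one hostname per invocation.
# Unlike parallel -k, output from concurrent jobs may interleave.
xargs -P 16 -n 1 ./ftp-ls-R.sh < hosts.txt | tee result.txt
```

Because output can interleave under xargs, the hostname-first convention of the script becomes even more important for attributing listings to servers.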

Analysis

The job took only a few minutes to list a thousand servers recursively. Most of them were not configured for anonymous access, so for those result.txt contained only the hostname and no directory listing. But I discovered a few servers with backups of other machines. You can easily search result.txt for interesting keywords. I manually connected and inspected the files. No surprise: I found credentials, private keys and a whole database.
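The keyword search can be as simple as a case-insensitive grep. The keyword list below is my own illustrative choice, and the miniature result.txt is a stand-in so the snippet runs on its own:

```shell
# Miniature stand-in for result.txt; in practice this file is the
# output of the parallel run above.
cat > result.txt <<'EOF'
ftp1.example.com
-rw-r--r-- 1 ftp ftp 1048576 Jan 10 2018 db-backup.sql
ftp2.example.com
-rw-r--r-- 1 ftp ftp    1675 Feb  3 2018 id_rsa
-rw-r--r-- 1 ftp ftp     512 Feb  4 2018 readme.txt
EOF

# -i: ignore case, -n: print line numbers, -E: extended regex.
# The keyword list is an example; extend it for your own engagement.
grep -inE 'backup|dump|\.sql|id_rsa|passwd|credential' result.txt
```

The line numbers from -n make it easy to jump back into the full listing and see which server a hit belongs to.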

Conclusion

It is astonishing what you can find with so little effort. The scan did not take long, so this could be interesting for some red team scenarios or penetration tests.
