So, it’s been a while since I’ve blogged about anything. That’s because I’ve been busy with my cybersecurity startup, PrimeFort (www.primefort.net), for the past year!
Like every startup founder, I was wondering how I was going to get clients. As a hacker, my first move was to try to land a million-dollar company by hacking it and then approaching them to become our client, because at that point I literally had everything needed to run their business without them! :P
For privacy reasons, I’m not going to disclose the site’s name, so let’s call it examplesite.com.
Like everyone does, I started by crawling the site, beginning with robots.txt at ‘www.examplesite.com/robots.txt’ (in case you don’t know what robots.txt is, have a look at this documentation). They had listed only their product directory there, with no juicy information, so I went ahead and ran DirBuster for a couple of minutes. Nothing interesting turned up this time either,
and I was left staring at the website like in the pic below, questioning my very existence.
Initially, I assumed they had used some CMS to build their site, but it took me 30 minutes to realize I was being a fool, even though I already knew better!
After those 30 minutes of disappointment, I went out, had a cup of coffee, came back, and read everything they had mentioned on their site. Then I checked their source code, just to feel like a hacker at least, randomly searching for ‘WordPress’, ‘Joomla’, ‘safe’, ‘security’, and so on. That’s when I found a line on their site stating that, for security reasons, they hosted their files in an Amazon S3 bucket!
There’s a famous quote, “Know your enemy better than he knows himself,” and what follows is an example of it. The same applies when you try to hack something! xD
So I came to the conclusion that they stored their documents and files on AWS, but I wasn’t sure of their S3 bucket link.
Before that, what’s Amazon s3?
Amazon S3 is cloud storage for the Internet. To upload your data (photos, videos, documents, etc.), you first create a bucket in one of the AWS Regions. You can then upload any number of objects to the bucket; every object gets a unique URL, and objects can be served through Amazon’s CDN (CloudFront). By default, you can create up to 100 buckets per account, and each bucket can hold an unlimited number of objects. Buckets cannot be nested: you can’t create a bucket within a bucket. These days, many websites use Amazon’s services!
I then googled for a couple of hours and ended up finding a script called ‘Bucket Finder’, written in Ruby!
Script and installation manual: https://digi.ninja/projects/bucket_finder.php
So, what does Bucket Finder actually do? In the creator’s own words:
This is a fairly simple tool to run, all it requires is a wordlist and it will go off and check each word to see if that bucket name exists in the Amazon’s S3 system. Any that it finds it will check to see if the bucket is public, private or a redirect.
Public buckets are checked for directory indexing being enabled, if it is then all files listed will be checked using HEAD to see if they are public or private. Redirects are followed and the final destination checked. All this is reported on so you can later go through and analyse what has been found.
Then I wrote down every name related to the company I wanted to hack and made a wordlist to feed to the Bucket Finder script.
So now we have a guessed wordlist and the script; execute it as follows!
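A minimal sketch of this step, assuming a hypothetical company name of ‘examplesite’ (the real wordlist would contain every name variant you can think of):

```shell
# Build a small wordlist of likely bucket names (all hypothetical guesses)
printf '%s\n' \
    examplesite \
    examplesite-backup \
    examplesite-prod \
    examplesite-dev \
    examplesite-assets \
    examplesiteinc > wordlist.txt

# Feed it to Bucket Finder (needs ruby and the script from digi.ninja):
# ruby bucket_finder.rb wordlist.txt
```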
And this was its output.
Now you have to find a way to hack those buckets! For that to work, a bucket has to be misconfigured. So, how do you find one?
The output of bucket_finder looked something like:
Bucket redirects to examplesite.com, which redirects to examplesite.s3.amazonaws.com
Whatever comes before ‘.s3.amazonaws.com’ is the name of the bucket, and to interact with such buckets you’ll have to install aws-cli.
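In other words, stripping the ‘.s3.amazonaws.com’ suffix gives you the bucket name. A quick shell sketch (the hostname here is hypothetical):

```shell
# Hypothetical redirect target reported by bucket_finder
url="examplesite.s3.amazonaws.com"

# Everything before ".s3.amazonaws.com" is the bucket name
bucket="${url%%.s3.amazonaws.com*}"
echo "$bucket"   # prints: examplesite

# Listing the bucket then needs aws-cli configured with your own credentials:
# aws s3 ls "s3://$bucket"
```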
To install it, have a look at the documentation: http://docs.aws.amazon.com/cli/latest/userguide/installing.html
Once you’re done with the installation, you’ll have to configure aws-cli with an access key (assuming you already have an AWS account); you can find yours via the link below.
To configure it, have a look at the document below.
Once you’re done with the configuration part, you’re good to go!
To check whether a bucket is vulnerable, you’ll have to execute the following command in your terminal (once you’ve finished installing aws-cli, of course). Let’s see how to do that…
You’ll have to run: aws s3 ls s3://bucketname (bucketname is whatever comes before ‘.s3.amazonaws.com’).
In our case, it’s ‘aws s3 ls s3://examplesite’, and the output is as follows.
Let’s go ahead and see if there’s anything interesting on the server, and also try putting a file into it; you can run commands like ls, mv, and cp.
First, let’s try uploading a file.
And it uploaded successfully! Now let’s dissect the command.
aws s3 : invokes the S3 subcommand of aws-cli
mv : move the file (upload it, then remove the local copy)
test.txt : the file you want to upload
s3://examplesite : the bucket you want to upload your file to
Full command: aws s3 mv test.txt s3://examplesite
Once it succeeds, you’ll get output like ‘move: ./test.txt’ along with its path.
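A sketch of the upload step (bucket name hypothetical; note that `aws s3 mv` deletes your local copy after upload, while `aws s3 cp` keeps it):

```shell
# Create a harmless proof-of-concept file
echo "poc - security test" > test.txt

# Upload it (commented out: needs aws-cli credentials and a writable bucket)
# aws s3 mv test.txt s3://examplesite   # uploads, then removes local test.txt
# aws s3 cp test.txt s3://examplesite   # uploads, keeps local test.txt
```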
Now it’s time to find something juicy on the server!
I found a folder named ‘E-Mail’ and quickly ran ‘ls’ on it to see what was inside: all of it was users’ documents, digital signatures, birth certificates, Aadhaar cards! I tried downloading them, but got an ‘Access Denied’ error.
I also tried opening the file I had uploaded, but got an error like the one below,
and I was like, why always me!? :(
Then I went ahead and googled once again, since I didn’t want to hear “Google it, bro” from any mockers!
I found results in a couple of blogs saying that you have to change the permissions of files while uploading them, using the command below.
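The flag those blogs point to is aws-cli’s `--acl` option, which sets the object’s ACL at upload time. A sketch with hypothetical file and bucket names:

```shell
# Make a file world-readable at upload time with --acl public-read
echo "hello" > public-test.txt
# (commented out: needs aws-cli credentials and a writable bucket)
# aws s3 cp public-test.txt s3://examplesite/ --acl public-read
```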
But what about the sensitive files already in the bucket? I searched Google again and found a bunch of results, most of them saying we can’t do anything about existing files. Fortunately, one blog mentioned that by syncing, or by renaming the existing folder, you can change the permissions; the commands are as follows.
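For completeness, here is a sketch of both approaches (bucket and key names are hypothetical; `aws s3api put-object-acl` is the direct way to change an existing object’s ACL, if the bucket lets you):

```shell
bucket=examplesite            # hypothetical bucket name
key="E-Mail/document.pdf"     # hypothetical object key

# Direct: change the ACL of an existing object in place
# aws s3api put-object-acl --bucket "$bucket" --key "$key" --acl public-read

# Alternative from those blogs: re-sync the folder under a new name,
# applying the ACL you want to every object as it is copied
# aws s3 sync "s3://$bucket/E-Mail" "s3://$bucket/E-Mail-renamed" --acl public-read
```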
Then I downloaded my own documents from the site, since I was one of their clients, and finally I was able to download all of their customers’ e-mail IDs, the documents they had submitted, and literally everything their clients had provided them. But I didn’t stop there. I didn’t want to destroy anything, but I did want to do something evil that wouldn’t hurt anyone.
So I went ahead, found the path of the company’s logo, and replaced it with my SVG image *XSS*. Don’t tell anyone, k? It showed up as follows! :P
The content of my SVG image was as follows.
I uploaded it with the ‘public-read’ ACL,
replaced their original logo with this SVG, and boom! :D It executed.
Once I was done, I went straight to their office and met the CTO. I told him all about it, he was shocked, and that’s the story of how I got my first client!
To patch this, make sure your Amazon S3 ACLs are configured properly. That’s all for now.
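For bucket owners, a few standard aws-cli commands are worth running as a sanity check (bucket name hypothetical):

```shell
bucket=examplesite   # hypothetical bucket name

# Inspect who can do what (look for AllUsers / AuthenticatedUsers grants):
# aws s3api get-bucket-acl --bucket "$bucket"
# aws s3api get-bucket-policy --bucket "$bucket"

# Lock the bucket down by blocking all public access:
# aws s3api put-public-access-block --bucket "$bucket" \
#     --public-access-block-configuration \
#     BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```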
Hope you love reading it and learned something! :) Bhai bhai bhai!
A piece of advice for beginners: please don’t ask someone for one particular blog to read, because the knowledge will be narrow, since it comes from a single person.
Always remember: one blog link = one person = one perspective = less knowledge! Don’t lose hope, and most importantly, don’t die before death!
If you have any queries, drop me a mail at firstname.lastname@example.org! Cheers!
The Blog Content has been made available for informational and educational purposes only.
I hereby disclaim any and all liability to any party for any direct, indirect, implied, punitive, special, incidental, or other consequential damages arising directly or indirectly from any use of the blog content; readers are solely responsible for how they use it.