How I dumped crypto data by chaining directory listing with an open S3 bucket

Hello Everyone,

Today I am going to share one of my most interesting private bug bounty findings, a very unique S3 bucket issue. But before starting, let's understand what an S3 bucket is and why it is so important.

What Is an S3 Bucket:

Amazon S3 (Simple Storage Service) is a cloud storage service from AWS (Amazon Web Services) used to store files, folders, objects, etc. It is mostly used to store images, videos, PDFs, and text files, and in rare cases source backups, credentials in plain text, etc. S3 can be used through the AWS website or through the CLI (which we will be using).
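For reference, an S3 object can be reached through two URL styles, which is useful to know when you later spot a bucket behind a domain. A minimal Python sketch (the bucket name, key, and region below are hypothetical examples, not from this finding):

```python
# Sketch of the two URL styles under which an S3 object can be addressed.
# All names here are illustrative placeholders.

def virtual_hosted_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    """Virtual-hosted-style: the bucket name is part of the hostname."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def path_style_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    """Path-style: the bucket name is the first path segment."""
    return f"https://s3.{region}.amazonaws.com/{bucket}/{key}"

print(virtual_hosted_url("example-uploads", "images/logo.png"))
# → https://example-uploads.s3.us-east-1.amazonaws.com/images/logo.png
print(path_style_url("example-uploads", "images/logo.png"))
# → https://s3.us-east-1.amazonaws.com/example-uploads/images/logo.png
```

Recognizing either pattern in a hostname or a request path is often the first hint that an application is serving content straight out of a bucket.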

Also, S3 Transfer Acceleration helps execute fast, secure transfers from a client to an S3 bucket via AWS edge locations.

What Is the Problem:

Problems with AWS S3 bucket permissions are as old as the service itself. Two of the best-known studies of this issue were performed by Skyhigh, which found that 7% of all S3 buckets are open, and by Rapid7, which found that as many as 17% are open.

How I found an open S3 Bucket:

There are multiple ways to find the Amazon S3 bucket associated with a target application; for example, it can be found by brute forcing the target's name with many tools.
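As a sketch of the brute-forcing idea, candidate bucket names can be generated from the target's name plus common suffixes, then probed one by one. The suffix list below is an illustrative assumption, not the wordlist used in this finding:

```python
# Sketch: generate candidate S3 bucket names for a target.
# The suffix list is a small illustrative sample; real wordlists are larger.

COMMON_SUFFIXES = ["", "-assets", "-backup", "-uploads", "-static", "-prod", "-dev"]

def candidate_buckets(target: str) -> list[str]:
    """Combine the target name with common bucket-name suffixes."""
    return [f"{target}{suffix}" for suffix in COMMON_SUFFIXES]

for name in candidate_buckets("acme"):
    print(name)  # acme, acme-assets, acme-backup, ...
```

Each candidate would then be checked (for example with `aws s3 ls s3://<name>` or an unauthenticated HTTP request) to see whether it exists and is readable.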

During my recon with the Amass subdomain enumeration tool, I found a domain that looked very interesting to me. I started browsing it and found that it was vulnerable to directory listing. I quickly started checking each directory, but ended up with nothing: all of the directories listed their data, but I was unable to download it, not sure why!
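When a listable bucket sits behind such a domain, an unauthenticated GET on the bucket root returns a `ListBucketResult` XML document. A minimal sketch of pulling the object keys out of such a listing with Python's standard library (the sample XML, bucket name, and keys are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of an S3 ListBucketResult response.
SAMPLE = """<?xml version="1.0"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>example-uploads</Name>
  <Contents><Key>backup/db.zip</Key><Size>1048576</Size></Contents>
  <Contents><Key>index.html</Key><Size>512</Size></Contents>
</ListBucketResult>"""

NS = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}

def list_keys(xml_text: str) -> list[str]:
    """Extract every object key from a ListBucketResult document."""
    root = ET.fromstring(xml_text)
    return [c.findtext("s3:Key", namespaces=NS)
            for c in root.findall("s3:Contents", NS)]

print(list_keys(SAMPLE))  # → ['backup/db.zip', 'index.html']
```

Seeing keys in the listing but getting errors on download (as happened here) usually means the listing and the object permissions are configured differently.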

I was totally frustrated because I had wasted around 5–6 hours. I started looking around, trying to find bypasses using various methods such as the curl command, Wget, and others. But nothing worked!

I was about to drop the idea of looking around more in the same subdomain, but I wanted to give myself one last chance. While browsing the domain, I noticed a unique request, i.e.

serving content to domain “”. The data was nothing but the same data listed in the directory. I understood that the data validation was on the “” domain's side, not on the S3 bucket's side.

I quickly checked the above domain “”, as it was an S3 bucket, and found that the bucket was open to the public and contained the same data that was present on “”.
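A quick way to tell whether a bucket is open like this is the HTTP status code of an unauthenticated request to the bucket URL. The mapping below is a hedged sketch of common S3 behavior (200 for a publicly listable bucket, 403 for one that exists but is private, 404 for one that does not exist), expressed as a pure helper so it is easy to reuse:

```python
# Sketch: interpret the status code of an unauthenticated GET on a bucket URL.
# The mapping reflects typical S3 responses, not an official API contract.

def classify_bucket(status_code: int) -> str:
    """Map an HTTP status from an anonymous bucket request to a rough verdict."""
    if status_code == 200:
        return "public-listable"   # listing returned: bucket is open
    if status_code == 403:
        return "exists-private"    # AccessDenied: bucket exists but is protected
    if status_code == 404:
        return "not-found"         # NoSuchBucket
    return "unknown"

print(classify_bucket(200))  # → public-listable
```

In this case the bucket URL answered like an open bucket, which is what prompted switching to the AWS CLI.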

Now I understood completely how the application was behaving.

I quickly opened my terminal and fired S3 bucket commands. Everything worked fine: I was able to browse the data and download it, and not only download but also list, cp, and move it, because the bucket was open to the world (an open S3 bucket is also part of the OWASP Serverless Top 10) and was serving all the content to the domain.


  1. Execute the below commands from the CLI:

Listing files — aws s3 ls s3://<xyz>-uploads/

Deleting a file — aws s3 rm s3://<xyz>-uploads/test.html


Now I had many zip files, and after extracting them I got a DB config file containing a lot of information. A few items are shown below, but I can't mention everything because of the criticality of the data.

[Screenshot: DB version and table details]

[Screenshot: Metadata and table details]

Hope you enjoyed my write-up!


How To Fix It:

  1. Review the bucket ACLs to verify that WRITE and WRITE_ACP are only granted to specific users, never to groups such as AllUsers or AuthenticatedUsers.
  2. Review how you upload objects to S3 buckets and make sure you set the proper ACLs on both buckets and objects.
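The ACL review in step 1 can be automated. The sketch below audits an ACL dictionary shaped like the response of boto3's `get_bucket_acl` (an assumption; the sample ACL itself is hypothetical) and flags WRITE or WRITE_ACP grants to the public groups:

```python
# Sketch: flag risky ACL grants in a dict shaped like boto3's
# get_bucket_acl() response. The sample data below is hypothetical.

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}
RISKY_PERMISSIONS = {"WRITE", "WRITE_ACP"}

def risky_grants(acl: dict) -> list[tuple[str, str]]:
    """Return (group URI, permission) pairs granting WRITE/WRITE_ACP publicly."""
    findings = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if (grantee.get("Type") == "Group"
                and grantee.get("URI") in PUBLIC_GROUPS
                and grant.get("Permission") in RISKY_PERMISSIONS):
            findings.append((grantee["URI"], grant["Permission"]))
    return findings

sample_acl = {"Grants": [
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "WRITE"},
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
     "Permission": "FULL_CONTROL"},
]}
print(risky_grants(sample_acl))
# → [('http://acs.amazonaws.com/groups/global/AllUsers', 'WRITE')]
```

An empty result means no public group holds write access; anything returned should be removed from the bucket's ACL.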

Note: Newly created Amazon S3 buckets and objects are private and protected by default.

An independent security researcher and part-time bug bounty hunter; Security Engineer @Visa, previously @Paytm.
