Securing GCS Buckets: disable directory listing!

Tomáš Papež
Published in Vortex Cloud
Apr 1, 2024

Imagine you accidentally store sensitive customer data in a publicly accessible Google Cloud Storage bucket.

A simple web search could reveal all your files, exposing your company to a potential data breach.

While it’s common practice to make a GCS bucket publicly accessible by granting the Storage Object Viewer role to allUsers, as Google’s own documentation suggests, it’s crucial to understand the implications of this setting.

By consulting the GCS IAM roles documentation, you’ll notice the Storage Object Viewer role encompasses a broad range of permissions, namely:

resourcemanager.projects.get
resourcemanager.projects.list
storage.managedFolders.get
storage.managedFolders.list
storage.objects.get
storage.objects.list

This contrasts with the more restrictive Storage Legacy Object Reader role, which grants a single permission:

storage.objects.get

The term “Legacy” might deter you, but the role is likely to remain a viable option for the foreseeable future.
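As a sketch, this is how the two roles are typically granted to allUsers with gsutil (the bucket name is a placeholder):

```shell
# Broad: grants read AND list on all objects (the "objectViewer" shorthand).
gsutil iam ch allUsers:objectViewer gs://my-public-bucket

# Restrictive: read-only access to individual objects, no listing.
gsutil iam ch allUsers:roles/storage.legacyObjectReader gs://my-public-bucket
```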

What are the real world implications of using Storage Object Viewer?

The role allows listing every object in a bucket. When the bucket is public, anyone can discover and read every file and folder in it: simply navigating to the bucket’s root URL returns an XML document listing all contents, and that listing can even be indexed by search engines.
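You can see this for yourself with nothing but curl (again, the bucket name is a placeholder):

```shell
# An unauthenticated GET on the bucket root returns the XML object index.
# The response is a <ListBucketResult> document enumerating every object key.
curl "https://storage.googleapis.com/my-public-bucket"
```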

This is further illustrated by the gsutil ls command, which lists the objects in a bucket.
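Assuming a placeholder bucket name, the same listing is trivial from the CLI:

```shell
# Recursively list everything in the bucket; this works for anyone on the
# internet when Storage Object Viewer is granted to allUsers.
gsutil ls -r gs://my-public-bucket
```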

But what happens when you use Storage Legacy Object Reader?

Since the role is much more restrictive, it significantly reduces public visibility: listing the bucket’s contents is denied, but individual objects remain accessible to anyone who knows their URLs.

On the command line, a gsutil listing attempt now fails with an access error, while fetching a known object still succeeds.
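A sketch of that behavior (bucket and file names are placeholders):

```shell
# Listing is now denied for anonymous users...
gsutil ls gs://my-public-bucket
# AccessDeniedException: 403 ...

# ...but fetching an object whose name you already know still works.
gsutil cp gs://my-public-bucket/known-file.pdf .
```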

IAM Best Practices: Always follow the principle of least privilege. Audit your GCS IAM permissions regularly to avoid unnecessary exposure.

A few more tips and tricks for public GCS buckets

https://cloud.google.com/storage/docs/best-practices

Difficult-to-guess bucket and object names are essential, especially for public buckets. Even with listing disabled, attackers can often discover the bucket name itself; without listing, though, they must fall back on guesswork or brute-force scanning to locate individual files, so unpredictable names raise the cost of an attack considerably.
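A cheap way to get hard-to-guess object names is to append a random suffix at upload time. A minimal sketch using openssl (the naming scheme is just an assumption for illustration):

```shell
# 16 random bytes -> 32 hex characters, roughly 128 bits of entropy.
SUFFIX=$(openssl rand -hex 16)
NAME="report-${SUFFIX}.pdf"
echo "${NAME}"
```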

Placing the bucket behind a CDN with a custom domain name does not mitigate this pain point either: it only conceals the bucket name, and the risk of exposing sensitive information remains.

Think about what really needs to be public. Is it the whole bucket? Often you can instead use Signed URLs to share specific objects, controlling both the type of access (read, write, delete) and its duration.
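With gsutil, generating a time-limited signed URL for a single object looks like this (the key file, bucket, and object names are placeholders):

```shell
# Create a V4 signed URL valid for 10 minutes; requires a service-account
# private key with read permission on the object.
gsutil signurl -d 10m service-account.json gs://my-private-bucket/report.pdf
```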

Don’t let misconfigurations expose your data. Secure your storage environment with Vortex’s deep Google Cloud expertise.
