A lightweight way to check the IAM permissions granted on your Google Cloud Storage buckets
Understanding the permissions you’ve granted on your Cloud Storage buckets is important: it helps you avoid inadvertently granting overly broad access. So please take the time to understand IAM roles as they pertain to Cloud Storage by reading this.
The Cloud Storage IAM best practices are listed here, but to validate that you haven’t made a mistake when setting up permissions, a lightweight check can be useful.
That’s what this post is all about. For a framework approach, you may want to check out the extensible Forseti Security (I’ll revisit that in depth at some point).
You can quickly check permissions on a bucket by using a very underrated GCP feature that allows you to try out APIs in the browser.
To do this you need to use two of the API pages:
Buckets: list api docs — this gives you a list of all the buckets in a project, provided you are authenticated and have, as a minimum, the roles/viewer role granted on the project
Now that you have a list of buckets, you can obtain the permissions on each of them using the buckets: getIamPolicy api docs. For this call you need, as a minimum, the storage.buckets.getIamPolicy permission on the project containing the bucket
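If you’d rather make the same two calls outside the browser, here is a minimal sketch using only the Python standard library. It assumes you already have an OAuth2 access token (for example from gcloud auth print-access-token); the function names are mine, not part of any API.

```python
import json
import urllib.request

API_ROOT = "https://www.googleapis.com/storage/v1"

def list_buckets_url(project_id):
    # Buckets: list -- needs at least roles/viewer on the project
    return "%s/b?project=%s" % (API_ROOT, project_id)

def bucket_iam_url(bucket_name):
    # Buckets: getIamPolicy -- needs storage.buckets.getIamPolicy
    return "%s/b/%s/iam" % (API_ROOT, bucket_name)

def get_json(url, access_token):
    """Fetch a Storage JSON API endpoint with an OAuth2 bearer token."""
    req = urllib.request.Request(
        url, headers={"Authorization": "Bearer " + access_token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

You would call get_json(list_buckets_url("my-project-id"), token) once, then get_json(bucket_iam_url(name), token) for each bucket name in the response.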
The permissions on the bucket are returned as JSON (yay!), which for one of my buckets (suitably anonymised output) looked like this:
{
  "kind": "storage#policy",
  "resourceId": "projects/_/buckets/mydemobucket",
  "bindings": [
    {
      "role": "roles/storage.legacyBucketOwner",
      "members": ["projectEditor:my-project-id", "projectOwner:my-project-id"]
    },
    {
      "role": "roles/storage.legacyBucketReader",
      "members": ["projectViewer:my-project-id"]
    },
    {
      "role": "roles/storage.objectViewer",
      "members": ["user:joe@joesworkdomain"]
    }
  ],
  "etag": "CAM="
}
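Once you have the policy JSON back, one quick “too broad?” test is to look for the public principals allUsers and allAuthenticatedUsers in the bindings. A minimal sketch, using a cut-down version of the anonymised policy above:

```python
# A cut-down version of the anonymised policy, as a Python dict
policy = {
    "kind": "storage#policy",
    "resourceId": "projects/_/buckets/mydemobucket",
    "bindings": [
        {"role": "roles/storage.legacyBucketOwner",
         "members": ["projectEditor:my-project-id", "projectOwner:my-project-id"]},
        {"role": "roles/storage.objectViewer",
         "members": ["user:joe@joesworkdomain"]},
    ],
    "etag": "CAM=",
}

# Principals that grant public access -- the overly broad grants to watch for
PUBLIC = {"allUsers", "allAuthenticatedUsers"}

public_roles = [b["role"] for b in policy["bindings"]
                if PUBLIC.intersection(b["members"])]
print(public_roles)  # an empty list means no public grants on this bucket
```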
That’s a bit tedious, though (it’s a separate HTTP request for each bucket), and as I like to automate things and use the easy-to-use client libraries, I wrote a small Python script that you can use as the basis for automatically checking the permissions on the buckets in your project(s). It also gives you a list of the projects you have access to (just because, really).
It writes the permissions for each bucket to a text file, but it can easily be extended to write the values to a Cloud SQL table or BigQuery. (I tend to split the list-projects function out into a separate script, use the project IDs it outputs to set the GOOGLE_CLOUD_PROJECT environment variable, and then call a script that does the bucket checking for that project.)
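That split can be glued together with a few lines of Python. A sketch, where check_buckets.py and the list of project IDs are placeholders for your own scripts and output:

```python
import os
import subprocess

def project_env(project_id):
    """Copy the current environment with GOOGLE_CLOUD_PROJECT set to one project."""
    env = dict(os.environ)
    env["GOOGLE_CLOUD_PROJECT"] = project_id
    return env

def check_all(project_ids, script="check_buckets.py"):
    """Run the bucket-checking script once per project.
    'check_buckets.py' is a placeholder name for your own checker."""
    for project_id in project_ids:
        subprocess.run(["python", script], env=project_env(project_id), check=True)
```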
The script writes a single line for each permission against a bucket, in the format:

<Bucket: BUCKETNAME> : Role: roles/storage.BUCKET-IAM-ROLE, Members: set([u'GROUP or USER:my-project-id', u'GROUP or USER:my-project-id'])

If the same role is granted to additional groups or users (members, in GCP IAM terminology), each entry is included in the set, ", " separated as shown above.
An example (suitably anonymised output) for one of my buckets looked like:
<Bucket: mydemobucket> : Role: roles/storage.legacyBucketOwner, Members: set([u'projectEditor:my-project-id', u'projectOwner:my-project-id'])
<Bucket: mydemobucket> : Role: roles/storage.objectViewer, Members: set([u'user:joe@joesworkdomain'])
<Bucket: mydemobucket> : Role: roles/storage.legacyBucketReader, Members: set([u'projectViewer:my-project-id'])
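If you fork the script to write somewhere other than a text file, a small formatter keeps the one-line-per-role shape. This is a Python 3 variant of my own making (it prints a sorted list rather than the Python 2 set repr above, so the output is deterministic):

```python
def format_binding(bucket_name, role, members):
    """Render one bucket IAM binding as a single line, one line per role."""
    return "<Bucket: %s> : Role: %s, Members: %s" % (
        bucket_name, role, sorted(members))

line = format_binding("mydemobucket", "roles/storage.objectViewer",
                      {"user:joe@joesworkdomain"})
print(line)
# <Bucket: mydemobucket> : Role: roles/storage.objectViewer, Members: ['user:joe@joesworkdomain']
```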
Please note that the output is sensitive, so take appropriate care in how you manage the data and who you give access to it.
You can find my base Python script here. (Yes, I know, no tests, but it’s a really simple script and I wanted to turn this around before I got back to work after my week off.)
The script was inspired by Ric Harvey (@ric_harvey), who posted a handy S3 permission checker to GitHub. Thanks Ric!
Mine doesn’t give you as pretty an output as Ric’s, though; it’s really designed for you to fork and do something more than just write to a text file.

