Google Cloud Storage “downsizer”

Further adventures with Google Cloud Functions

Daz Wilkin
Google Cloud - Community
5 min read · Dec 21, 2017


Nobody stop me… this is too much fun ;-)

App Engine Standard provides an Images service that many customers value and often use to create thumbnails (in Google Cloud Storage). Firebase has a Cloud Functions sample that uses ImageMagick for thumbnails too.

Here’s my version. It uses file streaming, so it should (I haven’t tested this) scale better, and it creates multiple downsized (thumbnail) images. I decided to use Cloud Source Repositories to host my code, which works well, though it would benefit from triggering new Cloud Functions deployments on changes… I’ve filed a feature request (FR).

Setup

I think you may benefit from Google Cloud Platform’s Free (as in beer) Tier because Google Cloud Functions, Google Cloud Storage, and Google Cloud Source Repositories all appear to be included.

Open a bash terminal and get coding:
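The original commands aren’t embedded here, so here’s a minimal sketch of the setup I’d expect; the project ID and bucket prefix (PROJECT, ROOT) are hypothetical and should be replaced with your own:

```shell
# Hypothetical names; substitute your own project ID and prefix.
PROJECT="my-downsizer-project"
ROOT="my-downsizer"

# Two buckets: uploads to the first trigger the function;
# the second receives the generated thumbnails.
gsutil mb -p ${PROJECT} gs://${ROOT}-trigger
gsutil mb -p ${PROJECT} gs://${ROOT}-thumbnails
```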

Cloud Source Repositories (CSR)

CSR is a service that provides hosted, private Git repos. I’m generally lazy and use a regular file system for hosting my code, but keeping track of changes is much easier with Git. Once the CSR repo is created, clone it locally for your code:
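The embedded commands aren’t reproduced here; something like the following should work (the repo name “default” matches the rest of this post; PROJECT is hypothetical):

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own

# Create the repo (a repo named "default" may already exist
# in your project) and clone it locally.
gcloud source repos create default --project=${PROJECT}
gcloud source repos clone default --project=${PROJECT}
cd default
```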

Create index.js:

And package.json:
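Again, the original isn’t embedded; a minimal package.json might look like this (the package version pin is an assumption):

```json
{
  "name": "downsizer",
  "version": "0.0.1",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/storage": "^1.5.0"
  }
}
```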

Open Visual Studio Code (or your preferred editor). Replace the value of ‘root’ (#10) with the value of your bash env variable ${ROOT}. Save the file.

Visual Studio Code includes a Git client, so you may stage your changes and commit from within it. Alternatively, if you prefer the command-line:
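The command-line equivalent is the usual Git trio (the commit message is mine):

```shell
# From within the cloned repo directory:
git add index.js package.json
git commit -m "Add downsizer function"
git push origin master
```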

Any time you change either of these files, repeat the add, commit, and push commands to ensure your changes are reflected correctly in CSR:

Cloud Source Repositories: default

Let’s deploy this code as a Cloud Function

Cloud Functions

Sticking with the command-line, you should be able to:
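The original command isn’t embedded; based on the description that follows, the deployment would look something like this (2017-era beta syntax; PROJECT and ROOT are hypothetical, and the exact flags may have differed):

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own
ROOT="my-downsizer"

# The function is named "downsizer", but the Node.js export is
# "thumbnail" (hence --entry-point). The source URL points at the
# "default" CSR repo on the "master" branch.
gcloud beta functions deploy downsizer \
  --project=${PROJECT} \
  --entry-point=thumbnail \
  --trigger-bucket=${ROOT}-trigger \
  --source=https://source.developers.google.com/projects/${PROJECT}/repos/default/moveable-aliases/master/paths//
```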

You’ll note that I’m mixing up “downsizer” and “thumbnail”; apologies. I started with “thumbnail” but now prefer “downsizer”. It’s more… dramatic. Fortunately, Cloud Functions permits aliasing: we give the Cloud Function the name “downsizer” but tell it that the function exported by the Node.js code is called “thumbnail”. The value of the source flag is specific to ${PROJECT}, to the repo being named “default”, and to our use of the “master” branch for simplicity.

A few minutes later…

Cloud Functions: “downsizer” deployed

Or, if you’d prefer:
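That is, you can confirm the deployment from the command-line rather than the Console (a sketch; PROJECT is hypothetical):

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own

# Describe the deployed function to confirm its status and trigger.
gcloud beta functions describe downsizer --project=${PROJECT}
```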

Testing

Because we created buckets for this project and have yet to use them, they’re both empty:
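For example (ROOT is the hypothetical bucket prefix from Setup):

```shell
ROOT="my-downsizer"  # hypothetical; use your own

# Both listings should return nothing at this point.
gsutil ls gs://${ROOT}-trigger
gsutil ls gs://${ROOT}-thumbnails
```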

Find your favorite image and use gsutil to copy it to the ‘trigger’ bucket. This will trigger the Cloud Function to generate 4 thumbnails of our image in the ‘thumbnails’ bucket. In my example, I’m also moving (renaming) the file to “/2013/road-trip/henry.jpg” to show that the source path is preserved.
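The copy-and-rename might look like this (the local filename and ROOT are hypothetical):

```shell
ROOT="my-downsizer"  # hypothetical; use your own

# Copy the local image, renaming it into a nested path; the
# function preserves this path in the thumbnails bucket.
gsutil cp henry.jpg gs://${ROOT}-trigger/2013/road-trip/henry.jpg
```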

Here’s me with my best buddy:

4000x3000

This image is 4000x3000 pixels and is 1.7 MiB

All being well, you should expect 4 thumbnails to appear in the ‘thumbnails’ bucket. Here’s my result:

Enumerating these using “gsutil ls -l”, I learn that they are 18364, 26154, and 15758 bytes, in list order, which sounds about right. So, let’s confirm:
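The enumeration, sketched (ROOT is hypothetical):

```shell
ROOT="my-downsizer"  # hypothetical; use your own

# -l adds size and timestamp columns to the listing.
gsutil ls -l gs://${ROOT}-thumbnails/2013/road-trip/
```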

The “64x64” thumbnail is actually 64 by 48 because my original image was not square (64/4000 × 3000 = 48).

64x48

And, 256x256 is actually 256 by 192 for the same reason:

256x192

Logs

I’ve become slightly obsessive recently about grepping Stackdriver Logging logs, so I’ll control myself and just give you an example:
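Something like the following; the filter expression is my guess at the command behind the example:

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own

# --freshness limits results to the most recent window (default 1d).
gcloud logging read \
  'resource.type="cloud_function" resource.labels.function_name="downsizer"' \
  --freshness=10m \
  --project=${PROJECT}
```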

NB the “freshness” flag, which I learned of earlier today; in this case, it retrieves just the last 10 minutes’ logs.

Update: 21-Dec-17

I revised the sample to leverage header metadata to specify the desired thumbnail sizes. The code above reflects this. Now, if the object includes a ‘goog-meta-sizes’ header with an array of sizes, these will be used instead of the default set of [“256x256”, “128x128”, “64x64”]:
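Setting that metadata at upload time would look like this (the sizes here are examples; gsutil’s -h flag is the documented way to attach x-goog-meta- custom metadata):

```shell
ROOT="my-downsizer"  # hypothetical; use your own

# The 'sizes' custom metadata arrives on the object as
# metadata.sizes and overrides the default thumbnail sizes.
gsutil -h 'x-goog-meta-sizes:["512x512","32x32"]' \
  cp henry.jpg gs://${ROOT}-trigger/2013/road-trip/henry.jpg
```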

Results in:

NB I find it difficult to track down the specifications for Cloud Functions event data. It is documented here and, for Cloud Storage Objects, here.

Conclusion

This isn’t vastly different from my earlier Exploder post but, absent alternatives from Google or third parties, it’s straightforward to build, deploy, and run Cloud Functions for event-driven processing.

Feedback always welcome!

Tidy-up

You can delete the Cloud Function:
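A sketch, with a hypothetical project ID:

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own

gcloud beta functions delete downsizer --project=${PROJECT}
```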

You may delete the buckets after recursively deleting all their objects. Please be VERY careful using this command:
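With the hypothetical bucket prefix from Setup:

```shell
ROOT="my-downsizer"  # hypothetical; use your own

# -r recursively deletes every object before removing the bucket.
# Triple-check the bucket names before running this!
gsutil rm -r gs://${ROOT}-trigger
gsutil rm -r gs://${ROOT}-thumbnails
```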

Alternatively you can simply delete the project which will delete everything within it too:
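Again with a hypothetical project ID:

```shell
PROJECT="my-downsizer-project"  # hypothetical; use your own

# Deletes the project and everything in it, after a grace period.
gcloud projects delete ${PROJECT}
```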

Thanks!
