Deploying to Google Cloud: From single function to monorepo

Eduardo Lanchares
UmamiTech Blog

--

At Umamicart we use Google Cloud Functions for multiple projects: from internal tools, to handling the fulfillment process, to our own carrier service for calculating shipping rates.

Most of them are implemented as serverless functions, which has allowed us to forget about managing the infrastructure ourselves.

Over time, more and more functions were added to the project, and eventually things started getting more complicated.

Background

We initially had each function in its own repository, which was great in the beginning. But as new functions were added, we started to have common logic shared across all the projects. While that approach is perfectly fine, it led to different versions of the same logic floating around, and updating them was cumbersome.

So we decided to search for other solutions to reuse shared modules in our projects.

Our requirements

We wanted to achieve the following:

  • Easy maintenance of our shared modules
  • Simple development with no build steps or custom scripts
  • Able to deploy multiple functions at once

We tried a few options and finally reached an acceptable one. Here’s a rundown of our findings.

First attempt: sharing modules in a separate directory

The idea was to put every project in its own folder and have a commons directory where we put shared logic, something like this:

.
├── my-function
│   ├── index.js
│   └── package.json
├── other-function
│   ├── index.js
│   └── package.json
└── commons
    └── shared-module.js

This was our initial approach but it had a major drawback: it just didn’t work when we deployed to the cloud.

It turns out that GCF requires that the root directory have the following structure in order to work:

.
├── index.js
└── package.json

And that was not our case 😫. We wanted a root directory with clearly separated functions and still get deploys to work. We found two ways to do it.

Deploy only the function’s directory

We initially tried deploying only the function’s directory, my-function. But the commons directory could not be found, since it wasn’t uploaded:

// my-function/index.js

const functions = require("@google-cloud/functions-framework");
// This will fail as commons directory is not found
const { sayFoo } = require("../commons/shared-module");

functions.http("myFunction", (req, res) => {
  res.send(sayFoo());
});
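For reference, deploying only that directory essentially means pointing gcloud at it, for example via the --source flag (a sketch with illustrative values; a first-time deploy also needs --runtime):

gcloud functions deploy my-function \
  --entry-point myFunction \
  --source my-function \
  --trigger-http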

To overcome this you can add a build step that copies the commons directory and rewrites the paths so everything works as expected. If you are curious, check out how the generate-package-json-webpack-plugin project does it (see References).

Deploy root directory

As we have to conform to the GCF directory structure, this approach requires some extra steps:

  • We need to create an index.js file at the root pointing to the function’s main file (see the sketch below)
  • Prepare a root package.json with the function’s dependencies
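For illustration, the root index.js in this approach can simply re-export the function’s module; the path below is just an example, not our actual setup:

// index.js (repository root)

// Re-export the function's entry point so GCF can find it at the root
module.exports = require("./my-function");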

Though this can be a simple solution, it also requires some automation that didn’t feel right for our needs.

✅ Easy maintenance of our shared modules

❌ Simple development with no build steps or custom scripts

❌ Able to deploy multiple functions at once

Final approach: migrating to a monorepo

After considering different options, we found that a monorepo was a perfect fit for us since it allowed us to have:

  • Code sharing. Being able to easily reuse shared logic in the form of a package. This solved our main requirement
  • Flexible development cycle. No need for package publishing
  • Function isolation. Each function’s dependencies are defined explicitly in its own package.json

After some changes this was our project structure:

.
├── functions
│   ├── my-function
│   │   ├── index.js
│   │   └── package.json
│   └── other-function
│       ├── index.js
│       └── package.json
├── packages
│   └── common-lib
│       ├── index.js
│       └── package.json
├── index.js
└── package.json

We used Yarn workspaces, since Yarn is the package manager we’re already used to, so this monorepo capability came out of the box.
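For reference, a minimal root package.json for this layout could look like the following; apart from the workspaces globs, the fields are illustrative:

// package.json (repository root)

{
  "name": "umamicart-functions",
  "private": true,
  "main": "index.js",
  "workspaces": ["functions/*", "packages/*"]
}

The shared package just needs a name that the functions can depend on:

// packages/common-lib/package.json

{
  "name": "@umamicart/common-lib",
  "version": "1.0.0",
  "main": "index.js"
}

Each function then lists @umamicart/common-lib among the dependencies in its own package.json.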

With this change we were able to use our shared modules like this:

// functions/my-function/index.js

const functions = require("@google-cloud/functions-framework");
const { sayFoo } = require("@umamicart/common-lib");

functions.http("myFunction", (req, res) => {
  res.send(sayFoo());
});

Yay 😃! Everything worked great in development. Our development cycle was the same as always, and if we ever had to fix common-lib, the rest of the functions would benefit from that change without us doing anything special (besides redeploying them).

But there was still an open question: how do you deploy a single function from a monorepo?

After searching a bit, we found two possible options:

  • Publish our shared packages and deploy each function individually. The ideal solution, but we didn’t want to mess with package publishing.
  • Deploy the whole monorepo and load only the required function dynamically. Not ideal, because every function’s dependencies have to be installed.

Since installing all packages was not an issue for us, we ended up choosing the latter.

The key was to prepare the root index.js file so that only the deployed function is required. We tried different options and finally settled on a dynamic require:

// index.js

// Load only the function selected at deploy time via the FUNCTION_HANDLER env var
module.exports = require(process.env.FUNCTION_HANDLER);

// functions/my-function/index.js

const functions = require("@google-cloud/functions-framework");
const { sayFoo } = require("@umamicart/common-lib");

function myFunctionHandler(req, res) {
  res.send(sayFoo());
}

// Register the handler with the Functions Framework
functions.http("myFunction", myFunctionHandler);

// Also export it so the root index.js exposes it under the right name
exports.myFunction = myFunctionHandler;

And then specify it in the deploy command like this:

gcloud functions deploy my-function \
  --entry-point myFunction \
  --set-env-vars FUNCTION_HANDLER=my-function \
  --trigger-http \
  --allow-unauthenticated

This seems a bit hacky but it works™️. The good thing is that if the FUNCTION_HANDLER variable is not specified, the function fails to load, so the deployment is rejected and Google Cloud keeps the current version running. So it’s kind of “safe” to do it. It’s also possible to enforce the use of this environment variable using a linting rule.
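For example, a small guard in the root index.js would make that failure explicit; this is just a sketch of the idea, not something we actually shipped:

// index.js

// Fail fast with a clear message instead of a cryptic require() error
if (!process.env.FUNCTION_HANDLER) {
  throw new Error("FUNCTION_HANDLER must be set to the function being deployed");
}

module.exports = require(process.env.FUNCTION_HANDLER);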

This approach allowed us to deploy multiple functions at once with a single command (thanks to Yarn workspaces).
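We haven’t shown those scripts here, but one way to wire this up is to give each function a deploy script in its own package.json (the names below are illustrative):

// functions/my-function/package.json

{
  "name": "my-function",
  "scripts": {
    "deploy": "gcloud functions deploy my-function --entry-point myFunction --set-env-vars FUNCTION_HANDLER=my-function --trigger-http --allow-unauthenticated"
  }
}

Then, from the repository root, a single command runs the script across workspaces (with Yarn 1 it runs in every workspace, so shared packages may need a placeholder deploy script):

yarn workspaces run deploy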

So we ended up meeting our initial requirements:

✅ Easy maintenance of our shared modules

✅ Simple development with no build steps or custom scripts

✅ Able to deploy multiple functions at once

Conclusion

This was our journey trying to figure out the best way to deploy to GCF. As always, every project has its own requirements/constraints. There are no right answers, so any of the above-mentioned options may be right for your use case.

When choosing between the approaches you have to consider:

  • How often are the shared modules updated?
  • How many projects are using the shared modules?

Depending on the answers you may be just fine with copy-pasting the shared modules, or you may need to go all the way with a monorepo.

Hope this helps!

References

https://github.com/lostpebble/generate-package-json-webpack-plugin

https://dev.to/czystyl/google-cloud-functions-in-monorepo-44ak
