How we use Go Swagger at Phrasee

Simone Trubian
Published in Phrasee
7 min read · Jul 12, 2018

At Phrasee we care about our product and the experience our users have when using our platform. Phrasee is a web-based platform that allows our users to generate subject lines for marketing purposes. Those carefully generated and optimised subject lines, however, would be useless unless sent to our clients’ customers. From early on we realised that our users enjoyed using our platform but often had problems managing email sends in their Email Service Providers (ESPs). Phrasee requires our users to run a test for each marketing campaign, and learns from the data that is fed back. Those two actions (creating a multivariate send and feeding the results back to our platform) often force our clients to jump through hoops in their ESP, thus adding overhead to their normal routine.

Some time ago we started an ongoing effort to integrate our platform with the biggest (and not so big) ESPs so that our users could get the best out of Phrasee whilst enjoying a better experience. To achieve this we envisioned a service that would connect to each client’s ESP of choice, automatically create email multivariate tests, schedule the sends and retrieve the send results.

Because this service has to connect to many different ESPs, managing all those 3rd party APIs is a critical part of the project itself. From previous experience we went right away for one of the main standards for defining and documenting REST APIs: Swagger (now known as OpenAPI). For the actual service we decided on an implementation in Go, a language invented by Google that has a sweet spot for this kind of application. Its first-class networking standard library and language primitives made it a great fit to tackle our operational and architectural challenges. We were already happy with our choices, but then we discovered a little command line tool that takes an OpenAPI specification and automatically generates a fully-fledged Go server or client. Enter go-swagger!

Go Swagger workflow

Go Swagger comes as a Command Line Interface (CLI) tool. Despite not having many commands, it has the ones needed to manage the lifecycle of a specification document. Here at Phrasee we use the CLI in a simple but effective workflow. First, whenever needed, we use it to initialise a new spec; then we start the edit/validate/generate/compile cycle. We make a small edit to the specification file and run the tool with the “validate” command to check it against the OpenAPI standard. If there are errors we go back to editing until all issues are resolved. Then we run the tool with the “generate client” or “generate server” commands to automatically produce code.
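
As a rough sketch of that cycle, the commands look something like this (the file and application names are just examples, and flag spellings may vary between go-swagger versions):

```sh
# Initialise a skeleton specification to start editing from
swagger init spec --title "ESP connector"

# Validate the spec against the OpenAPI standard after every edit
swagger validate ./swagger.yml

# Once the spec validates, generate server or client code
swagger generate server -f ./swagger.yml -A esp-connector
swagger generate client -f ./swagger.yml -A esp-connector
```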

The “generate” commands are quite powerful and accept many parameters. This allows a user to be very specific about which parts of the specification file(s) to use for a round of generation. By using these features one can logically segment a big specification file using the “tags” and “operation” fields, and by feeding those values to the generate command parameters it is possible to generate only those specific parts of the specification, as in the example below. This is where the interplay between API specification and code generation gets tricky and artful. Initially we overlooked this feature, and when we discovered it bit by bit we found it a little daunting, but as it turns out it grants the developer a lot of power and granularity, so it is well worth exploring.
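
For example, assuming the spec tags its campaign-related endpoints with a campaigns tag and names one of its operations getCampaign, a partial regeneration might look like this (again, flag spellings may differ between go-swagger versions):

```sh
# Regenerate only the operations carrying the "campaigns" tag
swagger generate server -f ./swagger.yml -A esp-connector --tags=campaigns

# Regenerate a single operation by its operationId
swagger generate server -f ./swagger.yml -A esp-connector --operation=getCampaign
```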

When we’re happy with what we generated we compile the code right away and start wiring up the new code with the existing codebase. We strive to keep this loop tight by making small incremental changes and testing them right away.

Phase 1: use swagger to spec the server

The “generate server” command generates a server able to serve the defined protocols (HTTP or HTTPS), all the models, and a handler function stub for each specified endpoint. On top of that, Go Swagger also generates validation functions for all endpoints. Of course the developer is still entrusted with writing the actual business logic of the API, but as far as the humdrum job of writing boilerplate goes, we now let go-swagger take care of it.
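
To give a feel for the shape of the generated code, here is roughly what plugging business logic into a generated handler stub looks like for a hypothetical getCampaign operation. The package paths, type names and fields all come from the spec and the generation flags, and the wiring normally lives in the configure file go-swagger generates, so treat this as a sketch rather than our actual code:

```go
package restapi

import (
	"github.com/go-openapi/runtime/middleware"

	"example.com/esp-connector/models"
	"example.com/esp-connector/restapi/operations"
)

// registerHandlers shows the wiring pattern: every operation gets a typed
// handler, typed parameters and typed responders. Routing, (de)serialisation
// and validation are all taken care of by the generated code.
func registerHandlers(api *operations.EspConnectorAPI) {
	api.GetCampaignHandler = operations.GetCampaignHandlerFunc(
		func(params operations.GetCampaignParams) middleware.Responder {
			// params.ID has already been parsed and validated for us.
			campaign := &models.Campaign{ID: params.ID}
			return operations.NewGetCampaignOK().WithPayload(campaign)
		})
}
```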

A feature we particularly like about Go Swagger is that most generated code is wrapped in data types. So, for instance, a handler function, its input parameters and its replies are all wrapped in data types. This goes a long way in allowing the compiler to ensure that the generated code is used in the right way.

For instance, in an API with endpoints A and B, trying to pass the request parameters of A to the handler of B results in a compilation error. The same happens when trying to pass the reply data of B to the reply function of A. A caveat here is that all reply data types implement a common interface, so it is still technically possible to use the wrong reply function.
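
In other words, the compiler catches mixed-up parameters but not mixed-up replies, because every generated responder satisfies the same middleware.Responder interface. Continuing the hypothetical sketch above:

```go
// This compiles: both responders implement middleware.Responder, even
// though we are answering the getCampaign endpoint with getResults data.
api.GetCampaignHandler = operations.GetCampaignHandlerFunc(
	func(params operations.GetCampaignParams) middleware.Responder {
		return operations.NewGetResultsOK() // wrong reply, no compile error
	})
```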

Despite these shortcomings, using the CLI tool is bliss: being able to change the API just by editing the spec file is convenient and safe. It sped up setting up the server in the first place, and even now that the API is a lot more mature we still make good use of the “generate server” command to refactor and modify the server API.

Phase 2: use swagger to spec our internal client

The go-swagger tool serves us well for managing our server, but that needn’t be the only thing it is useful for. The CLI is capable of generating not just servers but also clients. Our microservice exposes a server API, but it often has to behave as a client too. We saw the value of this feature and started generating client code for other internal services right away.

Generating client code is almost the same as generating server code. The “generate client” command accepts almost the same parameters as “generate server”, and we use the same edit-validate-generate-compile workflow we use for server specs.

The only caveat is that in writing a client specification one is actually specifying the API of the server the client is meant to contact. Also, Go Swagger will generate all the data types but will not generate an executable, as you probably don’t need one anyway.
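
Using a generated client then looks roughly like this; the host, base path and operation names are illustrative, and the exact field through which operations are grouped depends on the tags in the spec:

```go
package main

import (
	"fmt"
	"log"

	httptransport "github.com/go-openapi/runtime/client"
	"github.com/go-openapi/strfmt"

	apiclient "example.com/esp-connector/client"
	"example.com/esp-connector/client/operations"
)

func main() {
	// Point the generated client at the server described by the spec.
	transport := httptransport.New("internal-service:8080", "/api/v1", []string{"http"})
	c := apiclient.New(transport, strfmt.Default)

	// Every operation gets a typed parameter builder and a typed response.
	params := operations.NewGetCampaignParams().WithID("abc-123")
	resp, err := c.Operations.GetCampaign(params)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Payload)
}
```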

Phase 3: use swagger to spec all models (a step too far)

As mentioned previously, the go-swagger tool automatically generates all the models used by the APIs. The tool puts all models in a models/ directory, neatly packaged together and ready to import from the same path. The temptation here was too big to resist, so we started using the tool to generate the models used for internal representations as well as the ones exchanged with the outside world. Although initially we enjoyed letting Go Swagger do the work for us, we soon realised that letting it generate our data types meant having less control over those types. For one thing, we could not define our own serialisation tags, but the real deal breaker was not being able to have polymorphic data types that contained interfaces. The latter greatly reduced our ability to streamline our code and make full use of the language’s features, so ultimately we decided to do “the right thing” and use go-swagger to manage only the API models.
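
As an illustration of the kind of type we wanted but could not generate, here is a hand-written internal model (the names are made up for the example): it carries our own serialisation tags and an interface-typed field that each ESP integration satisfies with its own concrete type.

```go
package internal

// ESPPayload is implemented by each provider-specific payload type.
type ESPPayload interface {
	Validate() error
}

// Send is an internal representation we keep hand-written: a generated
// model would not let us pick the struct tags or hold an interface.
type Send struct {
	ID       string     `json:"id" db:"send_id"`
	Campaign string     `json:"campaign" db:"campaign_id"`
	Payload  ESPPayload `json:"-"` // provider-specific, never serialised directly
}
```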

Phase 4: use swagger to write all clients and test servers

Our microservice has to connect to several 3rd party services that, although very different in implementation, logically all do the same thing. To maintain sanity and consistency we decided to package all 3rd party API clients and their business logic in Go packages. All those packages implement an interface we defined, which hides all the implementation details from the main server.
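
The interface itself can stay small; something along these lines (method names and signatures are illustrative, not our exact definition):

```go
package esp

import (
	"context"
	"time"
)

// Connector is the contract every ESP integration package implements.
// The main server only ever talks to this interface; the provider-specific
// API calls stay hidden inside each package.
type Connector interface {
	CreateMultivariateTest(ctx context.Context, campaignID string, variants []string) (testID string, err error)
	ScheduleSend(ctx context.Context, testID string, at time.Time) error
	FetchResults(ctx context.Context, testID string) (map[string]float64, error)
}
```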

One of the issues we have always faced with the consistent-interface approach is that changing the behaviour of the interface can potentially change one of the implementations in unexpected ways. The simple answer to this problem is to ask the QA people to run extensive regression tests on all the integrated platforms. It’s easy to see that this is not scalable, so coming up with automated integration testing became a pressing matter.

After using swagger to generate the spec for the internal client, it dawned on us that connecting to an internal service is no different from connecting to a 3rd party service. With that in mind, we realised we could just as well use the tool to generate the 3rd party clients.

There are a few reasons to do this, but initially we only saw one: generating a single, convenient client that manages all connections. With time we realised that we could just run the `generate server` command on the specification of a client to generate a mock server for testing purposes. Once the skeleton is created, the developer can implement trivial handler functions that return constants. Simple as they are, these stubs go a long way in creating a full end-to-end testing environment that cuts down development time even further.
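
A stub handler in such a mock server can be as small as this (operation and model names are hypothetical, in the same spirit as the earlier sketches):

```go
// In the mock server generated from a 3rd party client spec, each handler
// just returns a canned response, so end-to-end tests can run without
// ever touching the real ESP.
api.GetResultsHandler = operations.GetResultsHandlerFunc(
	func(params operations.GetResultsParams) middleware.Responder {
		return operations.NewGetResultsOK().WithPayload(&models.Results{
			Opens:  42,
			Clicks: 7,
		})
	})
```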

Conclusions

We started using Go Swagger as a tool to improve our team’s productivity, and in that respect it is constantly delivering. The productivity gains are not just a matter of having less code to write ourselves. The real benefit comes from using a tool that enforces good practices and reduces our room for error. Having a self-documented system means all API users know where to get up-to-date information when they need it. And Go Swagger’s clever use of data types makes developing and refactoring the codebase easier and quicker.
