
Getting Started with gRPC and Golang

Tom Jordi Ruesch
7 min read · Nov 6, 2021


REST is awesome. It is a great protocol for providing clients and end users with a simple and flexible interface for a wide range of use cases. However, REST might not be the optimal approach when designing and implementing service-to-service APIs because of its relative slowness and its lack of bidirectional communication.

In this article we will install all the tools needed to get started with gRPC and Go, and learn how to read and write .proto files. In the next article we will look at how to actually design, build and test your API with Go.

If your programming language of choice is not Go, I promise you will still find this article helpful for getting started with gRPC, as the technology is completely language- and platform-independent and we will not write a single line of Go in this article.

Why (when) should you care?

gRPC is a modern open source high performance Remote Procedure Call (RPC) framework that can run in any environment.

When creating a microservice-based infrastructure you need to make sure that the communication between those microservices is stable, reliable and fast. While REST’s simplicity and flexibility are awesome for creating public APIs, it was not intended for high-volume service-to-service communication. REST is easier to implement and quicker to develop against, but gRPC is typically somewhere between 7 and 10 times faster.

gRPC is built with performance and platform-independence in mind. It enables backend developers to take advantage of exciting technologies such as HTTP/2, bidirectional streaming and protocol buffers.

Instead of text-based message formats such as JSON (which is used by REST APIs), gRPC uses Protocol Buffers. Protocol Buffers (protobuf for short) were developed at Google and are a language- and platform-neutral mechanism for serializing structured data. They are a very efficient way of encoding data because the encoded message uses only as many bytes as necessary to hold the given value. I will not go into the details of how the encoding works, but if you want to read more, the official documentation is an excellent resource.

If you are thinking about how to structure your microservice backend or any internal services, and how these services can communicate with each other, gRPC is a very interesting option to consider.


Tooling

As we already know, Protocol Buffers are language- and platform-agnostic. That means we define our protobufs in a dedicated schema language and then generate the code to use in our API. The protobuf compiler protoc supports most of the major programming languages, some of them, such as Go, through plugins.

So let’s install the compiler and the Go plugins.

On Linux (at least on distros with apt available) you can install the toolset via

$ apt install -y protobuf-compiler

For macOS, the equivalent via Homebrew would be

$ brew install protobuf

Check that the compiler version is 3+ with protoc --version.

If the installation via one of the package managers mentioned above fails, or your platform does not support installation via apt or brew, you can also install the protobuf toolset from precompiled binaries or from source.

Since I am planning to write my API in Go, I will need some additional plugins as the compiler does not know yet how to generate Go code. These tools are installed via

$ go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.27
$ go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@v1.1

protoc-gen-go generates Go code for both the proto2 and proto3 versions of the protocol buffer language. protoc-gen-go-grpc creates the language-specific bindings for gRPC. In earlier versions of the plugin, the gRPC code generation was part of protoc-gen-go, but Google decided to encapsulate that functionality in a separate plugin.

Sidenote:

Be sure that your $GOBIN (i.e. $GOPATH/bin) is in your $PATH. If not, here is your one-liner to copy and paste.

$ export PATH="$PATH:$(go env GOPATH)/bin"

Writing our Protocol Buffers

We will be creating a translation service that takes a text along with a source and target language and returns the translated text along with the language information and the number of characters that are billed for the translation.

The proto file for that API looks something like this.
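A sketch of what translations.proto could contain, based on the service described above (the field names, the enum values and the go_package value are my assumptions, not the author’s original file):

```proto
syntax = "proto3";

package translations;

// Assumed value; produces the output layout shown later in the article.
option go_package = "./translations";

// The set of supported languages (concrete values are placeholders).
enum Languages {
  EN = 0;
  DE = 1;
  FR = 2;
}

// The text to translate plus source and target language.
message TranslationInput {
  string text = 1;
  Languages source_language = 2;
  Languages target_language = 3;
}

// The translated text, the language information and the billed characters.
message TranslationOutput {
  string text = 1;
  Languages source_language = 2;
  Languages target_language = 3;
  int32 billed_characters = 4;
}

// The translation service with its single RPC method.
service Translation {
  rpc Translate(TranslationInput) returns (TranslationOutput);
}
```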

I will not go into too much depth about what is going on in that proto file, because there is an excellent resource from Google out there where Protocol Buffers and the language syntax are explained in great depth. I will, however, discuss the basic structure of protocol buffers.

First, let’s take a look at the message type.
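As a sketch (the field names here are my reconstruction), the request message could be declared like this:

```proto
message TranslationInput {
  string text = 1;               // the text to translate
  Languages source_language = 2; // the language of the input text
  Languages target_language = 3; // the language to translate into
}
```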

You might think of a message as a unit or container that is loaded with information, packed to binary and sent over the wire, where it is unpacked and read. For each field, you must specify the field type, its name and the field number. So far, so clear.

Wait, what the heck is a field number? Each field is assigned a unique number that is used to identify the field in the binary format of the message. For backward compatibility it is important not to change these numbers once the message is in use.
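If you ever do retire a field, protobuf lets you mark its number (and name) as reserved so it cannot be accidentally reused later. A minimal sketch, not part of our translation service:

```proto
message Example {
  reserved 2, 5;        // field numbers that must never be reused
  reserved "old_field"; // the old field name can be reserved too
  string text = 1;
}
```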

Geek tip: For optimized performance, you should use the numbers 1 through 15 for very frequently occurring message fields. These numbers, together with the field type, are converted to a single byte in binary format. Field numbers in the range 16 through 2047 take two bytes. Remember also to reserve some of those ultra-efficient small numbers for potential new fields in the future, as you cannot (should not) change the field number of an element.

Besides the standard field types, such as string, int32 or float, you can define enumerations if you want your field to take only a specific set of values such as we did with the Languages enumeration.
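Such an enumeration might look like this (the concrete language values are my assumption):

```proto
enum Languages {
  EN = 0; // proto3 requires the first enum value to be zero
  DE = 1;
  FR = 2;
}
```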

So far our protocol buffers are not specifically designed for gRPC; they are just a way of describing data. For our gRPC API, however, we not only need data but also functionality, or services.

The service keyword defines a piece of logic at a high level, such that the compiler knows which interfaces to create for you. In our case we create an rpc service called Translation that has a Translate method, which takes a TranslationInput and returns a TranslationOutput.
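In sketch form, that service definition is only a few lines:

```proto
service Translation {
  rpc Translate(TranslationInput) returns (TranslationOutput);
}
```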


Generating Code

This is the moment of truth. We installed our tooling and have written a simple enough proto file for our gRPC API. Now we let the compiler do its magic and generate the Go code for us. If you are reading this with a different programming language in mind, this is the part where you could find the workflow for your specific language to be slightly different. Take a look at the official documentation for recommendations on your specific language.

For the last time in this article, let’s dig in, shall we?

We want to tell the protobuf compiler to take our translations.proto file and compile it to some nice and clean Go code for us to use in our translation API. For this task we will use the protoc command line tool which we installed earlier in this article. But first, let’s take a look at the file structure of this project.

grpc_getting_started
├── main.go
└── protos
    └── translations.proto

I have created a folder called protos, where I put all my proto files and the generated code. While it is not necessary to follow this design, it is a good idea to store all the code and .proto files in one place.

I want the compiler to put the generated Go code into a directory called translations within my protos folder. I use the following command to do so.

$ protoc \
--go-grpc_out=protos \
--go_out=protos \
protos/translations.proto

The --go-grpc_out and --go_out flags tell the protoc compiler to use the go-grpc and go plugins and where to put their outputs. As the final argument we pass the proto file for the compiler to consume.

After running the command above, your repository should look like this.

grpc_getting_started
├── main.go
└── protos
    ├── translations.proto
    └── translations
        ├── translations_grpc.pb.go
        └── translations.pb.go

For those of you wondering how the compiler knew to put the code in this new translations folder: take another look at the translations.proto file, where you will find an option called go_package which points to the subdirectory translations of the directory in which the proto file resides.
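The exact value in the file is not shown here, but a relative path along these lines (my assumption) would produce the protos/translations output directory when compiling with --go_out=protos:

```proto
option go_package = "./translations";
```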

That’s it! We have understood the basics of gRPC and Protocol Buffers, downloaded the required toolset, written our first proto file and compiled it to Go code. After all of these steps, we deserve a little break.

In the next article we will take a look at how to use the generated code to actually build our gRPC API in Go with a few lines of code.

Thank you so much for reading and see you in the next article!

If this article has helped you or sparked your interest in gRPC or Go, please do me a BIG favor and leave me a clap. This is my very first article on Medium and I would much appreciate your feedback and support.
