gRPC and why it can save you development time

Napon Mekavuthikul · Published in Red Crane · Jul 22, 2019

By now you have probably heard of “gRPC” (at the very least, in the title of this post). In this post, I’ll focus on the benefits of adopting gRPC as the communication medium between micro-services.

First, I’ll give a brief history of architecture evolution. Second, I’ll cover REST (as the medium) and the problems that can arise. Third, gRPC kicks in. Finally, I’ll walk through an example of my development workflow.

Brief history of architecture evolution

This section lists and discusses the pros and cons of each architecture (focusing on web-based applications).

Monolith

Everything in one package.

pros:

  • Easy to start
  • Single code base for everything

cons:

  • Hard to scale individual parts
  • Heavy server load (server-side rendering)
  • Poor user experience (long loading times)
  • Hard to scale the development team
[Figure: Monolith architecture]
[Figure: Inside a monolith architecture]

Monolith v2 (Frontend-Backend)

Clear separation between front-end logic and back-end logic. The back end is still a monolith.

pros:

  • Can separate teams into front-end and back-end
  • Better user experience (front-end logic moves to the client side)

cons:

  • [Still] Hard to scale individual parts
  • [Still] Hard to scale the development team
[Figure: Frontend-Backend architecture]

Micro-service

One service (package) per concern. Services communicate with each other over the network.

pros:

  • Scalable components
  • Scalable team
  • Flexible language choice (if a standardized communication medium is used)
  • Deploy/fix each package independently

cons:

  • Introduces network problems (latency between communicating services)
  • Requires documentation and agreement on how services communicate
  • Hard to identify bugs if a shared database is used
[Figure: Micro-service architecture with a shared database]
[Figure: Micro-service architecture with a standalone database per service]

REST (as the medium) and problems that can arise

REST (JSON over HTTP) is currently the most popular way of communicating between services, thanks to its ease of use. Using REST gives you the flexibility to use any language for each service.

[Figure: Typical REST call]

However, that flexibility comes with pitfalls. REST requires a very strict agreement between developers. The sketch below demonstrates a very common scenario during development.

[Figure: Developer A asks Developer B to build a service; a bad request; expectation vs. actual]

Problems:

  • Relies on human agreement
  • Relies on documentation (which requires maintenance and updates)
  • Requires a lot of formatting and parsing to and from the agreed format, on both sides (see the sketch after this list)
  • Most development time is spent on agreeing and formatting, not business logic
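
To make the first two problems concrete, here is a minimal Go sketch (all type and field names are hypothetical) of how such an agreement drifts silently: the server and the client disagree on a JSON field name, everything compiles and runs, and the bug only shows up as an empty value at runtime.

package main

import (
	"encoding/json"
	"fmt"
)

// Developer B's service responds with this shape.
type serverTask struct {
	TaskName string `json:"task_name"`
	Status   string `json:"status"`
}

// Developer A read an outdated doc and expects "taskName" instead.
type clientTask struct {
	TaskName string `json:"taskName"`
	Status   string `json:"status"`
}

func main() {
	body, _ := json.Marshal(serverTask{TaskName: "write report", Status: "todo"})

	var got clientTask
	// Decoding succeeds with no error; the mismatched field is silently empty.
	if err := json.Unmarshal(body, &got); err != nil {
		panic(err)
	}
	fmt.Printf("TaskName=%q Status=%q\n", got.TaskName, got.Status)
	// prints: TaskName="" Status="todo"
}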

gRPC kicks in

gRPC is a modern open source high performance RPC framework that can run in any environment.

What is RPC? RPC stands for remote procedure call. It is a protocol that lets one program request a service from a program located on another computer on a network, without having to understand the network’s details.

[Figure: Remote procedure call]

RPC with REST as the medium

Using an RPC client/library provided by the creator of a service ensures correctness when calling that service. If we want to use RPC with REST as the medium, developer B has to write the client code for developer A to use. If the two developers use different languages, this is a major problem for developer B, because he needs to write an RPC client in a language he is not accustomed to. And if other services also need to use service B, developer B has to spend a lot of time writing RPC clients in different languages, and then maintain all of them.
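
As a sketch of the burden this puts on developer B, here is roughly what a hand-written Go client wrapper for one REST endpoint looks like (the endpoint path and type names are hypothetical). Developer B would have to re-implement and maintain the equivalent of this in every caller’s language.

package taskclient

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type Summary struct {
	TodoTasks  int `json:"todoTasks"`
	DoingTasks int `json:"doingTasks"`
	DoneTasks  int `json:"doneTasks"`
}

// GetSummary hides the HTTP details from the caller, but every path and
// field name in here is a manual agreement that can silently drift.
func GetSummary(baseURL, employee string) (*Summary, error) {
	resp, err := http.Get(fmt.Sprintf("%s/employees/%s/summary", baseURL, employee))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var s Summary
	if err := json.NewDecoder(resp.Body).Decode(&s); err != nil {
		return nil, err
	}
	return &s, nil
}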

Protobuf?

Protocol buffers are Google’s language-neutral, platform-neutral, extensible mechanism for serializing structured data. gRPC uses protobuf as the language for defining data structures and services. You can think of a proto file as strict documentation for a REST service: the protobuf syntax is strict enough that a machine can compile it.

The code block below is a simple proto file describing a todo service, along with the data structures used for communication.

  • the “message” keyword defines a data structure
  • the “service” keyword defines a service
  • the “rpc” keyword defines a function of a service

syntax = "proto3";package gogrpcspec;message Employee {
string name = 1;
}
message Task {
Employee employee = 1;
string name = 2;
string status = 3;
}
message Summary {
int32 todoTasks = 1;
int32 doingTasks = 2;
int32 doneTasks = 3;
}
message SpecificSummary {
Employee employee = 1;
Summary summary = 2;
}
service TaskManager {
rpc GetSummary(Employee) returns (SpecificSummary) {}
rpc AddTask(Task) returns (SpecificSummary) {}
rpc AddTasks(stream Task) returns(Summary) {}
rpc GetTasks(Employee) returns (stream Task) {}
rpc ChangeToDone(stream Task) returns (stream Task) {}
}

Compiling proto to Server code

Since protobuf is very strict, we can use “protoc” to compile the proto file into server code. After compiling, you’ll need to implement the real logic behind each rpc.

protoc --go_out=plugins=grpc:. $(pwd)/proto/*.proto \
    --proto_path=$(pwd)
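
To give an idea of what “implementing the real logic” means, here is a minimal Go server sketch against the generated code (assuming it is imported as pb from the example repo). Only the unary GetSummary carries real logic here; the streaming methods are stubbed out so the type satisfies the generated interface.

package main

import (
	"context"
	"log"
	"net"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"

	pb "github.com/redcranetech/grpcspec-example/gogrpcspec"
)

// taskServer implements the TaskManagerServer interface generated by protoc.
type taskServer struct {
	summaries map[string]*pb.Summary // in-memory store, just for the sketch
}

func (s *taskServer) GetSummary(ctx context.Context, e *pb.Employee) (*pb.SpecificSummary, error) {
	sum, ok := s.summaries[e.Name]
	if !ok {
		sum = &pb.Summary{}
	}
	return &pb.SpecificSummary{Employee: e, Summary: sum}, nil
}

// Streaming methods stubbed for brevity.
func (s *taskServer) AddTask(ctx context.Context, t *pb.Task) (*pb.SpecificSummary, error) {
	return nil, status.Error(codes.Unimplemented, "not implemented")
}
func (s *taskServer) AddTasks(stream pb.TaskManager_AddTasksServer) error {
	return status.Error(codes.Unimplemented, "not implemented")
}
func (s *taskServer) GetTasks(e *pb.Employee, stream pb.TaskManager_GetTasksServer) error {
	return status.Error(codes.Unimplemented, "not implemented")
}
func (s *taskServer) ChangeToDone(stream pb.TaskManager_ChangeToDoneServer) error {
	return status.Error(codes.Unimplemented, "not implemented")
}

func main() {
	lis, err := net.Listen("tcp", ":50051")
	if err != nil {
		log.Fatal(err)
	}
	s := grpc.NewServer()
	pb.RegisterTaskManagerServer(s, &taskServer{summaries: map[string]*pb.Summary{}})
	log.Fatal(s.Serve(lis))
}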

Compiling proto to Client code

With the same proto file, we can use “protoc” to compile client code in many popular languages: C#, C++, Dart, Go, Java, JavaScript, Objective-C, PHP, Python, Ruby, etc.

gRPC rpc types

gRPC supports multiple RPC types (I’m not going to cover them in depth in this article; the sketch after this list shows the client-side signatures generated for each):

  • Unary RPC (Request-Response)
  • Client streaming RPC
  • Server streaming RPC
  • Bidirectional streaming RPC
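
To see how these map onto generated code, this is roughly the client-side Go interface that protoc produces for the TaskManager service above (reproduced for illustration): a unary RPC takes a plain request and returns a plain response, while the streaming variants return dedicated stream handles.

type TaskManagerClient interface {
	// Unary RPC (request-response)
	GetSummary(ctx context.Context, in *Employee, opts ...grpc.CallOption) (*SpecificSummary, error)
	AddTask(ctx context.Context, in *Task, opts ...grpc.CallOption) (*SpecificSummary, error)
	// Client streaming RPC: send many Tasks, receive one Summary
	AddTasks(ctx context.Context, opts ...grpc.CallOption) (TaskManager_AddTasksClient, error)
	// Server streaming RPC: send one Employee, receive many Tasks
	GetTasks(ctx context.Context, in *Employee, opts ...grpc.CallOption) (TaskManager_GetTasksClient, error)
	// Bidirectional streaming RPC: send and receive Tasks concurrently
	ChangeToDone(ctx context.Context, opts ...grpc.CallOption) (TaskManager_ChangeToDoneClient, error)
}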

Development workflow

In order to adopt gRPC across teams, we’ll need:

  • A centralised repo (for the gRPC spec shared between services)
  • Automated code generation
  • A way for service users (clients) to consume the generated code via the package manager for their language of choice, e.g. go get / pip install

The code for this example can be found in these repos:

- https://github.com/redcranetech/grpcspec-example
- https://github.com/redcranetech/grpc-go-example
- https://github.com/redcranetech/grpc-python-example

The structure of the repo:

.
├── HISTORY.md
├── Makefile
├── README.md
├── genpyinit.sh
├── gogrpcspec        // go generated code here
│   └── ...
├── proto
│   └── todo.proto
├── pygrpcspec        // python generated code here
│   └── ...
└── setup.py

git hooks

I’m going to set up a githook to automatically generate everything before committing. You can use a CI (Drone/GitLab/Jenkins/…) if that suits you better. (The downside of using githooks is that every developer needs to configure the githook first.)

You’ll need a directory (folder) to keep the pre-commit script. I’m calling it “.githooks”.

$ mkdir .githooks
$ cd .githooks/
$ cat <<EOF > pre-commit
#!/bin/sh
set -e
make generate
git add gogrpcspec pygrpcspec
EOF
$ chmod +x pre-commit

The pre-commit script triggers the Makefile and git adds the two generated directories (gogrpcspec, pygrpcspec).

In order for the githooks to work, each developer must run this git config command:

$ git config core.hooksPath .githooks

We’ll add this command to a Makefile so developers can run it easily (by calling “make init”). The content of the Makefile should look like this:

# content of: Makefile
init:
	git config core.hooksPath .githooks

generate:
	# TO BE CONTINUE

Generating code

We’ve set up githooks to run the Makefile (“make generate”). Let’s dive into the commands that will automatically generate the code. This article focuses on two languages: Go and Python.

Generating go code

We can compile the .proto files into Go code using protoc:

protoc --go_out=plugins=grpc:. $(pwd)/proto/*.proto \
    --proto_path=$(pwd)

We’ll use protoc through Docker instead, for consistency across developers’ machines:

docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR} \
    znly/protoc \
    --go_out=plugins=grpc:. \
    ${CURDIR}/proto/*.proto \
    --proto_path=${CURDIR}

Take a look at the generate target below (it removes the previously generated code, generates fresh code, and then moves it into the proper folder):

# content of: Makefile
init:
	git config core.hooksPath .githooks

generate:
	# remove previously generated code
	rm -rf gogrpcspec/*

	# generate go code
	docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR} \
		znly/protoc \
		--go_out=plugins=grpc:. \
		${CURDIR}/proto/*.proto \
		--proto_path=${CURDIR}

	# move generated code into gogrpcspec folder
	mv proto/*.go gogrpcspec

After the code is generated, developers who want to use it, whether as a server stub or as a client for calling the service, can download it with the go get command:

go get -u github.com/redcranetech/grpcspec-example

Then use it:

import pb "github.com/redcranetech/grpcspec-example/gogrpcspec"
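
A minimal client sketch using that import (it assumes a TaskManager server is listening on localhost:50051; grpc.WithInsecure() is used only to keep the example short):

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"

	pb "github.com/redcranetech/grpcspec-example/gogrpcspec"
)

func main() {
	// WithInsecure keeps the sketch short; use real credentials in production.
	conn, err := grpc.Dial("localhost:50051", grpc.WithInsecure())
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := pb.NewTaskManagerClient(conn)
	summary, err := client.GetSummary(context.Background(), &pb.Employee{Name: "alice"})
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("todo=%d doing=%d done=%d",
		summary.GetSummary().GetTodoTasks(),
		summary.GetSummary().GetDoingTasks(),
		summary.GetSummary().GetDoneTasks())
}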

Generating python code

We can compile the .proto files into Python code using protoc:

protoc --plugin=protoc-gen-grpc=/usr/bin/grpc_python_plugin \
    --python_out=./pygrpcspec \
    --grpc_out=./pygrpcspec \
    $(pwd)/proto/*.proto \
    --proto_path=$(pwd)

Again, we’ll use protoc through Docker instead:

docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR} \
    znly/protoc \
    --plugin=protoc-gen-grpc=/usr/bin/grpc_python_plugin \
    --python_out=./pygrpcspec \
    --grpc_out=./pygrpcspec \
    ${CURDIR}/proto/*.proto \
    --proto_path=${CURDIR}

To turn the generated code into a Python package installable via pip, we need a few extra steps:

  • Create a setup.py
  • Modify the generated code (the generated code imports by folder name; we’ll change it to relative imports)
  • Add an “__init__.py” to each folder, exposing the generated code

Create a setup.py file with this template:

# content of: setup.py
from setuptools import setup, find_packages

with open('README.md') as readme_file:
    README = readme_file.read()

with open('HISTORY.md') as history_file:
    HISTORY = history_file.read()

setup_args = dict(
    name='pygrpcspec',
    version='0.0.1',
    description='grpc spec',
    long_description_content_type="text/markdown",
    long_description=README + '\n\n' + HISTORY,
    license='MIT',
    packages=['pygrpcspec', 'pygrpcspec.proto'],
    author='Napon Mekavuthikul',
    author_email='napon@redcranetech.com',
    keywords=['grpc'],
    url='https://github.com/redcranetech/grpcspec-example',
    download_url=''
)

install_requires = [
    'grpcio>=1.21.0',
    'grpcio-tools>=1.21.0',
    'protobuf>=3.8.0'
]

if __name__ == '__main__':
    setup(**setup_args, install_requires=install_requires)

Generate __init__.py

The __init__.py for the pygrpcspec folder has to be:

# content of: pygrpcspec/__init__.py
from . import proto

__all__ = [
    'proto'
]

And the __init__.py for the pygrpcspec/proto folder has to be:

# content of: pygrpcspec/proto/__init__.py
from . import todo_pb2
from . import todo_pb2_grpc

__all__ = [
    'todo_pb2',
    'todo_pb2_grpc',
]

To let developers add more .proto files and still auto-generate the __init__.py files, a simple shell script does the job:

# content of: genpyinit.sh
cat <<EOF >pygrpcspec/__init__.py
from . import proto
__all__ = [
    'proto'
]
EOF

pyfiles=($(ls pygrpcspec/proto | sed -e 's/\..*$//' | grep -v __init__))
rm -f pygrpcspec/proto/__init__.py
for i in "${pyfiles[@]}"
do
    echo "from . import $i" >> pygrpcspec/proto/__init__.py
done
echo "__all__ = [" >> pygrpcspec/proto/__init__.py
for i in "${pyfiles[@]}"
do
    echo "    '$i'," >> pygrpcspec/proto/__init__.py
done
echo "]" >> pygrpcspec/proto/__init__.py

Modifying the generated code

(You can skip this part if you’re not familiar with Python modules.)

We want to change every “from proto import” into “from . import”. The reason is that we put both the data types and the service stubs in the same directory, and for the package to be importable from outside, every internal import has to be relative.

sed -i -E 's/^from proto import/from . import/g' *.py

At this point your Makefile should look like this:

# content of: Makefile
init:
	git config core.hooksPath .githooks

generate:
	# remove previously generated code
	rm -rf gogrpcspec/*

	# generate go code
	docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR} \
		znly/protoc \
		--go_out=plugins=grpc:. \
		${CURDIR}/proto/*.proto \
		--proto_path=${CURDIR}

	# move generated code into gogrpcspec folder
	mv proto/*.go gogrpcspec

	# remove previously generated code
	rm -rf pygrpcspec/*

	# generate python code
	docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR} \
		znly/protoc \
		--plugin=protoc-gen-grpc=/usr/bin/grpc_python_plugin \
		--python_out=./pygrpcspec \
		--grpc_out=./pygrpcspec \
		${CURDIR}/proto/*.proto \
		--proto_path=${CURDIR}

	# generate __init__.py
	sh genpyinit.sh

	# modify import using sed
	docker run --rm -v ${CURDIR}:${CURDIR} -w ${CURDIR}/pygrpcspec/proto \
		frolvlad/alpine-bash \
		bash -c "sed -i -E 's/^from proto import/from . import/g' *.py"

After the code is generated, developers who want to use it, whether as a server stub or as a client for calling the service, can install it with pip:

pip install -e git+https://github.com/redcranetech/grpcspec-example.git#egg=pygrpcspec

Then use it:

from pygrpcspec.proto import todo_pb2_grpc
from pygrpcspec.proto import todo_pb2

To sum up, gRPC is an excellent way of communicating between micro-services: protobuf’s strict syntax compiles into server and client code in many different languages, so development time goes into business logic instead of agreements and formatting.

All the code in this article:
- https://github.com/redcranetech/grpcspec-example
- https://github.com/redcranetech/grpc-go-example
- https://github.com/redcranetech/grpc-python-example
