Introduction to API gateways for microservices using Express Gateway
Adopting a microservices architecture brings agility to software development and allows faster feature delivery. Each microservice is small enough to be developed and maintained by a couple of developers, and the small footprint also allows faster boot and shutdown times. However, adopting microservices introduces a new set of challenges. One of them is communication between microservices.
Communication between microservices
- API-based communication
- Event-based communication
In the API-based communication pattern, microservices talk to each other through API endpoints. As long as each microservice knows the URL and message format of the others, communication is straightforward. However, having every microservice know the exact location of every other one is not feasible. To illustrate this, consider the simple application below.
A simple application
Let us consider a simple application consisting of two microservices:
- Customer service: stores and fetches customer information. It runs on port 3000.
- Product service: stores and fetches product information; for this example think of sports products (a cricket bat, a cricket ball). It runs on port 4000.
On a single machine the two services can communicate with each other using well-known REST endpoints. This solution does not scale, however, once we run multiple instances of each microservice or deploy to a cloud environment. In the cloud you would typically run each service on a separate machine, usually a virtual machine or a container that comes up dynamically, so it is not possible for each service to know the location of the others.
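To make the coupling concrete, here is a minimal sketch (the helper and the service table are hypothetical, not part of the application we build below) of what direct addressing looks like: every caller hard-codes the host and port of every callee.

```javascript
// Sketch: direct addressing. Every consumer must know each service's
// host and port, which breaks down once instances move or multiply.
const SERVICES = {
  customer: 'http://localhost:3000', // customer service, fixed port
  product: 'http://localhost:4000',  // product service, fixed port
};

function serviceUrl(name, path) {
  // e.g. serviceUrl('customer', '/api/Customers')
  //   -> 'http://localhost:3000/api/Customers'
  return SERVICES[name] + path;
}
```

If the customer service moves to another host or port, every service holding such a table must be updated and redeployed; this is exactly the problem an API gateway removes.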
User interface development
User interface development becomes difficult (in a microservices context) when the UI needs to talk to multiple microservices, because calling each microservice requires knowing the machine and port it runs on.
In a cloud environment a microservice may have multiple instances running on different hosts, and it is simply not possible for the user interface to keep track of all the service instances.
API gateway
An API gateway is the solution. The user interface can make API calls without being concerned about where the actual microservices run: it always points at the API gateway URL, and the gateway internally routes each request to the correct downstream microservice.
A typical scenario
- The user interface makes an API request to fetch customer information. Instead of communicating directly with the Customer microservice, the UI code calls the API gateway URL http://localhost:8080/api/customers
- The API gateway is configured to be aware of the underlying microservices and routes the request to http://localhost:3000/api/customers
- When the user interface wants to fetch product information, instead of talking directly to the Product microservice, the UI code calls the gateway URL http://localhost:8080/api/products
- The gateway, aware of the underlying microservices (Product in this case), routes the request to http://localhost:4000/api/products
- The advantage: the user interface is decoupled from the underlying microservices.
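From the UI side, the scenario above can be sketched as follows (apiUrl is a hypothetical helper; the gateway address is assumed to be http://localhost:8080 as in the scenario):

```javascript
// Sketch: the UI knows only the gateway's address. Which microservice
// actually serves the request is decided inside the gateway.
const GATEWAY = 'http://localhost:8080';

function apiUrl(resource) {
  // e.g. apiUrl('customers') -> 'http://localhost:8080/api/customers'
  return `${GATEWAY}/api/${resource}`;
}

// Typical browser usage (not executed here):
// fetch(apiUrl('customers')).then(res => res.json()).then(showCustomers);
// fetch(apiUrl('products')).then(res => res.json()).then(showProducts);
```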
Actual implementation
Here I would like to demonstrate the use of an API gateway with a concrete example.
For microservice development I will use LoopBack, an open source framework. Why LoopBack? The answer is simple:
- LoopBack allows easy definition of a model and can scaffold APIs from it. In that respect it is comparable to something like Mongoose (which defines models for MongoDB).
- LoopBack is database agnostic. You can connect to a NoSQL database such as MongoDB or an RDBMS such as PostgreSQL just as easily.
- OK, let's get rolling.
Installation
- Installation is straightforward. LoopBack provides simple command line tools. I install the CLI globally so that I can use it across multiple microservices.
npm install -g loopback-cli
- One point to note: StrongLoop (the company that developed LoopBack) has since been acquired by IBM, but LoopBack remains open source. Here we use only the basic CLI; we do not need to install IBM tools such as API Connect, nor the full strongloop toolkit (npm install -g strongloop).
Create a LoopBack application
- With installation done, we can scaffold a simple application: first the Customer application, then the Product application. Type the following command to start creating the customer application.
lb
The CLI will ask a few questions:
- Name of the application: it defaults to the directory name; keep it as is.
- Which version of LoopBack: here I choose the 2.x LTS version.
- What kind of application do you have in mind: empty-server. My objective in using LoopBack is just to quickly scaffold a REST API; we will subsequently configure a Customer model and a data source for connecting to the database.
- Once you have completed these steps, the CLI automatically runs npm install. Have a look at the folder structure that has been created for our application.
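For reference, the scaffolded layout looks roughly like this (the exact contents vary with the CLI version and chosen template; the directory name customer-service is assumed here):

```
customer-service/
├── client/                  # placeholder for client-side assets
├── server/
│   ├── boot/                # scripts executed at startup
│   ├── config.json          # server settings (port, REST API root, ...)
│   ├── datasources.json     # database connection settings
│   ├── middleware.json      # Express middleware configuration
│   ├── model-config.json    # which models are exposed
│   └── server.js            # application entry point
└── package.json
```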
Files of interest for API development
- server/model-config.json: specifies which models are configured in the system. When the LoopBack application starts up, it automatically loads the model files and configures REST endpoints for them.
- server/config.json: specifies various settings used by the LoopBack application. In our context the important parameter is the port on which the HTTP server starts. For the Customer service the port will be 3000 (the default value when the application is scaffolded, so we keep it as is).
- server/datasources.json: specifies how the LoopBack application connects to the database. For our example microservices (Customer and Product) we will connect to MongoDB. LoopBack provides connectors for many databases (e.g. MongoDB, PostgreSQL); we just need to specify the connector in datasources.json. This is how we specify the data source connection in datasources.json:
{
  "mongodb": {
    "host": "localhost",
    "port": 27017,
    "url": "",
    "database": "masterdb",
    "password": "",
    "name": "mongodb",
    "user": "",
    "connector": "mongodb"
  }
}
- Ideally we should have a separate database for the Customer and Product microservices. For practical purposes, however, I create only one database here.
Customer model
- A LoopBack model is a simple JSON file. In this example I create a simple Customer model with basic attributes: firstName, lastName, phone, and date of birth (dob).
{
  "name": "Customer",
  "description": "A Customer model representing our customers.",
  "base": "PersistedModel",
  "idInjection": true,
  "properties": {
    "firstName": {
      "type": "String",
      "required": true
    },
    "lastName": {
      "type": "String",
      "required": true
    },
    "phone": {
      "type": "number",
      "required": true
    },
    "dob": {
      "type": "date",
      "required": true
    }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
- A couple of important points about the LoopBack model:
- The base property specifies the base class for the model; here we use PersistedModel. With this base, LoopBack saves the model to the database when the REST API is called with POST/PUT methods and fetches it from the database on GET. This saves the developer from writing a lot of boilerplate database-interaction code.
- idInjection, if set to true, automatically adds an id property to the model, so the developer need not add it manually.
Finish creating the Customer model and API
- We need to configure the newly created model so that LoopBack can load it and scaffold an API for it.
- model-config.json is the file in which models are configured. It expects models in the common/models or models directory.
- Save the Customer.json we created into the common/models directory. Note that the common directory sits parallel to the server and client directories.
- The entry for the Customer model inside model-config.json looks as follows.
"Customer": {
  "dataSource": "mongodb",
  "public": true
}
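Putting it together, the full server/model-config.json would look roughly like this (the _meta block below is what the scaffolding typically generates; note that ../common/models is already on the model source path, which is why saving Customer.json under common/models works):

```json
{
  "_meta": {
    "sources": [
      "loopback/common/models",
      "loopback/server/models",
      "../common/models",
      "./models"
    ],
    "mixins": [
      "loopback/common/mixins",
      "loopback/server/mixins",
      "../common/mixins",
      "./mixins"
    ]
  },
  "Customer": {
    "dataSource": "mongodb",
    "public": true
  }
}
```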
Start the application and check the Customer service
Our Customer microservice is ready. Go to the directory from which we ran the lb command and run the following command.
node .
On the console you will see the output below. LoopBack comes with Swagger bundled by default, so we can test the API for the Customer model there; we can also use curl or Postman to test our APIs.
Web server listening at: http://localhost:3000
Browse your REST API at http://localhost:3000/explorer
We can create a new customer:
curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' \
  -d '{
    "firstName": "Kevin",
    "lastName": "systrom",
    "phone": 654231291,
    "dob": "1983-12-30T16:36:26.847Z",
    "id": "c2"
  }' 'http://localhost:3000/api/Customers'
Response Body
{
"firstName": "Kevin",
"lastName": "systrom",
"phone": 654231291,
"dob": "1983-12-30T16:36:26.847Z",
"id": "c2"
}
We can verify that the customer was indeed created using the GET method:
curl -X GET --header 'Accept: application/json' 'http://localhost:3000/api/Customers'
Response Body
[
{
"firstName": "Vishal",
"lastName": "Sundar",
"phone": 9890820268,
"dob": "2000-09-30T09:35:22.411Z",
"id": "c1"
},
{
"firstName": "Kevin",
"lastName": "systrom",
"phone": 654231291,
"dob": "1983-12-30T16:36:26.847Z",
"id": "c2"
}
]
The newly created customer exists (id: c2). The response shows two customers because the first one already existed in the database and our request did not specify any filter.
Our first microservice (Customer) is ready; time to get the second one ready. The steps are exactly the same.
- Go to a separate directory, product-service.
- Create an empty LoopBack application using the CLI (command lb).
- Add a new model, Products.
- The Products model holds a reference to the customer id. This model is for illustration only; a real-world model would have many more fields.
{
  "name": "Products",
  "description": "Maintains a list of products",
  "base": "PersistedModel",
  "idInjection": true,
  "properties": {
    "productName": {
      "type": "String",
      "required": true
    },
    "productDesc": {
      "type": "String",
      "required": true
    },
    "customerId": {
      "type": "String",
      "required": true
    }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
Make an entry for the database in datasources.json. For simplicity we will point to the same local MongoDB database. Below is the entry for server/datasources.json.
{
  "db": {
    "host": "localhost",
    "port": 27017,
    "url": "",
    "database": "masterdb",
    "password": "",
    "name": "mongodb",
    "user": "",
    "connector": "mongodb"
  }
}
Make the Products model entry in server/model-config.json:
"Products": {
  "dataSource": "db",
  "public": true
}
We will start the second microservice (Product) on a different port. The first microservice was started on port 3000; here we start the service on port 4000, so in config.json the port will be 4000 for the product application.
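A minimal sketch of the product service's server/config.json (only the relevant keys are shown; the scaffolded file contains additional settings). restApiRoot is what gives our endpoints the /api prefix:

```json
{
  "restApiRoot": "/api",
  "host": "0.0.0.0",
  "port": 4000
}
```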
Start the Product service and use the APIs to create product data
As with the Customer service, start the Product service: navigate to the product-service directory and enter the command below.
node .
You will get the following output on the console:
Web server listening at: http://localhost:4000
Browse your REST API at http://localhost:4000/explorer
Create two new products for an existing customer (id: c2):
curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' \
  -d '{
    "productName": "Cricket bat",
    "productDesc": "SG cobra select",
    "customerId": "c2",
    "id": "p3"
  }' 'http://localhost:4000/api/Products'

curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' \
  -d '{
    "productName": "Cricket ball",
    "productDesc": "Kookaburra ball",
    "customerId": "c2",
    "id": "p4"
  }' 'http://localhost:4000/api/Products'
Now query the products for c2. Note that the filter is URL-encoded in the curl request; the actual filter value is {"where":{"customerId":"c2"}}. We are querying the products of the customer with id c2.
curl -X GET --header 'Accept: application/json' 'http://localhost:4000/api/Products?filter=%20%7B%22where%22%3A%7B%20%22customerId%22%3A%20%22c2%22%20%7D%7D'
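If you are building such a request from JavaScript rather than copying it from Swagger, the encoded filter can be produced with standard functions. The result is equivalent to the filter above, minus insignificant whitespace:

```javascript
// Sketch: constructing the URL-encoded LoopBack "where" filter.
const filter = { where: { customerId: 'c2' } };

// encodeURIComponent escapes {, }, " and : for use in a query string.
const query = 'filter=' + encodeURIComponent(JSON.stringify(filter));

const url = 'http://localhost:4000/api/Products?' + query;
```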
The response contains both product records:
[
  {
    "productName": "Cricket bat",
    "productDesc": "SG cobra select",
    "customerId": "c2",
    "id": "p3"
  },
  {
    "productName": "Cricket ball",
    "productDesc": "Kookaburra ball",
    "customerId": "c2",
    "id": "p4"
  }
]
Both microservices are ready, and now we can implement the API gateway pattern. We will use Express Gateway (https://www.express-gateway.io/) as the API gateway.
- It is built on top of Express.js and is simple to use. The basic scenarios in our use case can be implemented with configuration alone.
- It also offers credential, identity, and authorization management; I have not explored that part in this article.
Installation of Express Gateway
Installation is quite straightforward:
npm install -g express-gateway
Create a gateway application using the CLI tool. This gateway will be responsible for microservices communication:
eg gateway create
Like the lb command, the eg command prompts with a set of questions. Choose "Getting Started with Express Gateway".
$ eg gateway create
? What is the name of your Express Gateway? api-gateway
? Where would you like to install your Express Gateway? api-gateway
? What type of Express Gateway do you want to create? (Use arrow keys)
❯ Getting Started with Express Gateway
  Basic (default pipeline with proxy)
How to start Express Gateway
npm start
The API gateway starts on the port defined in the gateway.config.yml (or gateway.config.json) file. The default port is 8080.
The most important file
The most important file here is the gateway configuration file. By default it ships in YAML format as gateway.config.yml. Personally I find the YAML structure difficult to manage; the gateway also accepts the file in JSON format, which is easier to edit and verify. Instead of deep-diving into the file structure, let us revisit the use case we are trying to implement and then come back to gateway.config.json.
Use case 1: from the client (or UI), I want the ability to call multiple microservices
Problem statement
- The UI makes a request to the API gateway on port 8080 for the Customer service; the gateway routes it to the correct endpoint (the Customer microservice).
- The UI makes a request to the API gateway on port 8080 for the Product service; the gateway routes it to the correct endpoint (the Product microservice).
Express Gateway defines two types of endpoints:
API endpoints: used by clients to make API requests.
Service endpoints: the endpoints of the actual microservices; in our case, the endpoints of the Product and Customer services.
gateway.config.json
{
  "http": {
    "port": 8080
  },
  "admin": {
    "port": 9876,
    "hostname": "localhost"
  },
  "apiEndpoints": {
    "api": {
      "host": "localhost",
      "paths": "/ip"
    },
    "cust": {
      "host": "localhost",
      "paths": "/api/customers/*"
    },
    "product": {
      "host": "localhost",
      "paths": "/api/products"
    }
  },
  "serviceEndpoints": {
    "httpbin": {
      "url": "https://httpbin.org"
    },
    "custsrv": {
      "url": "http://localhost:3000/"
    },
    "prodsrv": {
      "url": "http://localhost:4000/"
    }
  },
  "policies": [
    "basic-auth",
    "key-auth",
    "cors",
    "expression",
    "log",
    "oauth2",
    "proxy",
    "rate-limit"
  ],
  "pipelines": [
    {
      "name": "default",
      "apiEndpoints": [
        "api"
      ],
      "policies": [
        {
          "proxy": [
            {
              "action": {
                "serviceEndpoint": "httpbin",
                "changeOrigin": true
              }
            }
          ]
        }
      ]
    },
    {
      "name": "default-1",
      "apiEndpoints": [
        "cust"
      ],
      "policies": [
        {
          "proxy": [
            {
              "action": {
                "serviceEndpoint": "custsrv"
              }
            }
          ]
        }
      ]
    },
    {
      "name": "default-2",
      "apiEndpoints": [
        "product"
      ],
      "policies": [
        {
          "proxy": [
            {
              "action": {
                "serviceEndpoint": "prodsrv"
              }
            }
          ]
        }
      ]
    }
  ]
}
Explanation
- There are two API endpoints defined to support our use case: cust and product. This simply means the gateway watches for any request coming to http://localhost:8080/api/customers/* or http://localhost:8080/api/products.
"cust": {
"host": "localhost",
"paths": "/api/customers/*"
},
"product": {
"host": "localhost",
"paths": "/api/products"
}
Port 8080 comes from the following setting at the top of the JSON file:
"http": {
"port": 8080
},
- How do the downstream microservices get called?
- There are three service endpoints defined here. The first one (httpbin) comes from the Express Gateway getting-started tutorial. The custsrv endpoint defines the URL for the Customer microservice, and prodsrv defines the URL for the Product microservice.
"serviceEndpoints": {
"httpbin": {
"url": "https://httpbin.org"
},
"custsrv": {
"url": "http://localhost:3000/"
},
"prodsrv": {
"url": "http://localhost:4000/"
}
}
- Wiring API endpoints to service endpoints. The last piece of work is to wire the API endpoints to the service endpoints, which is done through a construct called a pipeline. Here is the one for the customer. A pipeline allows multiple policies to be applied to a request (or API endpoint). The policy we use here is proxy, and we supply an action parameter. The result is that the incoming request is forwarded to the Customer microservice.
{
"name": "default-1",
"apiEndpoints": [
"cust"
],
"policies": [ {
"proxy": [
{
"action": {
"serviceEndpoint": "custsrv"
}
}
]
}
]
}
- A similar pipeline is written for the Product service:
{
"name": "default-2",
"apiEndpoints": [
"product"
],
"policies": [
{
"proxy": [
{
"action": {
"serviceEndpoint": "prodsrv"
}
}
]
}
]
}
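Conceptually, the two pipelines give the gateway a small path-prefix routing table. The sketch below only illustrates that behavior; it is not Express Gateway's internal implementation:

```javascript
// Illustration: the routing the two pipelines effectively configure.
// Incoming gateway paths are matched by prefix and forwarded to the
// corresponding downstream service.
const routes = [
  { prefix: '/api/customers', target: 'http://localhost:3000' },
  { prefix: '/api/products', target: 'http://localhost:4000' },
];

function resolve(path) {
  const route = routes.find(r => path.startsWith(r.prefix));
  return route ? route.target + path : null;
}
```

Adding a third microservice then amounts to adding one apiEndpoint, one serviceEndpoint, and one pipeline; no client code changes.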
Summary
- The API gateway is one of the important patterns in microservices architecture. More details on the pattern can be found at https://microservices.io/patterns/apigateway.html
- An API gateway addresses many other client requirements as well (such as fanning a single request out to multiple services).
- In this article we looked at the simple use case where the client wanted a single point of entry.