Aurora Solutions
Jan 2

An Introduction To REST

REpresentational State Transfer, commonly known by the acronym REST, is an architectural style for building web services. The term was coined by Roy Fielding in 2000 in his doctoral dissertation, Architectural Styles and the Design of Network-based Software Architectures.

Any web service that conforms to this architectural style can be called a RESTful web service. In fact, RESTful web services have become the de facto method for integrating applications: many social media networks, enterprise applications and services expose REST interfaces so that third-party applications can integrate with them.

Advantages of RESTful Web Services

There are many advantages to using RESTful web services over other conventional ways of sharing information, such as SOAP-based services or sharing a JAR.

REST Utilizes The HTTP Protocol


In order to access a RESTful interface, all you need is an HTTP-compatible client, which can be as simple as the cURL utility or a web browser. From a developer's point of view, as long as the programming language can issue HTTP requests, writing a REST client is straightforward.
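To illustrate just how little a REST client needs, here is a minimal sketch using only the Python standard library. The tiny in-process server and the `/widgets/1` resource are hypothetical stand-ins for any real REST endpoint:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical stand-in for a real REST endpoint.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"id": 1, "name": "widget"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" is nothing more than a plain HTTP GET plus JSON parsing.
with urlopen(f"http://127.0.0.1:{server.server_port}/widgets/1") as resp:
    resource = json.loads(resp.read())

print(resource["name"])  # -> widget
server.shutdown()
```

The same call could be made from the command line with `curl http://host/widgets/1`; no SDK or stub generation is required.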

Flexibility And Low Maintenance Cost

It is easy to maintain and add new methods or APIs at the REST server end without breaking the existing APIs. RESTful web services can be hosted in a standalone web container such as Tomcat or on cloud platforms such as AWS, which contributes to low application maintenance costs.

Content Negotiation

It is up to the REST server to decide which media types it supports. The same data can be represented in multiple formats such as JSON, plain text or XML. The REST client sets an Accept header in the request so that the appropriate media type is returned.
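A minimal sketch of content negotiation, again using only the Python standard library (the resource and its two representations are illustrative): the server inspects the client's Accept header and serializes the same data as either JSON or plain text.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

DATA = {"id": 7, "status": "active"}  # hypothetical resource

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Negotiate: same data, representation chosen from the Accept header.
        accept = self.headers.get("Accept", "application/json")
        if "text/plain" in accept:
            body = f"id={DATA['id']} status={DATA['status']}".encode()
            ctype = "text/plain"
        else:
            body = json.dumps(DATA).encode()
            ctype = "application/json"
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/things/7"

as_json = urlopen(Request(url, headers={"Accept": "application/json"})).read()
as_text = urlopen(Request(url, headers={"Accept": "text/plain"})).read()
print(json.loads(as_json)["status"])  # -> active
print(as_text.decode())               # -> id=7 status=active
server.shutdown()
```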


Statelessness

Since REST is stateless, the REST server does not have to save the client's state between requests; the client passes its state along with each request. As a result, a REST server can scale to thousands of concurrent users.
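One way to see what "the client passes around its state" means in practice is pagination: instead of the server remembering where each client is in a result set, every request carries the full state it needs. A small sketch (the request shape and field names are illustrative):

```python
def handle_list_orders(request: dict, orders: list) -> dict:
    """Stateless handler: pagination state travels in the request, not on the server."""
    offset = int(request.get("offset", 0))
    limit = int(request.get("limit", 2))
    page = orders[offset:offset + limit]
    # The client carries next_offset forward in its next request;
    # the server keeps nothing between calls.
    next_offset = offset + limit if offset + limit < len(orders) else None
    return {"items": page, "next_offset": next_offset}

orders = ["o1", "o2", "o3", "o4", "o5"]
first = handle_list_orders({"offset": 0, "limit": 2}, orders)
second = handle_list_orders({"offset": first["next_offset"], "limit": 2}, orders)
print(first["items"], second["items"])  # -> ['o1', 'o2'] ['o3', 'o4']
```

Because no per-client session lives on the server, any server instance behind a load balancer can answer any request, which is what makes horizontal scaling straightforward.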

Performance of RESTful Web Services

As an application scales up, it can face performance issues, so it is better to run load and performance tests from the start. Most software companies have dedicated testing teams that perform integration tests, white-box tests and system integration tests (SIT) to ensure that heavy usage of the exposed RESTful APIs by REST clients does not create bottlenecks.

Common Issues Found At REST Layer

Network Latency

No matter how well you design the REST layer, the speed of the network matters a lot. Network latency can therefore indirectly impact the responsiveness of the REST server.

Needless Data

Sometimes the REST layer is designed in haste, leading to unwanted data being sent back and forth between the REST server and its clients.

Business and Database Layer Performance

Even if the REST API itself is designed properly, performance can suffer from a poorly designed business-logic or DAO layer.


Security

Since RESTful web services work on top of HTTP, it is important to implement proper web service security to prevent unauthorized access, snooping and DDoS attacks.

Server Side Load

Too many concurrent client requests can drive the server load up, and if the server does not have enough capacity, the result is poor performance at the REST server end.

The Need For Benchmarking

Regardless of the language in which the RESTful web service is written, it is good practice to use a benchmarking tool so that issues that could become showstoppers in production are found and dealt with beforehand.

This is recommended because performance issues that go unnoticed and find their way into the customer environment can tarnish the reputation of the software. Moreover, getting access to the production site and debugging performance there is time-consuming and tedious, since multiple levels of support are involved in a real-world scenario. Hence most API development teams use benchmarking tools to evaluate the performance of the application before release.

Some of the well-known benchmarking tools are:

  • Apache JMeter
  • wrk
  • http_load

Tackling Performance Issues In Production

It is quite possible for performance issues to creep into a production environment, since neither perfect software nor a perfect testing methodology exists. If this happens despite thorough testing before release, it is not a cause for panic. In this section we look at ways to handle performance issues of RESTful APIs in a production environment.

Enable Debug Logs

To find out what could be wrong, we need access to the customer environment so we can enable debug logs at both the client and server ends. In cloud-based applications, where the RESTful web services are just one part of the application, a SaaS log-management solution such as Loggly can be used, which makes debugging cluster-based deployments easier.

Using a Profiler

Some teams opt to reuse the test suite or application built by the test team at the development site, if it allows timing the request and response. This is another way to check for performance issues at the client-server communication layer alone, but if the cause of the performance issue is deep-rooted rather than peripheral, it may not be enough.

In such cases, using a benchmarking tool such as JMeter is recommended, as it makes it easy to build up system load by simulating many users sending requests to the server. A profiler such as YourKit can then be used for CPU and memory profiling.
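The essence of what such a tool does can be sketched in a few lines: fire concurrent requests at the service and look at the latency distribution. This toy harness uses a local stand-in server (any real endpoint URL could be substituted), and is no replacement for JMeter or wrk, which handle ramp-up, think time and reporting:

```python
import statistics
import threading
import time
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/ping"  # stand-in endpoint

def timed_request(_):
    start = time.perf_counter()
    urlopen(url).read()
    return time.perf_counter() - start

# Simulate 20 concurrent clients issuing 100 requests in total.
with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = sorted(pool.map(timed_request, range(100)))

p95 = latencies[int(len(latencies) * 0.95) - 1]
print(f"median={statistics.median(latencies)*1000:.1f}ms p95={p95*1000:.1f}ms")
server.shutdown()
```

Watching the tail percentiles (p95, p99) rather than the average is usually what exposes a bottleneck.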

Best Practices

Here are some best practices that can be followed to ensure minimal issues in production:


Use PATCH For Partial Updates

PUT requires the whole resource representation to be sent to the REST server. If the client only needs to update a small subset of the resource, it is advisable to use PATCH instead, so that network bandwidth is not eaten up by needlessly large request bodies.
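The difference shows up directly in the payload size. The sketch below constructs (but does not send) a PUT and a PATCH request for a hypothetical user resource; the URL and field names are illustrative:

```python
import json
from urllib.request import Request

# Full resource as the server knows it (hypothetical example).
user = {"id": 42, "name": "Ada", "email": "ada@example.com",
        "bio": "A" * 2000, "preferences": {"theme": "dark", "lang": "en"}}

# PUT must carry the complete representation, even for a one-field change.
put_body = json.dumps({**user, "email": "ada@newmail.example"}).encode()
put = Request("https://api.example.com/users/42", data=put_body, method="PUT")

# PATCH carries only the fields that changed.
patch_body = json.dumps({"email": "ada@newmail.example"}).encode()
patch = Request("https://api.example.com/users/42",
                data=patch_body, method="PATCH")

print(len(put_body), len(patch_body))  # PATCH body is a fraction of PUT's
```

Over thousands of clients performing small updates, that per-request saving adds up to a noticeable reduction in bandwidth and server parsing work.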

Compressed Data

If the REST resource is fairly large, it is better to allow compressed data to be sent as the response. At the client end, all you need to do is set the Accept-Encoding header to gzip. After retrieving the data, extra CPU cycles are needed at the client end to decompress it, but this is still worth considering, as it relieves some of the load on the server and the network.
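The trade-off is easy to demonstrate with the standard library's gzip module. In this sketch the HTTP header handling is omitted; a real server would compress only when the request carried `Accept-Encoding: gzip`:

```python
import gzip
import json

# A fairly large, repetitive resource, typical of list endpoints.
resource = json.dumps(
    [{"id": i, "status": "active"} for i in range(500)]
).encode()

# Server side: compress the response body before sending it on the wire.
compressed = gzip.compress(resource)

# Client side: spend a few CPU cycles to decompress the payload.
restored = gzip.decompress(compressed)

assert restored == resource
print(f"{len(resource)} bytes -> {len(compressed)} bytes on the wire")
```

Repetitive JSON compresses extremely well, so the bandwidth saving usually dwarfs the decompression cost on the client.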

Enable Partial Responses

It is a good practice to design the REST server and client such that the client can request, and the server can return, only part of a resource's information.
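A common (though non-standard) convention is a `fields` query parameter, e.g. `GET /users/7?fields=name,email`. A minimal server-side sketch of that idea, with an illustrative resource:

```python
from typing import Optional

def partial_response(resource: dict, fields_param: Optional[str]) -> dict:
    """Return only the requested fields, e.g. for ?fields=name,email."""
    if not fields_param:
        return resource  # no filter requested: full representation
    wanted = {f.strip() for f in fields_param.split(",")}
    return {k: v for k, v in resource.items() if k in wanted}

user = {"id": 7, "name": "Ada", "email": "ada@example.com",
        "bio": "a very long biography...", "avatar": "base64..."}
print(partial_response(user, "name,email"))
# -> {'name': 'Ada', 'email': 'ada@example.com'}
```

This keeps large, rarely needed fields (biographies, avatars, nested collections) off the wire unless a client explicitly asks for them.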

Enable Caching at REST Server

If some data is frequently accessed, then caching it at the server would have a positive impact on REST server performance in production.
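The idea can be sketched as a small TTL cache sitting in front of an expensive backend call; the function names are illustrative, and this toy version is not thread-safe, unlike a production cache such as Redis or memcached:

```python
import time

_cache = {}          # key -> (stored_at, value)
_TTL_SECONDS = 60.0  # how long a cached entry stays fresh

def fetch_from_db(key):
    # Placeholder for a slow database/DAO call.
    return {"key": key, "value": key.upper()}

def get_resource(key, now=None):
    now = time.monotonic() if now is None else now
    hit = _cache.get(key)
    if hit and now - hit[0] < _TTL_SECONDS:
        return hit[1]               # fresh: served from the cache
    value = fetch_from_db(key)      # miss or stale: hit the backend
    _cache[key] = (now, value)
    return value

first = get_resource("order-1", now=0.0)
second = get_resource("order-1", now=10.0)   # within TTL: cached copy
stale = get_resource("order-1", now=120.0)   # TTL expired: refetched
print(first is second, first is stale)       # -> True False
```

Pairing a server-side cache like this with HTTP caching headers (Cache-Control, ETag) lets clients and intermediaries avoid repeated requests entirely.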


Conclusion

RESTful web services are widely used in the software industry to integrate multiple applications, thanks to their advantages over conventional information-exchange methods: they are simpler to use, easy to maintain and consume fewer resources in production. Although an application will have undergone a lot of both manual and automated testing before release, it is still possible for issues to crop up in production. In this article we looked at the common issues found at the REST layer, how to tackle them in production, and best practices to minimize them.
