Secure API Gateway From DDoS/DoS Attack Impacts In AWS

Swwapnil Pawar
The Security Chef
5 min read · Dec 1, 2022


Fig.1 API Gateway (PC: Amazon)

As you all know, Amazon API Gateway is a fully-managed service that enables developers to create, publish, maintain, monitor, and secure APIs at any scale. APIs act as the front door for applications to access data, business logic, or functionality from backend services. Using API Gateway, you can create RESTful APIs and WebSocket APIs that enable near real-time, two-way communication applications. API Gateway supports a variety of backend integrations, enabling containerized, serverless, and traditional instance-based workloads.

API Gateway handles all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls. This includes traffic management, cross-origin resource sharing (CORS) support, authorization and access control, throttling, monitoring, caching, and API version management.

In this article, we will look at how to prioritize security for API Gateway when building microservices or a three-tier serverless web architecture.

When designing an application, you must ensure that only authorized clients have access to its API resources. When designing a multi-tier application, you can take advantage of several different ways in which Amazon API Gateway contributes to securing your logic tier:

  1. Transit Security
  2. API authorization
  3. Access Restrictions
  4. Private APIs
  5. Firewall protection using AWS WAF

The mechanisms above all provide security for your APIs, but here we will look specifically at protecting APIs from, and mitigating the impact of, DDoS/DoS attacks.

The primary method for protecting APIs from DDoS impacts is known as rate limiting.

Rate limiting can do the following:

  • Prevent any particular client from exhausting application resources.
  • Protect your application instances from erratic and unpredictable spikes in the rate of client requests.

Rate limiting helps you prevent your API from being overwhelmed by too many requests. API Gateway throttles requests to your API using the token bucket algorithm, where each request consumes a token and the maximum bucket size is the burst limit. API Gateway sets limits on both the steady-state request rate and the burst of request submissions across all APIs in your account.
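
As a concrete illustration, the sketch below attaches a usage plan with rate and burst limits to an API stage. It assumes the boto3 SDK; the API ID, stage name, plan name, and limit values are placeholders, not values from this article.

```python
# Minimal sketch: attaching throttling limits to an API stage with a usage plan.
# The API ID, stage name, and limit values below are illustrative placeholders.
import boto3

apigateway = boto3.client("apigateway")

usage_plan = apigateway.create_usage_plan(
    name="basic-rate-limit",                                # hypothetical plan name
    description="Protects the API from request floods",
    apiStages=[{"apiId": "abc123def4", "stage": "prod"}],   # placeholder API/stage
    throttle={
        "rateLimit": 100.0,   # steady-state requests per second (token refill rate)
        "burstLimit": 200,    # maximum bucket size (tokens available for a burst)
    },
    quota={"limit": 100000, "period": "DAY"},               # optional daily request cap
)
print(usage_plan["id"])
```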

Let's understand more about the token bucket algorithm.

The token bucket algorithm provides an alternative to fair queuing for allocating traffic among several groups. The main practical difference between fair queuing and the token bucket is that if one sender is idle, fair queuing distributes that sender’s bandwidth among the other senders; the token bucket does not.

Fig.2 Token Bucket Algorithm

As shown in the diagram above, the idea behind a token bucket is that there is a notional bucket somewhere that is filled at a steady rate with tokens (or, if more divisibility is needed, with fluid); any overflow from the bucket is discarded. To send a packet, the sender must take one token from the bucket; if the bucket is empty, the packet is non-compliant and must receive special treatment, such as being delayed or dropped. If the bucket is full, however, the sender may send a burst of packets corresponding to the bucket capacity, at which point the bucket will be empty.
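
To make the mechanics concrete, here is a minimal, self-contained Python sketch of a token bucket. It is only an illustration of the algorithm described above, not API Gateway’s internal implementation; the rate and capacity values are arbitrary.

```python
# Illustrative token bucket: refill at a steady rate, allow bursts up to capacity.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate               # tokens added per second (steady-state rate)
        self.capacity = capacity       # maximum bucket size (burst limit)
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()

    def allow_request(self, cost: int = 1) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, discarding any overflow.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now
        if self.tokens >= cost:
            self.tokens -= cost        # compliant request: consume a token
            return True
        return False                   # empty bucket: request is throttled

bucket = TokenBucket(rate=100, capacity=200)   # 100 req/s steady state, bursts of 200
print(bucket.allow_request())
```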

There are a number of ways to implement rate limiting on your APIs. The various types of rate limits are processed in sequential order, as shown in Table 1.

If any of these limits is exceeded, Amazon API Gateway blocks the request and responds with HTTP status code 429 Too Many Requests.
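
Clients should treat a 429 as a signal to retry later rather than immediately. The sketch below shows a simple exponential-backoff wrapper using only the Python standard library; the endpoint URL, retry count, and delays are illustrative placeholders.

```python
# Sketch of a client that backs off when API Gateway returns 429 Too Many Requests.
import time
import urllib.request
import urllib.error

def call_with_backoff(url: str, max_retries: int = 5) -> bytes:
    delay = 0.5
    for attempt in range(max_retries):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code != 429:
                raise                  # only retry throttled requests
            time.sleep(delay)          # exponential backoff between retries
            delay *= 2
    raise RuntimeError("request still throttled after retries")

# call_with_backoff("https://abc123def4.execute-api.us-east-1.amazonaws.com/prod/items")
```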

Fig.3. API Gateway Security Practices

Pros:

  • The token bucket algorithm is very memory efficient.
  • The token bucket technique allows spikes or bursts of traffic: a request goes through as long as there are tokens left. This is important because traffic bursts are not uncommon; one example is an event like Amazon Prime Day, when traffic spikes for a certain period.

Cons:

  • A race condition may cause issues in a distributed system: two concurrent requests from the same user can each read the same token count and both be allowed, briefly exceeding the limit (see the sketch below).
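
One common way to avoid this race is to make the check-and-consume step atomic in a shared store. The sketch below is a general pattern, not something API Gateway exposes; it assumes a Redis backend with the redis-py client, uses a hypothetical key naming scheme, and omits the token-refill step for brevity.

```python
# Sketch: an atomic check-and-consume step for a shared token count in Redis,
# avoiding the race where two concurrent requests both read the same token count.
import redis

r = redis.Redis(host="localhost", port=6379)

# The Lua script runs atomically on the Redis server.
CONSUME_TOKEN = """
local tokens = tonumber(redis.call('GET', KEYS[1]) or ARGV[1])
if tokens > 0 then
    redis.call('SET', KEYS[1], tokens - 1)
    return 1
end
return 0
"""

def allow_request(client_id: str, capacity: int = 200) -> bool:
    key = f"bucket:{client_id}"        # hypothetical key scheme, one bucket per client
    return bool(r.eval(CONSUME_TOKEN, 1, key, capacity))
```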

Amazon CloudFront integration

Amazon CloudFront distributes traffic across multiple edge locations, and filters requests to help ensure that only valid requests will be forwarded to your API Gateway deployments. There are two ways to use CloudFront with API Gateway:

  • With an edge-optimized API Gateway endpoint, which delivers your API via a CloudFront distribution that is created and controlled by AWS
  • With a Regional API Gateway endpoint, which you can integrate with your own self-managed CloudFront distribution

When integrating CloudFront with Regional API endpoints, CloudFront supports geo-blocking, which you can use to help prevent requests from particular geographic locations from being served.
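
For a self-managed distribution, geo-blocking can be enabled through the CloudFront API. The sketch below assumes the boto3 SDK; the distribution ID and country codes are placeholders.

```python
# Sketch: enabling geo-blocking on a self-managed CloudFront distribution that
# fronts a Regional API endpoint.
import boto3

cloudfront = boto3.client("cloudfront")
DISTRIBUTION_ID = "E1EXAMPLE12345"     # hypothetical distribution ID

response = cloudfront.get_distribution_config(Id=DISTRIBUTION_ID)
config = response["DistributionConfig"]

# Block requests originating from the listed countries at the edge.
config["Restrictions"] = {
    "GeoRestriction": {
        "RestrictionType": "blacklist",
        "Quantity": 2,
        "Items": ["KP", "RU"],         # example ISO country codes
    }
}

cloudfront.update_distribution(
    Id=DISTRIBUTION_ID,
    IfMatch=response["ETag"],          # required: the current config's ETag
    DistributionConfig=config,
)
```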

API Gateway can be configured to accept requests only from CloudFront, using a few approaches. This can help prevent anyone from accessing your API Gateway deployment directly.

Methods include:

  • Requiring an API key to be validated for requests on API Gateway, which CloudFront can insert into the x-api-key header before forwarding the request to the origin, in this case API Gateway.
  • Requiring validation of a custom header (not x-api-key) with a known valid value for requests on API Gateway. CloudFront inserts the header and value on the request, and a Lambda request authorizer validates the presence of the expected header, returning a 403 Forbidden error if it is not present (see the sketch after this list).
  • Authenticating the user with AWS Lambda@Edge, then signing all requests with AWS request signing before sending the request to API Gateway. API Gateway uses AWS IAM-based authorization to validate the signature.
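
As a rough illustration of the second approach, the sketch below shows a Lambda request authorizer that allows a request only when a shared-secret header (injected by CloudFront as an origin custom header) matches the expected value. The header name, principal ID, and environment variable are assumptions for the example, not values prescribed by AWS.

```python
# Sketch of a Lambda request authorizer that checks a shared-secret header
# that CloudFront injects as an origin custom header before forwarding to API Gateway.
import os

EXPECTED_HEADER = "x-origin-verify"                    # hypothetical custom header name
EXPECTED_VALUE = os.environ.get("ORIGIN_SECRET", "")   # secret stored outside the code

def lambda_handler(event, context):
    headers = {k.lower(): v for k, v in (event.get("headers") or {}).items()}
    effect = "Allow" if headers.get(EXPECTED_HEADER) == EXPECTED_VALUE else "Deny"
    return {
        "principalId": "cloudfront-origin",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,                      # a Deny results in a 403 response
                "Resource": event["methodArn"],
            }],
        },
    }
```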

AWS Shield and AWS Shield Advanced

AWS Shield Standard defends against the most common, frequently occurring network and transport layer DDoS attacks that target your website or applications. All customers benefit from AWS Shield Standard.

AWS Shield Advanced can be added to protect Amazon CloudFront distributions and Amazon Route 53 hosted zones, providing additional protection against DDoS attacks. During a DDoS attack, an instance on its own can only mitigate the attack up to its own throughput capacity.

AWS Shield Advanced provides expanded DDoS attack protection for web applications running on protected resources. AWS Shield Advanced manages the mitigation of Layer 3, Layer 4, and Layer 7 attacks. Additionally, with the appropriate AWS Support level, AWS Shield Advanced gives customers access to the AWS DDoS Response Team.

Attack Surface Reduction

Restricting which source IPs can reach your API is not always possible, but when it is, whitelisting source IPs drastically decreases the attack surface. When a proxy layer such as a CDN sits in front of the API, your API Gateway deployment stops receiving traffic from individual visitor IP addresses and instead receives traffic only from the proxy’s specific origin IP addresses, which are shared by all proxied hostnames.
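
One way to express such a restriction is an API Gateway resource policy that denies requests from any source IP outside an allow list. The sketch below assumes the boto3 SDK and attaches such a policy at API creation time; the CIDR ranges and API name are placeholders.

```python
# Sketch: restricting an API to a set of allowed source IP ranges with a resource policy.
import json
import boto3

apigateway = boto3.client("apigateway")

ALLOWED_CIDRS = ["203.0.113.0/24", "198.51.100.0/24"]   # example ranges only

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Allow invocation in general...
            "Effect": "Allow",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": "execute-api:/*",
        },
        {   # ...but deny any request whose source IP is not in the allow list.
            "Effect": "Deny",
            "Principal": "*",
            "Action": "execute-api:Invoke",
            "Resource": "execute-api:/*",
            "Condition": {"NotIpAddress": {"aws:SourceIp": ALLOWED_CIDRS}},
        },
    ],
}

api = apigateway.create_rest_api(name="restricted-api", policy=json.dumps(policy))
print(api["id"])
```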

Note that security is an ongoing effort. Protecting data in transit and at rest, implementing a strong identity foundation, minimizing the attack surface area, mitigating DDoS attack impacts, implementing inspection and protection techniques, and automating security best practices together enable a defense-in-depth strategy that every organization should implement.

If you like the article, Feel free to share & clap ;)
