Configure forward proxying on Apigee X

Federico Preli
Google Cloud - Community
7 min read · Jan 11, 2024

TL;DR

This article provides a step-by-step guide on how to configure Apigee X to leverage a custom Forward Proxy for outbound traffic either to the Internet or to any VM or service within your private network.

For this article, we'll use an existing Apigee X PAYG organisation, so we won't explain how to get it up and running; if you are interested in the provisioning process, please refer to the official documentation.

Context

Before getting into the technical details of the implementation, let's take a closer look at the Forward Proxy concept and the use cases you might want to implement within your organisation or company using this configuration.

Forward Proxy introduction

Starting with your own home set-up: in a standard Internet communication, when you browse a specific page your computer (for example, “client1”) reaches out directly to the server hosting the page you requested (generally labelled “Internet” in this kind of setup), and that origin server responds directly back to you, the client. When a forward proxy is in place, the client instead sends its requests to the forward proxy, which forwards them to the hosting server. The hosting server then sends its response back to the forward proxy, which simply forwards the response back to the client.

So at this point my question would be:

Why would anyone add this extra layer, or hop, on their route to the Internet?

There are actually a few reasons someone might want to set up and use a forward proxy in their own home set-up:

  • User Privacy: A forward proxy stands between a user’s computer and the public Internet and makes requests on their behalf. This can help protect the privacy of the users behind the proxy and mask their identity.
  • Allow/Deny Policy Enforcement: With a forward proxy, all users’ web traffic flows through the proxy. This allows the proxy to inspect requests and responses and enforce “allow” or “deny” security policies to restrict the permitted traffic.

These are two concrete use cases for a home set-up, but they are just as valid for your own company or cloud infrastructure. For example, as the Security Admin of your organisation you can apply corporate policies to decide which Internet websites and/or services can be reached and which ones you’d like to block.

Moving specifically to enterprise companies, other possible use cases are:

  • Traffic Visibility: All the desired web traffic flows through the forward proxy. This provides insight into how your organisation uses cloud infrastructure, applications and other third-party services, and which Internet service providers your company contacts every day.
  • “Shadow” Company Detection: Devices deployed without company approval are prevented from communicating within and outside of the company network perimeter (from an L7 perspective). A forward proxy can identify these communications and use them to identify unauthorised devices deployed on the corporate network.

Forward Proxy with Apigee X

Building on these considerations, enterprise companies leveraging GCP and Apigee X needed a native integration between Apigee and any custom Forward Proxy that is reachable, from a networking point of view, by their Apigee deployment.

From a technical perspective, the core idea behind any connection to a Forward Proxy is to use an HTTP CONNECT request to establish an HTTP CONNECT tunnel between the client (in our case Apigee) and the proxy. The CONNECT request must specify the target host (so the one you’d like to reach through the Forward Proxy) and port that the client needs to access.

The following steps briefly outline the process.

1) The client sends an HTTP CONNECT request to the proxy server.

2) The proxy server uses the host and port information in the HTTP CONNECT request to establish a TCP connection with the target server.

3) The proxy server returns an HTTP 200 response to the client.

4) The client establishes an HTTP CONNECT tunnel with the proxy server.
After HTTPS traffic arrives at the proxy server, the proxy server transparently transmits HTTPS traffic to the remote target server through the TCP connection. The proxy server only transparently transmits HTTPS traffic and does not decrypt HTTPS traffic.
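For illustration only (the target hostname and port below are just placeholders), the exchange on the wire looks roughly like this:

CONNECT mocktarget.apigee.net:443 HTTP/1.1
Host: mocktarget.apigee.net:443

HTTP/1.1 200 Connection established

[TLS handshake and encrypted HTTPS traffic now flow through the tunnel, untouched by the proxy]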

How can you do that with Apigee? Well, until December 2023 this was possible on Apigee X only with custom JavaScript code, something we don’t want to rely on for a Production environment.
But now, thanks to the latest release (1-11-0-apigee-8), Apigee natively supports this functionality, and with a few Management API calls you can configure your forwarding proxy for outbound traffic.

To do so, you just need to specify the Forward Proxy VM’s (or Load Balancer’s) IP address and port at the Apigee environment level via the Apigee Management API and, from that point forward, all the Apigee API Proxies deployed in that environment will automatically route their outbound target requests through the designated Forward Proxy.

To summarise, from a high-level perspective, here is how the integration should look.

Interested in how to implement this step-by-step? Let’s see how.

Step-by-step guide

This how-to guide will first deploy and configure a very simple Forward Proxy in your own GCP organisation, and then configure Apigee to CONNECT to this proxy and reach a target endpoint.

[Optional] Step One: Deploy a Forward Proxy

If you already have your own Forward Proxy installed and configured to be reached by Apigee, you can directly move to “Step Two”.

1) In a network that can be reached by Apigee, create a Compute Engine VM with the following command (first set the variables, or replace them with literal values for your own environment)

gcloud compute instances create apigee-forward-proxy \
  --project=$PROJECT_ID \
  --zone=$VM_ZONE \
  --machine-type=e2-medium \
  --subnet=$SUBNET \
  --network=$NETWORK \
  --network-tier=PREMIUM \
  --no-restart-on-failure \
  --maintenance-policy=TERMINATE \
  --preemptible \
  --service-account=$PROJECT_NUMBER-compute@developer.gserviceaccount.com \
  --scopes=https://www.googleapis.com/auth/cloud-platform \
  --tags=http-server,https-server,forward \
  --image=projects/debian-cloud/global/images/debian-11-bullseye-v20231212 \
  --image-project=debian-cloud \
  --boot-disk-size=10GB \
  --boot-disk-type=pd-standard \
  --boot-disk-device-name=apigee-forward-proxy \
  --no-shielded-secure-boot \
  --shielded-vtpm \
  --shielded-integrity-monitoring \
  --reservation-affinity=any

2) SSH into the newly created VM
3) Install Squid to act as our Forward Proxy. When prompted for confirmation, enter Y and allow the installation to complete.

sudo apt-get update
sudo apt-get install squid

4) Open the squid.conf file in your preferred text editor (vim in my case)

sudo vim /etc/squid/squid.conf

5) Replace the “http_access deny all” directive with “http_access allow all”
6) Save the new version of the file and exit
7) Restart Squid to apply the new configuration

sudo systemctl restart squid
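If you want a quick sanity check (not part of the official procedure), you can confirm that Squid is running and listening on its default port, 3128:

sudo systemctl status squid
sudo ss -tlnp | grep 3128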

That’s all folks!

The out-of-the-box configuration should allow all the internal traffic (Apigee in our case) to reach our Squid Proxy.
If this is not the case, there are two hints I can share with you to troubleshoot the issue:

  • Check if the Apigee CIDR range is included in the Squid ACLs (Access Control Lists) by editing the /etc/squid/squid.conf file
  • Check if the Squid VM is reachable from Apigee and if a GCP Allow Firewall rule has been created to permit this traffic, similar to the one sketched below
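As a rough sketch (assuming the Squid VM carries the "forward" network tag used in the gcloud command above, and that $APIGEE_CIDR holds the CIDR range of your Apigee instance), such a rule could be created like this:

gcloud compute firewall-rules create allow-apigee-to-squid \
  --project=$PROJECT_ID \
  --network=$NETWORK \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:3128 \
  --source-ranges=$APIGEE_CIDR \
  --target-tags=forward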

Step Two: Configure Apigee to leverage the Forward Proxy

From the Google Cloud Shell

1) Let’s create a simple Apigee Proxy called “apigee-target-proxy” that catches “/iloveapis” calls and simply forwards them to “https://mocktarget.apigee.net/”.
Note: If you have any questions or doubts on how to do so, please refer to the official documentation
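For reference, the Target Endpoint of such a pass-through proxy could look roughly like this (a minimal sketch; “default” is just the name the proxy creation wizard typically generates):

<TargetEndpoint name="default">
  <HTTPTargetConnection>
    <URL>https://mocktarget.apigee.net/</URL>
  </HTTPTargetConnection>
</TargetEndpoint>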

2) Let’s now get the private IP address of the “apigee-forward-proxy” instance we’ve just created (or of your own Forward Proxy), either by looking at the “Compute Engine” section of the Google Cloud Console or with the following gcloud command

gcloud compute instances describe apigee-forward-proxy --zone $VM_ZONE | grep networkIP
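If you prefer to capture just the raw IP value (for example, to store it in the variable used later on), something like this should also work:

FORWARD_PROXY_IP=$(gcloud compute instances describe apigee-forward-proxy \
  --zone $VM_ZONE \
  --format='get(networkInterfaces[0].networkIP)')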

Note that, for anything more than a simple proof of concept, you will likely want to use a Managed Instance Group (or a GKE cluster) for your Forward Proxy to achieve High Availability. In that case, you will reach the Forward Proxy through the IP address of an internal load balancer instead.

3) Use the following gcloud command to get the auth token needed to call the Apigee Management API and update the environment.
Be sure the user running the command has the necessary Apigee permissions (e.g. the Apigee Organization Admin role)

gcloud auth print-access-token

4) Update the Apigee environment where the API Proxy that will leverage the Forward Proxy is deployed, with the following API call, specifying the Forward Proxy IP (retrieved a few steps before) and port (3128 is Squid’s default port)

METHOD: "PUT"
URL: "https://apigee.googleapis.com/v1/organizations/$PROJECT_ID/environments/$ENV"
BODY: { "forwardProxyUri": "http://$FORWARD_PROXY_IP:3128" }
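As a sketch, with curl this could look like the following (assuming $PROJECT_ID, $ENV and $FORWARD_PROXY_IP are set; depending on the API version, the environment update may expect other existing fields of the environment object as well, so double-check against the official updateEnvironment reference):

curl -X PUT \
  "https://apigee.googleapis.com/v1/organizations/$PROJECT_ID/environments/$ENV" \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{ "name": "'"$ENV"'", "forwardProxyUri": "http://'"$FORWARD_PROXY_IP"':3128" }'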

Note: Once set, the Forward Proxy configuration will be used by any API Proxy deployed in that environment. If this is something you don’t want, you just need to specify the following property as part of the <Properties> tag of the API Proxy’s Target Endpoint configuration.

<Property name="use.proxy">false</Property>
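In context, that property sits inside the <HTTPTargetConnection> of the Target Endpoint, roughly like this (a sketch based on the standard Target Endpoint structure):

<TargetEndpoint name="default">
  <HTTPTargetConnection>
    <Properties>
      <Property name="use.proxy">false</Property>
    </Properties>
    <URL>https://mocktarget.apigee.net/</URL>
  </HTTPTargetConnection>
</TargetEndpoint>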

Tests

Last but not least, you can perform an “/iloveapis” API call towards Apigee and see the request flowing through the Forward Proxy to “https://mocktarget.apigee.net/”.

How can you do that? One possible way I would recommend is to tail the Squid access log at /var/log/squid/access.log on the Forward Proxy instance you created earlier.
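Something along these lines should work ($APIGEE_HOSTNAME is a placeholder for whatever hostname your Apigee environment group exposes):

# from any client that can reach Apigee; $APIGEE_HOSTNAME is a placeholder
curl -v "https://$APIGEE_HOSTNAME/iloveapis"

# on the Forward Proxy VM, watch the CONNECT requests arriving from Apigee
sudo tail -f /var/log/squid/access.log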

Conclusion

The Apigee-Forward Proxy integration is a really powerful tool. You can rely on this pattern to configure the outbound traffic of your organisation whenever you want to centrally enforce security policies.

Before moving forward with a production deployment, be sure all the features you are expecting from this integration are actually supported; check the official documentation for the latest details and known limitations.
