Building a CI/CD pipeline for your Apigee APIs

Emanuelburgess
9 min read · Feb 28, 2024


Apigee is a platform for developing, securing, and managing APIs. APIs are digital products and should be treated as such. In today’s digital landscape, applications depend on APIs to provide seamless digital experiences. Using a pipeline, you can design, develop, and test your APIs in an automated fashion, which enforces standardization and consistency and reduces manual errors. In this hands-on tutorial we will:

  • Create an eval organization.
  • Set up a Git repository.
  • Copy sample code into a repository.
  • Enable the Cloud Build API.
  • Create a trigger.
  • Create a service account with appropriate permissions.
  • Set your environment variables.
  • Create a commit and push it to any branch.
  • Monitor Cloud Build for results of the build.

The benefits of using CI/CD to build your APIs

Manual steps in your software delivery process lead to inconsistent delivery of software to your customers. By automating your delivery process the right way, you can create a seamless experience for your organization. APIs are products, and they should be treated like any other software product: they let you create digital offerings for your customers by defining and documenting structured access to your business logic. Because APIs are software that we build and deploy, we can take advantage of CI/CD pipelines to automate our software delivery process.

Some of the benefits of CI/CD are:

Increased velocity — Automating your software delivery process allows you to speed up delivery by removing manual steps.

Increased software quality — Using CI/CD allows you to bake quality checks and tests right into your delivery process.

Immediate feedback — If your software does not meet the criteria for various tests, it fails and you get immediate feedback, which also helps speed up delivery.

Increased collaboration — Pull requests require approval from other team members, which encourages collaboration and feedback.

Reduction in cost — Automation leads to a reduction in delivery time, which saves your company money.

CI/CD Lifecycle Tooling

There are various tools that you can use to automate your SDLC. These tools are used to implement different testing and deployment steps. Some common pipeline steps are:

  1. Static code analysis: Static code analysis allows you to test code without executing it. Regular static analysis helps improve the quality of your code by detecting potential errors and security vulnerabilities, and it promotes standardization through consistency. In our example we’ll be using ESLint.
  2. Unit testing: Unit tests validate API functionality and logic by focusing on individual units of code, helping to ensure that each unit works as expected. Incorporating unit tests in your pipeline will help you detect bugs sooner. This example uses Mocha for unit testing.
  3. Code coverage: Code coverage measures how much code is being covered by your unit tests. A high code coverage percentage indicates that more of your code is being tested, which can help to reduce the risk of errors. You can set a threshold within your configuration that will enforce that your code coverage must reach a certain percentage or “grade” in order to pass to the next step in your pipeline.
  4. Deploy: This step packages and configures your code, imports it into Apigee, and deploys it to an Apigee environment. This document uses the Apigee Maven deploy plugin as the deployment tool.
  5. Integration testing: After our API proxy passes all the above tests and is deployed, we then need to test our APIs. We can use integration tests to ensure that our API is behaving as expected. Creating various tests ensures that we have consistency. This document will use Apickli for integration testing.
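The five stages above map naturally onto Cloud Build steps. A minimal sketch of what such a cloudbuild.yaml could look like is below; the container images and npm script names are illustrative assumptions, not the sample repo's actual file:

```shell
# Sketch of a cloudbuild.yaml covering the five pipeline stages.
# Images and script names are assumptions -- adapt to your project.
cat > cloudbuild.yaml <<'EOF'
steps:
  # 1. Static code analysis with ESLint
  - name: 'node'
    entrypoint: 'npm'
    args: ['run', 'lint']
  # 2-3. Unit tests with Mocha, with code coverage enforced
  - name: 'node'
    entrypoint: 'npm'
    args: ['run', 'test']
  # 4. Deploy to Apigee with the Maven deploy plugin
  - name: 'maven:3-openjdk-11'
    entrypoint: 'mvn'
    args: ['install', '-Pgoogleapi', '-Dorg=$PROJECT_ID', '-Denv=eval']
  # 5. Integration tests with Apickli
  - name: 'node'
    entrypoint: 'npm'
    args: ['run', 'integration-test']
EOF
```

Each step runs in its own container, and the build fails fast as soon as one step fails, which is what gives you the immediate feedback described above.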

Architecture

  1. The developer creates the proxy configuration, tests it locally, and then pushes a commit to a specific branch:

In the proposed solution outlined in this document, we are using Cloud Source Repositories for our source code repository (SCR). The SCR holds all of our code and configuration files.

2. The push triggers a build using the cloudbuild.yaml (includes testing, validation and deployment steps):

A trigger can be built either on the console or via code to look for a change when code is committed to a specific branch. The build server will use a configured .yaml file for the build.

3. Cloud Build then uses a service account with appropriate permissions to deploy the proxy:

Cloud Build is our build tool that will take the .yaml file, parse it and perform all the requested steps.

Apigee Evaluation Organizations

In order to build and run the sample CI/CD pipeline, we need some infrastructure to deploy to. An Apigee eval org is a temporary organization for testing Apigee without an initial long-term commitment, so you can make sure it fits the needs of your organization. You should already have an eval org provisioned and set up to allow external traffic through a load balancer with a certificate installed. If not, you can set it up using the Apigee provisioning tool.
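If you still need to provision an eval org, the gcloud CLI offers an alternative to the provisioning wizard. A minimal sketch, assuming a VPC network named default exists in your project:

```shell
# Provision an Apigee eval organization in the current project.
# The network name is an assumption; use your own authorized VPC.
gcloud alpha apigee organizations provision \
  --authorized-network=default
```

Provisioning can take a while to complete, and you still need a load balancer and certificate for external traffic afterwards.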

Environments

Apigee uses the concept of environments to provide isolated spaces where you can create, deploy, and test your proxies. Environments enable your organization to have different stages in your software development life cycle (SDLC). When implementing CI/CD for your API proxies, consider mapping each environment to a branch in your repository: in Git, branches can represent environments. Defining your environment endpoints in the configuration on each branch ensures that when your code is promoted, it gets deployed to the targeted environment. In this solution we are deploying to just one environment; in a real-world setup with multiple SDLC stages, you would configure multiple environments and environment types.

Proxies

A proxy is the mechanism in Apigee that allows you to deploy your APIs. Proxies decouple your backend from your API logic. Proxies have two types of endpoints:

  • ProxyEndpoint: Defines the way front-facing apps consume your APIs. You configure the ProxyEndpoint to define the URL of your API proxy. You can attach policies to the ProxyEndpoint to enforce security, quota checks, and other types of access control and rate-limiting.
  • TargetEndpoint: Defines the way the API proxy interacts with your backend services. You configure the TargetEndpoint to forward requests to the proper backend service.
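To make the two endpoint types concrete, here is a minimal sketch of the XML files that define them in a typical proxy bundle. The base path, target URL, and file layout are illustrative, not taken from the sample repo:

```shell
# Sketch of a proxy bundle's endpoint definitions.
# Paths and URLs are placeholders -- replace with your own.
mkdir -p apiproxy/proxies apiproxy/targets

cat > apiproxy/proxies/default.xml <<'EOF'
<ProxyEndpoint name="default">
  <!-- The URL path under which apps call this proxy -->
  <HTTPProxyConnection>
    <BasePath>/v1/hello</BasePath>
  </HTTPProxyConnection>
  <RouteRule name="default">
    <TargetEndpoint>default</TargetEndpoint>
  </RouteRule>
</ProxyEndpoint>
EOF

cat > apiproxy/targets/default.xml <<'EOF'
<TargetEndpoint name="default">
  <!-- The backend service the proxy forwards requests to -->
  <HTTPTargetConnection>
    <URL>https://backend.example.com</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
EOF
```

Policies for security, quotas, and rate limiting would be attached to the ProxyEndpoint flow; the RouteRule is what connects the two endpoint types.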

Setting up Cloud Shell

We will use Cloud Shell as our development environment. In Cloud Shell, select your Apigee-related project. The project should have the same name as your Apigee evaluation organization. You can select your project from the drop-down menu or use the following command to set it as the default.
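A minimal version of that command, with a placeholder project ID:

```shell
# Make your Apigee eval project the default for subsequent gcloud commands.
# The project ID here is a placeholder -- replace it with your own.
gcloud config set project my-apigee-eval-project
```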

Creating a service account

A service account is a special kind of account typically used by an application or compute workload. Service accounts allow applications within Google Cloud to authenticate to each other without manual configuration of credentials. Each service account gets a unique identifier, such as an email address, used to reference it when needed. To build the referenced pipeline, Cloud Build needs permission to talk to Apigee. In Cloud Shell, create a service account with the following roles:
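A sketch of the service account setup is below. The account name and the role are assumptions; grant the narrowest set of roles your pipeline actually needs:

```shell
# Create a service account for Cloud Build to deploy to Apigee.
# The account name is illustrative.
gcloud iam service-accounts create apigee-cicd \
  --display-name="Apigee CI/CD deployer"

# Grant it permission to manage Apigee deployments.
# roles/apigee.environmentAdmin is an assumption -- adjust as needed.
PROJECT_ID="$(gcloud config get-value project)"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:apigee-cicd@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/apigee.environmentAdmin"
```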

Setting variables

Setting the following variables will help with configuration, including the externally reachable hostname of your Apigee environment for the integration test.
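A sketch of the variables involved; all values below are placeholders to replace with your own:

```shell
# Placeholder values -- replace with your own project and hostname.
export PROJECT_ID="my-apigee-eval-project"
export APIGEE_ORG="$PROJECT_ID"            # eval orgs share the project ID
export APIGEE_ENV="eval"                   # target environment name
export APIGEE_HOSTNAME="api.example.com"   # externally reachable hostname
```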

Creating a Google Cloud Source Repository

Cloud Source Repositories will serve as our source code repository, holding all of your Apigee-related code. Cloud Source Repositories lets you privately host, track, and manage changes to your codebase, all in Google Cloud. There are alternatives such as GitHub, GitLab, or Bitbucket, but Cloud Source Repositories allows for tighter integration with Google Cloud Build.

  1. Open up Cloud Shell and type the following command:
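A minimal version of that command, using the repo name from this tutorial:

```shell
# Create a new Cloud Source Repository in the current project.
gcloud source repos create apigee-cicd-demo
```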

This will create a repo for you called apigee-cicd-demo in Cloud Source Repositories.

2. Clone the repo into your Cloud Shell session:
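A minimal version of that command:

```shell
# Clone the (still empty) repository into the current directory.
gcloud source repos clone apigee-cicd-demo
cd apigee-cicd-demo
```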

NOTE: If you get an authentication error you can click “How to set up” and choose an authentication type.

Getting sample code

Apigee’s GitHub repo is where the sample code is located. This repo holds all the source code and example configuration for running the CI/CD pipeline. The easiest way to get this code into Cloud Source Repositories is to clone it in Cloud Shell and copy it over to your own source code repo.
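A sketch of that clone-and-copy step. The repository URL and path below assume the sample lives in Apigee's DevRel repository; adjust them to whichever sample you are following:

```shell
# Clone Apigee's DevRel samples and copy the CI/CD reference into our repo.
# The URL and path are assumptions -- use the sample this article links to.
git clone https://github.com/apigee/devrel.git
cp -r devrel/references/cicd-pipeline/* apigee-cicd-demo/
```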

Creating a trigger

A trigger allows Cloud Build to monitor our Cloud Source Repository for commits on any branch. Once a commit is pushed, Cloud Build automatically grabs the cloudbuild.yaml file from our repo and starts the build process. When Cloud Build gets to the deploy phase with Maven, it invokes our service account and deploys to our targeted Apigee environment. To create a trigger, enter the following command in Cloud Shell.
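A minimal version of that command, assuming the repo name used earlier in this tutorial:

```shell
# Create a trigger that watches every branch of the repo and runs
# the build defined in cloudbuild.yaml.
gcloud builds triggers create cloud-source-repositories \
  --repo=apigee-cicd-demo \
  --branch-pattern=".*" \
  --build-config=cloudbuild.yaml
```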

Notice the build-config argument points to the build file that we want Cloud Build to use.

Invoke your pipeline

Now we can create a commit just by adding any file (even an empty one) to the repo. Once a commit is pushed to the apigee-cicd-demo repository, your pipeline will automatically kick off.
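For example, from inside the cloned repo (the file and branch names here are arbitrary; the trigger watches every branch):

```shell
# Any commit will do -- an empty file is enough to fire the trigger.
touch trigger-me.txt
git add trigger-me.txt
git commit -m "Kick off the CI/CD pipeline"
git push origin main
```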

Cloud Build

Cloud Build is a serverless build tool in Google Cloud that supports multiple languages. The git push command triggers the pipeline to run in Cloud Build.
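Besides watching the Cloud Build page in the console, you can monitor the build from Cloud Shell:

```shell
# List recent builds to check the status of the pipeline run.
gcloud builds list --limit=5

# Stream the logs of a specific build (replace BUILD_ID with an
# ID from the list above).
gcloud builds log BUILD_ID --stream
```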

CI/CD best practices

Ideally, your organization should build a different pipeline for each environment. The closer you get to your production environment, the more your testing methods and deployment targets should change. For example, if you are building a pipeline in a lower environment such as dev, your environment config and deployment should map to your dev infrastructure in Apigee, and code should be stored on the dev branch of your Git repo. Your integration tests should use a mock backend, because at this stage the actual backend may not exist yet; mock backends are also great for development and cost purposes.

The closer you get to production, the more you should test against a live backend. In a release pipeline it’s ideal to run functional tests against a backend that closely resembles what an actual customer would access in the real world, with the tests configured to reflect production-like functionality. After the artifact is pushed to a repo, code should be promoted by merging into the Main/Master branch.

In a production pipeline end to end and functional scenario based testing should be used and the proxy should be deployed to your production environment.

Integrating CI/CD with Apigee allows you to create better digital experiences for your customers. If you are ready to get started or want to read up more on Apigee, you can learn more about it here.
