Multiple Artifacts in AWS CodePipeline and CodeBuild

Yubi
Published in VivritiEngineering
Aug 23, 2018 · 8 min read

We wanted to automate our build and deploy process and move to CI/CD. Since we run our infrastructure on AWS, the natural choice was AWS CodePipeline and AWS CodeBuild. We started with a single module: a web application archive used as a background job, run on Elastic Beanstalk. The application had more than one jar file as a dependency. Unfortunately, if you use AWS CodeBuild, AWS CodePipeline allows just one artifact as input to the next step, which in our opinion is a serious limitation. We considered a few different ways to solve this problem:

  • Not using CodeBuild — Jenkins was an option, but it required additional setup
  • Running a local private Maven repository or Artifactory — again, this involved setup effort we did not want to invest in
  • Storing the output jar files in S3 and using commands in the buildspec file to download the dependencies into the local repository before using them in subsequent steps of the pipeline

We decided on the third approach since it seemed the easiest of the three, and because we expect AWS CodeBuild to catch up soon and allow multiple artifacts as input to subsequent steps. Nevertheless, we had to cross several hurdles to get it working. This blog outlines the process we followed.

Setting up the Pipeline

Source

  1. Log into your AWS account
  2. Choose the AWS CodePipeline service
  3. Start creating a new pipeline by clicking Get Started or Create New Pipeline
  4. Provide a name for your pipeline — ours was called vivsecservice and click Next
  5. Choose your source provider. We initially used AWS CodeCommit, but we could never get cross-account access working, so we moved our code to GitHub. This blog explains the setup using GitHub, but it can just as easily be done with AWS CodeCommit
  6. Once you choose GitHub, you will be asked to connect to it. Click Connect to GitHub, provide your login details, and connect. You will then be able to choose from your repositories; pick the one containing the source code that will be built into the .war file, and choose the branch you want to build from. In our case, we chose the repository vivsecservice and the branch master, since this setup is for a production build. Click Next to proceed to the next step.

Build

The next step is to set up the build process.

  1. Choose AWS CodeBuild as the build provider (the whole purpose of this blog)
  2. Choose the create a new build project option (Important)
  3. Provide a name. We will use vivsecservice
  4. Set up the build environment. We will use Java 8 on an Ubuntu machine

5. Choose the No Cache option — This can be set up later if needed

6. Choose Create a service role in your account. You can leave the pre-populated role name as is, but do note it down since you will need it later. In our case, the new role that will be created is code-build-vivsecservice-service-role

7. Save the build project. Once the project is saved, you can click the View Project Details link.

8. This will lead you to the AWS CodeBuild project page. Click on Edit Project, go to the Environment: How to build section, and click on Update build specification. Simply enter the build specification file name. In our case, the build specification file name is buildspec-prod.yml.

9. Leave the rest of the data as is. Scroll down to the bottom and click Update

10. Close the AWS CodeBuild window. Go back to the AWS CodePipeline window and click Next

Deploy

  1. The next screen is the Deploy screen. We will deploy to AWS Elastic Beanstalk.

2. Provide the application and environment you would like to deploy the output war file to. In our case, we are using an application called securitization-cashflowgen and the environment securitization-cashflowgen-prod1.

3. Click Next. Click the Create role button. Use the default provided options. Click on the Allow button to create a new role that will be used. Going forward, for new pipelines, you can reuse this role. In our case a new role called AWS-CodePipeline-Service is created.

4. Click Next Step to proceed to the Review page. Review the details and click the Create Pipeline button. The pipeline is now created, and you will land on its pipeline page.

Editing the Pipeline to suit your needs

Source

  1. As outlined before, we have multiple jars that will be built as part of this process and our eventual web application (vivsecservice) depends on these jars. So, now, we will modify the pipeline to address this case.
  2. We will take a small subset of the problem to illustrate how to achieve what we set out to do. We have two libraries that we need to build, vivcommon and awsutils. awsutils build depends on the output of vivcommon build namely, vivcommon-1.0.0.jar.
  3. First, we set up the build for vivcommon. Click on the Edit button to edit the pipeline. Next, click on the Edit button to edit the stage. Remove the existing Source box (since we can’t rename the stage and we want to give it a good name, we are going to delete this action entirely and rebuild) by clicking on the x. Click on + to create a new action.
  4. Choose Source for the action category and fill in the rest of the details. We will call our Action name vivcommonsrc; the Source Provider will be GitHub, the repository vivriticapital/vivcommon, and the branch master. (Note that you will need to click on the Connect to GitHub button to connect to GitHub.) Scroll down to the bottom and provide a name for output artifact #1. We will call our output artifact vivcommonsrc.

5. Similarly, add the other source actions you will need. In our case, we will just add the awsutils source action.

Build

  1. Next, we will set up the build step for vivcommon. Edit the build stage and remove the originally created action named CodeBuild. Click on + button to add a new action.

2. Set Action Category to Build. Choose AWS CodeBuild as the Build Provider. The action name is vivcommonbuild. Under Configure your project, choose Create a new build project. You can provide a project name of your choice; we will use vivcommonbuild. Set the environment details as necessary. We will set it as Ubuntu, Java and JDK 8. Choose Create a service role in your account and leave the role name as the default value. The input artifact needs to be the output artifact of the Source step. Fill in the details of the input and output artifacts. We will use vivcommonsrc and vivcommonout.

3. Next, save the Build project. A build project is created and a link to the new build project will be provided. Click on the View project details link. On the AWS CodeBuild page, click on the Edit Project button.

4. On this page, under the Environment: How to build section, click the Update build specification link. Enter a buildspec file name. Here we will use a custom buildspec file called buildspec-prod.yml. The contents of the build spec file are provided below.
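The buildspec was originally shown as an image. A minimal sketch of what buildspec-prod.yml does for vivcommon — build the jar, then push it to S3 in the post_build phase — looks like this (the bucket name vivriti-build-artifacts, Maven coordinates, and version are illustrative, not the actual values):

```yaml
version: 0.2

phases:
  build:
    commands:
      # Build and install the library jar with Maven
      - mvn clean install
  post_build:
    commands:
      # Upload the built jar to S3 so downstream builds can fetch it
      - aws s3 cp target/vivcommon-1.0.0.jar s3://vivriti-build-artifacts/vivcommon/vivcommon-1.0.0.jar

artifacts:
  files:
    - target/vivcommon-1.0.0.jar
```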

5. Essentially, in the post build command, we will upload the generated jar file to S3. So, this uploaded jar file plays the role of being the second (or nth) artifact of subsequent steps in the pipeline.

6. As can be seen above, the S3 destination bucket is provided. Since we will have different bucket names for different environments (e.g. dev, staging and production), we use different buildspec files — hence the need for buildspec-prod and buildspec-otherenv files.

7. Now, the output jar file is uploaded to S3 and subsequent steps can download the same and use it as input for their builds

8. When the build runs, the build process role is code-build-vivcommonbuild-service-role. And this role needs to have access to S3 since this needs to be able to upload (and in some cases, download) artifacts from S3. Hence, we will need to provide permissions.

9. Hop over to the IAM service. Look for the role we are using for the build. In this case, look for code-build-vivcommonbuild-service-role in IAM.

10. Attach the AmazonS3FullAccess policy to this role. Note that you can make the access more restrictive by creating a custom policy that grants access to only the particular CodeBuild bucket and the artifacts within it
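For the more restrictive option, a sketch of such a custom policy might look like the following (the bucket name vivriti-build-artifacts is illustrative):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::vivriti-build-artifacts/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::vivriti-build-artifacts"
    }
  ]
}
```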

11. Click on Add Action to finish creating the action for building the first artifact. We are now done with the build for the vivcommon library.

12. Next, we will create the build step for our second library, vivawsutils. vivawsutils depends on two artifacts: its source, vivawsutilssrc, and the vivcommonout artifact. This is exactly the limitation that AWS CodePipeline/CodeBuild has.

13. The buildspec for vivawsutils looks as shown below.
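The buildspec was originally shown as an image. A sketch of what it does — pull the dependency jar from S3 in the pre_build phase and install it into the local Maven repository before building — is below (the bucket name, groupId, and versions are illustrative):

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # Download the dependency jar from S3 and install it into the
      # local Maven repository, where Maven would have cached it
      - aws s3 cp s3://vivriti-build-artifacts/vivcommon/vivcommon-1.0.0.jar /tmp/vivcommon-1.0.0.jar
      - mvn install:install-file -Dfile=/tmp/vivcommon-1.0.0.jar -DgroupId=in.vivriti -DartifactId=vivcommon -Dversion=1.0.0 -Dpackaging=jar
  build:
    commands:
      # The build now resolves vivcommon from the local repository cache
      - mvn clean install

artifacts:
  files:
    - target/vivawsutils-1.0.0.jar
```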

14. As part of the pre-build step, we download the dependency jar from S3 into the local repository, just as Maven would have downloaded and cached it. The build for vivawsutils then picks up this jar (a proxy artifact) from the Maven repository cache, along with its source input artifact, and builds.

15. The build setup for vivawsutils works the same way as before and is set up exactly as we did for vivcommon, except that vivawsutilssrc is used as the input artifact.

16. A partial view of the vivawsutils build setup and the resultant pipeline is shown in the next couple of images.

Conclusion

Our entire pipeline after the setup is shown below.
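The pipeline was originally shown as an image. In outline, it has the following stages and actions (action names beyond those given in this post are illustrative):

```
Source stage
  vivcommonsrc   — GitHub vivriticapital/vivcommon, branch master → artifact vivcommonsrc
  awsutils src   — GitHub, branch master                          → artifact vivawsutilssrc
  vivsecservice  — GitHub, branch master                          → source artifact

Build stage
  vivcommonbuild — CodeBuild; input vivcommonsrc, output vivcommonout;
                   post_build uploads vivcommon jar to S3
  vivawsutils    — CodeBuild; input vivawsutilssrc;
                   pre_build downloads the vivcommon jar from S3
  vivsecservice  — CodeBuild; pre_build downloads dependency jars from S3,
                   builds the .war

Deploy stage
  Elastic Beanstalk — application securitization-cashflowgen,
                      environment securitization-cashflowgen-prod1
```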
