How To Invoke AWS Lambda in CodePipeline

Coder Shunshun
3 min read · Mar 2, 2018

AWS Lambda is simple and powerful: you write code, and it runs. There is no need to worry about where the code runs, and you can use Lambda for almost any task. If you have never used AWS Lambda, you can get started with it today. Check out the AWS Lambda docs here.

You may have heard of Continuous Integration and Continuous Delivery if you have been involved in deploying applications or services. AWS CodePipeline is a service that helps you achieve both. Since AWS Lambda is powerful and lets you do almost any task, integrating it with your CodePipeline can make your continuous integration and delivery process more adaptive and dynamic. In this article, I am going to talk about how to invoke AWS Lambda from CodePipeline.

Imagine you have two CodePipelines, one for non-production and one for production deployment, and you create them with CloudFormation. Each pipeline uses artifacts in an S3 bucket as its source, and whenever the artifacts change, the pipeline is triggered.

Now you need to perform the following deployment tasks in the non-production CodePipeline:

1. Pull your application source artifact from S3

2. Test the source code

3. Deploy the application to the non-production environment

4. Validate the deployment

5. Trigger the production CodePipeline to deploy the application to production

I assume you already have steps 1–4 done; now you may wonder how to trigger the production CodePipeline to initiate the production deployment.

OK, let's do the job with AWS Lambda. To simplify the task: what we need to do here is copy the tested and validated source artifact into the production S3 bucket. Remember that we can use Lambda for almost any task? Let's write a Lambda function called copy_artifact_to_prod_lambda_function to do it.

const AWS = require('aws-sdk')
const s3 = new AWS.S3()

// PROD_BUCKETS is a comma-separated list of production bucket names,
// e.g. "prod-bucket-1,prod-bucket-2"
const prodBuckets = process.env.PROD_BUCKETS.split(',')

exports.copyRepoToProdS3 = (event, context) => {
  let promises = []
  for (let prodBucket of prodBuckets) {
    let params = {
      Bucket: prodBucket,
      CopySource: '<source-bucket>/<source-object-key>',
      Key: '<destination-object-key>'
    }
    promises.push(s3.copyObject(params).promise())
  }
  return Promise.all(promises)
    .then((data) => {
      console.log('Successfully copied repo to buckets!')
    }).catch((error) => {
      console.log('Failed to copy repo to buckets!', error)
    })
}

I used the s3.copyObject function from the JavaScript AWS SDK to copy the source object from the original S3 bucket to a new destination bucket. Note that CopySource takes the form &lt;source-bucket&gt;/&lt;source-object-key&gt;. Now we need to tell the Lambda the source object key and the destination object key (that is, fill out the placeholders above).

How do we do that? We have all of this information in the CodePipeline, so we need to invoke the Lambda with it.

Create the following CodePipeline Stage in your CloudFormation template:

- Name: TriggerProdDeploy
  Actions:
    - Name: InvokeLambda
      ActionTypeId:
        Category: Invoke
        Owner: AWS
        Provider: Lambda
        Version: '1'
      InputArtifacts:
        - Name: artifact_source
      Configuration:
        FunctionName: copy_artifact_to_prod_lambda_function
        UserParameters: !Sub |
          {
            "S3ObjectKey":"${artifact_name}.zip"
          }
      RunOrder: '1'

When this stage gets triggered, it will invoke the Lambda with a CodePipeline job event like the following:

{
  "CodePipeline.job": {
    "id": "11111111-abcd-1111-abcd-111111abcdef",
    "accountId": "111111111111",
    "data": {
      "actionConfiguration": {
        "configuration": {
          "FunctionName": "copy_artifact_to_prod_lambda_function",
          "UserParameters": "{\"S3ObjectKey\": \"artifact_name.zip\"}"
        }
      },
      "inputArtifacts": [
        {
          "location": {
            "type": "S3",
            "s3Location": {
              "objectKey": "artifact_name/D1POYh1.zip",
              "bucketName": "you_codepipeline"
            }
          },
          "revision": "IfteBl9KXooTJR7ZCHGP3H5Tx5390M9t",
          "name": "artifact_source"
        }
      ],
      "outputArtifacts": [],
      "artifactCredentials": {
        "secretAccessKey": "secretAccessKey",
        "sessionToken": "sessionToken",
        "accessKeyId": "accessKeyId"
      }
    }
  }
}

Your Lambda can then read this JSON event to get the source object key and the destination object key, like this:

exports.copyRepoToProdS3 = (event, context) => {
  // The job ID is needed later to report success or failure back to CodePipeline
  const jobId = event['CodePipeline.job'].id
  // Location of the input artifact in S3 (source bucket and object key)
  const s3Location = event['CodePipeline.job'].data.inputArtifacts[0].location.s3Location
  // UserParameters from the pipeline stage, containing the destination object key
  const cpParams = JSON.parse(event['CodePipeline.job'].data.actionConfiguration.configuration.UserParameters)
  let promises = []
  for (let bucket of prodBuckets) {
    let params = {
      Bucket: bucket,
      CopySource: s3Location['bucketName'] + '/' + s3Location['objectKey'],
      Key: cpParams['S3ObjectKey']
    }
    promises.push(s3.copyObject(params).promise())
  }
  return Promise.all(promises)
    .then((data) => {
      console.log('Successfully copied repo to buckets!')
    }).catch((error) => {
      console.log('Failed to copy repo to buckets!', error)
    })
}

The last step is notifying your CodePipeline of the Lambda execution result. You can call codepipeline.putJobSuccessResult and codepipeline.putJobFailureResult to achieve this; you can click the links to see the examples.
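To give you an idea of how the pieces fit together, here is a minimal sketch of the finished handler with result reporting wired in. This is my own assumption of the final shape, not the exact code from the linked examples, and it assumes the Lambda's execution role is allowed to call the two CodePipeline actions:

const AWS = require('aws-sdk')
const s3 = new AWS.S3()
const codepipeline = new AWS.CodePipeline()
const prodBuckets = process.env.PROD_BUCKETS.split(',')

exports.copyRepoToProdS3 = (event, context) => {
  const jobId = event['CodePipeline.job'].id
  const s3Location = event['CodePipeline.job'].data.inputArtifacts[0].location.s3Location
  const cpParams = JSON.parse(event['CodePipeline.job'].data.actionConfiguration.configuration.UserParameters)

  const promises = prodBuckets.map((bucket) => s3.copyObject({
    Bucket: bucket,
    CopySource: s3Location['bucketName'] + '/' + s3Location['objectKey'],
    Key: cpParams['S3ObjectKey']
  }).promise())

  return Promise.all(promises)
    .then(() => {
      console.log('Successfully copied repo to buckets!')
      // Tell CodePipeline the action succeeded so the stage can continue
      return codepipeline.putJobSuccessResult({ jobId: jobId }).promise()
    })
    .catch((error) => {
      console.log('Failed to copy repo to buckets!', error)
      // Tell CodePipeline the action failed so the pipeline surfaces the error
      return codepipeline.putJobFailureResult({
        jobId: jobId,
        failureDetails: {
          type: 'JobFailed',
          message: String(error),
          externalExecutionId: context.awsRequestId
        }
      }).promise()
    })
}

If you never report a result, the Invoke action will sit in progress until CodePipeline times it out, so it is worth always hitting one of the two calls.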

Now your CodePipeline should be able to invoke the Lambda to copy the source code to the production S3 bucket. One thing I forgot to mention above: you need to give your CodePipeline the right privileges to invoke the Lambda. These can be managed with IAM policies. I won't go into IAM policies in this article; if you want to know more about them, please check the AWS docs or ask me questions. A rough sketch of what they might look like follows below.
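As a rough starting point only (the exact statements depend on your setup), the CloudFormation for the two policies might look like this. PipelineRole and LambdaRole are hypothetical logical names for the pipeline's role and the Lambda's execution role:

# A minimal sketch; adjust names, resources, and scope to your own template.
PipelineInvokeLambdaPolicy:
  Type: AWS::IAM::Policy
  Properties:
    PolicyName: invoke-copy-artifact-lambda
    Roles:
      - !Ref PipelineRole
    PolicyDocument:
      Version: '2012-10-17'
      Statement:
        # Let the pipeline's Invoke action call the Lambda
        - Effect: Allow
          Action: lambda:InvokeFunction
          Resource: '*'

LambdaJobResultPolicy:
  Type: AWS::IAM::Policy
  Properties:
    PolicyName: report-codepipeline-job-result
    Roles:
      - !Ref LambdaRole
    PolicyDocument:
      Version: '2012-10-17'
      Statement:
        # Let the Lambda report success/failure back to CodePipeline
        - Effect: Allow
          Action:
            - codepipeline:PutJobSuccessResult
            - codepipeline:PutJobFailureResult
          Resource: '*'

The Lambda's role also needs s3:GetObject on the source bucket and s3:PutObject on the production buckets for the copy itself.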

Hope this article helps.
