Streamlining GitHub Actions Logs: Capture and Store Logs in AWS S3 for Seamless CI/CD Workflows

Dharunash A S
BI3 Technologies
May 13, 2024

Imagine streamlining your Git workflows by effortlessly capturing and managing logs directly within your AWS ecosystem. With GitHub Actions and AWS S3, this becomes a reality.

This blog will guide you through harnessing GitHub Actions to capture workflow logs and store them in an AWS S3 bucket.

You’ll find step-by-step instructions and best practices for automating log management and enhancing your CI/CD workflows. GitHub Actions offers a versatile, integrated approach that lets you centralize and optimize your logging process with ease.

Let’s explore how to leverage GitHub Actions and AWS S3 to streamline your development workflow and gain greater visibility and efficiency.

Step 1: Create a Role in AWS Using Custom Trust Policy

  1. Sign in to the AWS Management Console: Go to the AWS Management Console and navigate to the IAM service.
  2. Create a New Role: Click on “Roles” in the left navigation pane, then click on “Create role”.
  3. Select “Custom trust policy” to establish a trust relationship between GitHub and AWS.
  4. Use the JSON script below to create the custom trust policy; replace ************ with your AWS account ID.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": "arn:aws:iam::************:oidc-provider/token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                },
                "StringLike": {
                    "token.actions.githubusercontent.com:sub": "repo:BI3-Technologies-Pty-Ltd/aws_cdk:*"
                }
            }
        }
    ]
}

  5. Add write permissions for S3, specify a name for the role, and click “Create role”.
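If you prefer to generate the trust policy programmatically rather than pasting JSON into the console, the document above can be built with a few lines of Python. This is a minimal sketch; the `build_trust_policy` helper, the sample account ID, and the sample repo name are placeholders you would replace with your own values:

```python
import json

def build_trust_policy(account_id: str, repo: str) -> str:
    """Build the GitHub OIDC trust policy JSON for an IAM role.

    account_id -- your 12-digit AWS account ID (placeholder here)
    repo       -- the GitHub repo allowed to assume the role, e.g. "my-org/aws_cdk"
    """
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "Federated": f"arn:aws:iam::{account_id}:oidc-provider/token.actions.githubusercontent.com"
                },
                "Action": "sts:AssumeRoleWithWebIdentity",
                "Condition": {
                    "StringEquals": {
                        "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                    },
                    "StringLike": {
                        # Restrict role assumption to workflows from this repository only
                        "token.actions.githubusercontent.com:sub": f"repo:{repo}:*"
                    },
                },
            }
        ],
    }
    return json.dumps(policy, indent=4)

# Write the policy to a file you can paste into the IAM console
print(build_trust_policy("123456789012", "my-org/aws_cdk"))
```

The `sub` condition is what ties the role to a single repository, so keep it as narrow as your workflows allow.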

Step 2: Install Python and Assign AWS Role

  1. Install Python in the environment where the GitHub Actions workflow will run.
  2. Create an empty JSON file for capturing the logs.
  3. Assume the same AWS role created in Step 1:
- name: Setup python
  uses: actions/setup-python@v4
  with:
    python-version: '3.9'

- name: Creating a log file
  run: echo '{}' > log.json

- name: Configure AWS Credentials
  uses: aws-actions/configure-aws-credentials@v1.7.0
  with:
    role-to-assume: arn:aws:iam::************:role/GitHubAction-AssumeRoleWithAction
    role-session-name: GitHubActionsSession
    aws-region: ap-south-1

Step 3: Create a Custom Python Script in Workflow Folder

In your Git repository, create a custom Python script in the “.github/workflows/” directory. It will execute the command passed as a parameter, capture the GitHub Actions logs, and save them as a JSON file in the runner environment.

import subprocess
import sys
import os
import json
import pytz
from datetime import datetime

def deployment(process, command):

    local_time_zone = pytz.timezone('Australia/Sydney')
    with open('log.json', 'r') as file:
        data = json.load(file)

    # First invocation: record the run id and start time
    if "run_id" not in data:
        data['run_id'] = os.environ['GITHUB_RUN_ID']
        current_utc_time = datetime.now(pytz.utc)
        data['started_at'] = f"{current_utc_time.astimezone(local_time_zone)}"

    # Execute the command and capture its output and exit code
    result = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = result.communicate()
    exit_code = result.returncode

    if exit_code == 0:
        print(stdout.decode('utf-8'))
        data[process] = 'command executed successfully'
        status = 'success'
        # "CDK Deploy" is the final step, so mark the run as finished
        if process == 'CDK Deploy':
            data['status'] = 'success'
            current_utc_time = datetime.now(pytz.utc)
            data['ended_at'] = f"{current_utc_time.astimezone(local_time_zone)}"
    else:
        data[process] = str(stderr.decode('utf-8'))
        status = 'failed'
        data['status'] = 'failed'
        current_utc_time = datetime.now(pytz.utc)
        data['ended_at'] = f"{current_utc_time.astimezone(local_time_zone)}"

    # Write the updated data back to the JSON file
    with open('log.json', 'w') as file:
        json.dump(data, file, indent=4)

    # Re-raise the failure so the workflow step is marked as failed
    if status == 'failed':
        raise Exception(stderr.decode('utf-8'))

if __name__ == "__main__":
    deployment(sys.argv[1], sys.argv[2])
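The core pattern in the script — run a shell command, capture stdout/stderr, and branch on the exit code — can be exercised locally with harmless commands before wiring it into the workflow. A quick sketch (the `run_and_capture` helper is illustrative, not part of the workflow):

```python
import subprocess

def run_and_capture(command: str):
    """Run a shell command and return (exit_code, stdout, stderr),
    mirroring how deploy.py decides between success and failure."""
    proc = subprocess.Popen(command, shell=True,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    return proc.returncode, stdout.decode('utf-8'), stderr.decode('utf-8')

code, out, err = run_and_capture("echo hello")
print(code)   # 0 -> the step would be logged as 'command executed successfully'

code, out, err = run_and_capture("exit 3")
print(code)   # 3 (non-zero) -> the step would be logged as failed
```

A non-zero exit code is the only failure signal the script relies on, which is why re-raising the exception at the end matters: it propagates the failure back to the workflow step.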

Step 4: Create a YAML Workflow File

Within your repository’s “.github/workflows/” directory, add a YAML file to orchestrate the execution of the Python script. This configuration deploys AWS services using AWS CDK within GitHub Actions, integrating with the existing infrastructure.

name: Deploy AWS CDK

# Controls when the action will run.
on:
  push:
    branches:
      - '*'

jobs:
  aws_cdk:
    runs-on: ubuntu-22.04
    permissions:
      contents: read
      id-token: write

    env:
      GITHUB_RUN_ID: ${{ github.run_id }}

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup python
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'

      - name: Creating a log file
        run: echo '{}' > log.json

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1.7.0
        with:
          role-to-assume: arn:aws:iam::************:role/GitHubAction-AssumeRoleWithAction
          role-session-name: GitHubActionsSession
          aws-region: ap-south-1

      - name: install npm
        run: 'sudo apt update -y && sudo apt install nodejs npm -y'

      - name: Install AWS CDK
        run: 'sudo npm install -g aws-cdk'

      - name: Install Requirements
        run: 'pip3 install -r requirements.txt'

      - name: CDK Synth
        # "Cdk Synth" is a keyword for our reference; "cdk synth" is the command to execute
        run: python .github/workflows/deploy.py "Cdk Synth" "cdk synth"

      - name: CDK bootstrap
        # "CDK Bootstrap" is a keyword for our reference; the second argument is the command to execute
        run: python .github/workflows/deploy.py "CDK Bootstrap" "cdk bootstrap aws://************/ap-south-1 --bootstrap-bucket-name my-custom-cdk-staging"

      - name: CDK Deploy
        # "CDK Deploy" is a keyword for our reference; "cdk deploy --all" is the command to execute
        run: python .github/workflows/deploy.py "CDK Deploy" "cdk deploy --all"

      - name: Upload logs to s3
        if: always() # This step should always run, even if previous steps fail
        run: |
          time=$(TZ='Australia/Sydney' date +"%Y%m%d%H%M%S")
          mv log.json "${GITHUB_RUN_ID}_${time}".json

          S3_FOLDER="git-actions/workflow-logs/AWS-CDK"

          # Upload the workflow logs to the S3 bucket
          aws s3 cp $GITHUB_RUN_ID*.json s3://training-s3bucket-bi3/$S3_FOLDER/

Sample logs for successful and failed workflows.

{
    "run_id": "********",
    "started_at": "2023-10-20 22:42:26.274979+11:00",
    "Cdk Synth": "command executed successfully",
    "CDK Bootstrap": "command executed successfully",
    "CDK Deploy": "command executed successfully",
    "status": "success",
    "ended_at": "2023-10-20 22:44:01.766283+11:00"
}
{
    "run_id": "********",
    "started_at": "2023-10-13 15:14:08.208437+11:00",
    "Cdk Synth": "No stacks match the name(s) anmko\n",
    "status": "failed",
    "ended_at": "2023-10-13 15:14:18.935183+11:00"
}
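Once in S3, these logs are plain JSON and easy to post-process. As one illustration (the `run_duration_seconds` helper below is hypothetical, not part of the workflow), the `started_at`/`ended_at` timestamps can be parsed to compute a run’s duration:

```python
from datetime import datetime

def run_duration_seconds(log: dict) -> float:
    """Return the workflow duration in seconds from a captured log entry.
    The timestamps are in the ISO-like format written by deploy.py."""
    started = datetime.fromisoformat(log["started_at"])
    ended = datetime.fromisoformat(log["ended_at"])
    return (ended - started).total_seconds()

# Sample entry in the shape produced by the workflow (run_id is a dummy value)
sample = {
    "run_id": "12345",
    "started_at": "2023-10-20 22:42:26.274979+11:00",
    "status": "success",
    "ended_at": "2023-10-20 22:44:01.766283+11:00",
}
print(run_duration_seconds(sample))  # ~95.5 seconds
```

Because the timestamps carry their UTC offset, the arithmetic is timezone-safe regardless of where the analysis runs.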

CONCLUSION

Finally, using a Python script, we successfully captured the logs as a JSON file and transferred them to an S3 bucket by leveraging the assume-role configuration. This streamlined process enables efficient capture and storage of logs in AWS S3, supporting seamless CI/CD workflows.

About Us

Bi3 has been recognized for being one of the fastest-growing companies in Australia. Our team has delivered substantial and complex projects for some of the largest organizations around the globe, and we’re quickly building a brand that is well-known for superior delivery.

Website: https://bi3technologies.com/

Follow us on,
LinkedIn: https://www.linkedin.com/company/bi3technologies
Instagram: https://www.instagram.com/bi3technologies/
Twitter: https://twitter.com/Bi3Technologies
