Using AWS EventBridge to Clean Up Your Service-Oriented Architecture: An Overview

Andrew Jorczak · Published in The Startup · Jul 14, 2020 · 9 min read

Background

My employer recently set me on a project to figure out how to move configuration data from one legacy Rails app to a separate Rails API with a disparate data model. Of course, there are many ways to do this, but as someone who has been both the victim and the instigator of unnecessary service-oriented architecture design, we wanted to avoid standing up a separate service to achieve this. Also, my team is currently tasked with paring our SOA down to something smaller, so anything that could begin to reduce that scope was a good thing.

After some digging into Gregor Hohpe and Bobby Woolf’s Enterprise Integration Patterns, we realized what we needed was a message bus. Now, we could implement this ourselves, but the overhead cost would be too great.

This is when I found AWS EventBridge, a relatively new AWS service that implements a standard event bus.

This seemed like a good choice for our purposes (we use AWS for many things, and custom events cost $1.00 per million events sent), so we set out on a POC. It took little time to figure out the ins and outs, and our integration ended up being pretty straightforward, but I initially had trouble finding any blog posts or readings about using EventBridge with two existing, large web applications. Most of what I could find online covered, say, connecting a partner integration like Salesforce and storing data in DynamoDB, or triggering CloudWatch events via the default event bus, etc.

To hopefully save you the headache of staring at AWS SDK documentation for hours on end, I’ve set up a sample repo to use as a guide to follow along.

Prereqs:

  • This won’t be much use to you unless you have an AWS account. You’ll need your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY stored somewhere in your environment in order to establish connections to EventBridge. The AWS SDK loads these automagically from your ~/.aws/credentials file, so there isn’t much setup needed in the application; see the example after this list and the AWS documentation for more info.
  • Later versions of this will hopefully run on Docker and include Localstack for local development with AWS services and some out-of-the-box necessities, but for the moment you’ll also need some standard-ish Rails dependencies to run locally: Postgres, Node for front-end assets, Ruby 2.7, and Bundler.
  • To follow along with example code, see the repo here: https://github.com/CoolJorcz/eventbridge_sample
  • As a general overview of Event-Driven Architectures, this 2019 NY AWS Summit video was very useful to me.
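
For reference, the credentials file the SDK reads lives at ~/.aws/credentials and looks something like this (the profile name and placeholder values here are just examples):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY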

Overview

Let’s say you work at a very famous book company and you’re moving your transaction storage to an external payment service, but that service isn’t quite ready for prime time and you need to ingest the current data on a per-user-action basis. EventBridge provides a lightweight, non-blocking solution and is relatively easy to set up in Rails.

The flow looks something like the diagram below:

Payment Service to Payment API

Let’s get started.

AWS Updates

These updates can be executed in whatever order you like; however, I find working backwards from the Lambda easier, since those connections need to be established before we can do anything else. Also, we’re going to be doing a lot of clicking in the console, but if you’d prefer to do this via CloudFormation, Terraform, or whatever IaC framework you like, by all means do (we did most of this via Terraform at my company; note that as of this writing, Terraform’s support for EventBridge is limited).

Let’s start by creating our Lambda connection:

In the AWS console, go to the Lambda Service overview page.

Click “Create new function” on the right-hand side.

From here, let’s use a blueprint: search for https-request to find the https-request Node.js Lambda. For your purposes, you can select any Lambda blueprint here, but for this exercise it makes sense to use the https-request functionality, since we’ll be connecting to another API.

Title the Lambda paymentServiceConnection or whatever you please; it just needs to be the Lambda that gets triggered by EventBridge in the next step. Select “Create new basic execution role”. (If you’re setting this up yourself from CloudFormation/Terraform, at a minimum you should connect to CloudWatch Events, and the application sending events will need PutEvents permission for EventBridge, something like the policy below.)
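
A minimal sketch of that PutEvents permission, assuming the default event bus (in practice you would scope the Resource down to your event bus ARN):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "events:PutEvents",
      "Resource": "*"
    }
  ]
}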

Create the function at the bottom. This may take a couple of minutes.

Once the function is created, you can edit the Lambda code in place. (Most likely this should be stored in source control somewhere; v2 of this will show the Lambda code deployed from the existing repo.)

In the Designer, search for EventBridge and create a new rule for CloudWatch Events health checks. (This is optional but recommended.)

We now have a lambda that doesn’t do much, but it does enough to unblock us. Let’s update the code so that we’re logging out the event details to see how it’s formatted once it comes through EventBridge.

Lambda code, with the incoming event logged out on line 12
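
If you’d rather not squint at a screenshot, a minimal sketch of that logging handler looks something like this (the blueprint’s code differs; the point is just to dump the incoming event):

exports.handler = (event, context, callback) => {
  // Log the full EventBridge event so we can inspect its shape in CloudWatch Logs
  console.log('Received event:', JSON.stringify(event, null, 2));
  callback(null, event);
};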

Now go to the EventBridge Service page, and click create rule.

Name the rule payment_service_integration and add a description.

Under Define pattern, choose Event pattern, set Event matching pattern to Custom pattern, and add the following event pattern:
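
The exact pattern depends on what your app sends; given the source and detail-type used in the Rails code later in this post, it would look something like:

{
  "source": ["bookstore"],
  "detail-type": ["payment_service"]
}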

The source and detail-type need to match whatever is sent from your app to EventBridge. (Tagging the source with the region helps you tell servers apart and debug issues if you run in multiple regions, but it’s fine to name it whatever you like.)

Select the default event bus, and Enable the rule on the selected event bus.

Connect your lambda trigger and create your rule.

Now we can get to the code updates.

I’ve set up a very rudimentary web application to play around with, with scaffolded transactions and books. The code blocks below should hopefully help you follow along.

After a transaction is saved, we want to send it to the payment service. (I’m using a callback here to show how that flow is executed. I probably would not do this in production, since callbacks introduce some weird event flows (multiple executions, etc.), and depending on volume we’d probably want to batch-send transactions in a more ETL-based flow, but for simplicity’s sake we’ll use a callback.)

# app/models/transaction.rb
class Transaction < ApplicationRecord
  belongs_to :book

  validates :customer_email, presence: true
  validates :sale_price, presence: true

  after_create :queue_for_payment_service

  private

  def queue_for_payment_service
    serialized_transaction = TransactionSerializer.new(self)
                                                  .as_json(include: '**')
    PaymentServiceJob.perform_later(payload: serialized_transaction)
  end
end

Since we’re sending this along to EventBridge as JSON, we’ll want to ensure that our object is serialized without any sensitive information exposed. (Yes, the customer email is sensitive; it should be encrypted at rest and decrypted when received by the payment API. Again, for simplicity’s sake, we’ll pass it along.) We pass the serialized transaction into the PaymentServiceJob background job. We’re using DelayedJob to process these jobs here; it will pick up the job and send it along to our EventBridgeService, which we’ve wrapped as a service object.

# app/jobs/payment_service_job.rb
class PaymentServiceJob < ApplicationJob
  queue_as :payment

  def perform(payload:)
    detail_type = 'payment_service'
    EventBridgeService.new
                      .call(detail: payload.to_json,
                            detail_type: detail_type)
  end
end

In PaymentServiceJob, we define our detail type (the same as the expected detail_type in the rule above) and send the payload along as-is. We extract detail_type into its own argument so EventBridgeService can be reused with multiple rules.

Here’s the meat of our code in the EventBridgeService object.

# app/services/event_bridge_service.rb
class EventBridgeService
  attr_reader :options

  def initialize(options: default_options)
    @options = options
    @client = Aws::EventBridge::Client.new(
      region: @options[:region],
    )
  end

  def call(detail:, detail_type:)
    event_payload = define_event_hash(detail, detail_type)
    resp = client.put_events(entries: [event_payload])
    return { events: resp.data.entries } if resp.data.failed_entry_count.zero?

    check_errors(resp.data.entries)
  end

  def check_errors(eventbridge_events)
    event_arr = []
    eventbridge_events.each do |entry|
      if entry.error_code.nil?
        event_arr << entry
        next
      end
      error_msg = "AWS events error code: #{entry.error_code}, " \
                  "message: #{entry.error_message}"
      Rails.logger.error(error_msg)
      event_arr << entry
    end
    { events: event_arr }
  end

  def define_event_hash(detail, detail_type)
    {
      time: Time.zone.now.to_s,
      source: options[:source],
      resources: [""],
      event_bus_name: options[:event_bus_name],
      detail: detail,
      detail_type: detail_type,
    }
  end

  private

  attr_reader :client

  def default_options
    {
      region: "us-west-2",
      event_bus_name: "default",
      source: "bookstore",
    }
  end
end

The main method here to send events out is put_events on Aws::EventBridge::Client. Because the AWS SDK loads credentials automagically (see above re: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY), all we need to establish the connection is a client pointed at the region where we created our rule.
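
As a quick sanity check (purely illustrative, not part of the sample app), you can exercise the service object from a Rails console once credentials and region are configured:

EventBridgeService.new.call(
  detail: { test: true }.to_json,
  detail_type: 'payment_service'
)

If the entry is accepted, you get back the event entries (including their EventBridge-assigned event IDs); failed entries are logged by check_errors.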

Let’s now see how this all works.

First, we’ll create a transaction. Sale price is in cents here.

We create the transaction, and it is persisted to Postgres.
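
If you’d rather skip the scaffolded form, the equivalent from a Rails console looks something like this (the Book lookup is an assumption; adjust to the sample app’s schema):

# Assumes at least one Book already exists in the database
book = Book.first
Transaction.create!(
  book: book,
  customer_email: 'reader@example.com',
  sale_price: 2499 # in cents, as noted above
)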

There are a couple of options for kicking off the delayed job. If you’re running this locally, you can work the queue manually from your command line with rake jobs:work.

Now go back to your Lambda function and click on the Monitoring tab. You should see some metrics around the event firing (you might have to wait a minute; CloudWatch logs and metrics take some time to propagate).

And looking in our CloudWatch Logs, we can see the logged-out event. This Lambda of course errors out, since we’re not yet building the request the downstream API expects, but before that it logs the event contents, which confirms the connection and is a good first step toward completing our integration.
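
For reference, the event that lands in the Lambda follows the standard EventBridge envelope; with the source and detail-type used in this post it looks roughly like this (the id, account number, and transaction detail below are illustrative):

{
  "version": "0",
  "id": "00000000-0000-0000-0000-000000000000",
  "detail-type": "payment_service",
  "source": "bookstore",
  "account": "123456789012",
  "time": "2020-07-14T00:00:00Z",
  "region": "us-west-2",
  "resources": [""],
  "detail": {
    "id": 1,
    "customer_email": "reader@example.com",
    "sale_price": 2499
  }
}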

Update the Lambda to forward the event detail to the payment API. Note the httpsOptions object and that payload is assigned the stringified version of event.detail:

const https = require('https');

exports.handler = (event, context, callback) => {
  // event.detail carries the serialized transaction sent from the Rails app
  const payload = JSON.stringify(event.detail);

  const httpsOptions = {
    method: 'PUT',
    hostname: 'www.example.com',
    path: '/api/transactions',
    port: 443,
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(payload)
    }
  };

  const req = https.request(httpsOptions, (res) => {
    let body = '';
    console.log('Status:', res.statusCode);
    console.log('Headers:', JSON.stringify(res.headers));
    res.setEncoding('utf8');
    res.on('data', (chunk) => body += chunk);
    res.on('end', () => {
      console.log('Successfully processed HTTPS response');
      // If we know it's JSON, parse it
      if (res.headers['content-type'] === 'application/json') {
        body = JSON.parse(body);
      }
      callback(null, body);
    });
  });
  req.on('error', callback);
  // payload is already a JSON string, so write it directly;
  // stringifying it again would double-encode the body and
  // no longer match the Content-Length header above
  req.write(payload);
  req.end();
};

You can load up a sample Lambda test event to exercise the successful path on its own, but let’s look at the full flow and execute a new transaction.

Create a new transaction
Created Transaction

Wait a couple of minutes and let’s see the execution flow through EventBridge to the Lambda.

Success!

Summary

This is just a tiny example of what we can use EventBridge for. Too often we reach for the AWS SDK for many integrations within the application (S3, SQS, SNS, etc.), creating more chances for failure across those individual components. With this pattern, we can move those individual services behind specific rules and reduce the application’s scope to a single point of communication with our AWS components via EventBridge.

Hopefully you found this tutorial useful and it gives you a head start on your EventBridge integration within Rails or another web app. I welcome any feedback, concerns, or GitHub issues you run into with this sample code.
