AWS API Gateway lets you connect directly to (proxy) many other AWS services. This article covers doing that with DynamoDB, as a way to create an API that writes data to DynamoDB without needing a Lambda function. There are existing AWS docs on using API Gateway as a proxy for DynamoDB, but as usual, those only cover how to do this in the AWS console. In particular, I'll show how I set this up using the Serverless Framework (or CloudFormation, as the bulk is really just CloudFormation code), and how to transform the web request's JSON so it can be directly PUT into DynamoDB. …
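To give a feel for the shape of this, here's a minimal sketch of what such a direct integration can look like in a serverless.yml's CloudFormation resources. All names (the table, the role, the resource IDs) and the request fields are illustrative assumptions, not the article's actual code:

```yaml
# Hypothetical serverless.yml fragment: an API Gateway POST method wired
# directly to DynamoDB PutItem, with no Lambda in between.
resources:
  Resources:
    ItemsPostMethod:
      Type: AWS::ApiGateway::Method
      Properties:
        HttpMethod: POST
        ResourceId: !Ref ItemsResource
        RestApiId: !Ref ApiGatewayRestApi
        AuthorizationType: NONE
        Integration:
          Type: AWS
          IntegrationHttpMethod: POST
          # API Gateway's generic service integration URI for DynamoDB
          Uri: arn:aws:apigateway:us-east-1:dynamodb:action/PutItem
          # An IAM role (defined elsewhere) that allows dynamodb:PutItem
          Credentials: !GetAtt ApiGatewayDynamoRole.Arn
          RequestTemplates:
            # VTL template transforming the incoming JSON body into
            # DynamoDB's attribute-value format
            application/json: |
              {
                "TableName": "my-items-table",
                "Item": {
                  "id": { "S": "$input.path('$.id')" },
                  "name": { "S": "$input.path('$.name')" }
                }
              }
          IntegrationResponses:
            - StatusCode: 200
        MethodResponses:
          - StatusCode: 200
```

The key pieces are the `Uri` (which names the DynamoDB action rather than a Lambda), the `Credentials` role, and the VTL request template that reshapes the caller's JSON into PutItem's expected format.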

This is the first in a series of articles covering my favorite podcasts. This one covers Teamistry, now in its second season.

I'm starting this series off with Teamistry, both because it's probably my current favorite and because it surprised me how much I liked it. Why? First, their description:

Teamistry is the chemistry of unsung teams that achieve the impossible

Honestly, I thought it would likely be a lot of BS management techniques, silly team-building ideas, etc. Teamistry is produced by Atlassian, the company that makes Jira (issue and project tracking software), Confluence (wiki), Trello (project management), and Bitbucket (and other git software). So, given that I work in tech and am quite familiar with Atlassian, I went in with a bit of bias based on the software they make (i.e. tools for team/issue/project management). Instead, these are absolutely captivating stories!

Developers using Postgres on Mac: I'd suggest using an alternative to Homebrew to install and manage databases, one that allows you to run specific (and multiple) versions easily and won't auto-upgrade your Postgres to a version you didn't want.

I really like Homebrew in general for package management on macOS. That said, there is one key behavior of Brew that really irks me: by default, it will upgrade all your installed packages any time you do a new install or update! This is terrible for databases in particular. You very likely don't want your DB version updated, as that version should match what your deployed software uses. Additionally, updating a DB version often means updating the datastore itself, which can be time-consuming or tedious. The version argument is the big one for me, though: I'm typically deploying production software to the cloud, where the cloud vendors provide specific versions of databases. …
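If you do stay on Homebrew for your database, Brew's own `pin` mechanism can blunt the auto-upgrade behavior. A sketch (the formula name is illustrative; use whatever versioned formula you actually installed):

```shell
# Pin Postgres so `brew upgrade` won't touch it
brew pin postgresql@12

# See everything currently pinned
brew list --pinned

# Skip Brew's self-update step on each command, which triggers
# the package refresh behavior
export HOMEBREW_NO_AUTO_UPDATE=1

# Later, when you deliberately want to upgrade
brew unpin postgresql@12
```

This doesn't give you multiple side-by-side versions the way a dedicated tool does, but it at least keeps your DB from moving underneath you.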

Cognito user-based authenticated API calls through API Gateway generally require AWS v4 signing of the request in order to use API Gateway's automatic authentication. In this case, we specifically do not use a Lambda authorizer, both because I don't want to maintain that code and because it adds a Lambda invocation for every API call (and this is a high-volume API). But what can you do if you cannot sign the URL for your API call (i.e. you don't control the actual HTTP call)? …

I'm a huge fan of serverless, but one of the areas that's still a bit tougher with it is testing. While you can mock an API call to a function, it's not quite as easy to test an API endpoint as I'd experienced with more traditional stacks. This led me to create a test suite in Postman that hits my deployed APIs. I'm using this as more of a smoke test, and as final proof that the deployed APIs work as expected in real-world use cases. I still have unit and integration tests in my codebase, but these Postman tests have turned out to be an extra layer of confidence. …

Amplify has a notification mechanism (email) that sends you build notices. However, these notices have typos in the URLs they include, and they don't incorporate your custom domain. Here's one way to address that.

I've been using AWS Amplify Console to deploy a frontend app. Amplify has multiple components, so this is specifically about the Console piece, which deploys static web apps to S3+CloudFront. Amplify has some great features, including creating separate (subdomain-based) deploys for every branch or pull request you create. You can configure this via branch-name patterns to control whether all branches, or only branches with certain names, get these deploys. There are a ton of other great features around this as well. …

The AWS Go SDK configuration docs don't provide an example of how to configure your code to use a role-based profile (one that gets its keys from another profile) from your credentials file. It took me a few minutes to figure this out, so I'm posting it here in case someone else searches for it; maybe I can save you the time :)

Photo by Kevin Ku on Unsplash

If your ~/.aws/credentials file contains profiles that just specify a role, where they then use the source_profile attribute to point to the profile that actually has the AWS access and secret keys, it’s not immediately obvious how to configure this in the Go SDK. …
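As a sketch of the setup being described (profile and role names here are placeholders): a credentials file like this,

```ini
[base]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

[my-role-profile]
role_arn = arn:aws:iam::123456789012:role/my-role
source_profile = base
```

can be used from the v1 Go SDK by enabling shared config when building the session, which makes the SDK honor `role_arn`/`source_profile` entries:

```go
package main

import (
	"fmt"

	"github.com/aws/aws-sdk-go/aws/session"
)

func main() {
	// SharedConfigEnable tells the SDK to load ~/.aws/config and
	// ~/.aws/credentials fully, including role-based profiles.
	sess, err := session.NewSessionWithOptions(session.Options{
		Profile:           "my-role-profile", // the role-based profile
		SharedConfigState: session.SharedConfigEnable,
	})
	if err != nil {
		panic(err)
	}
	// Service clients built from sess will assume the role, using the
	// keys from the source_profile to do so.
	fmt.Println(sess.Config.Region)
}
```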

The AWS Aurora Serverless Data API allows you to query an Aurora Serverless database via an HTTP interface, and specifically, to do so outside the VPC of the database. This is great for Lambda functions and also avoids DB connection management. Additionally, it makes for an easy way to do simple queries of your DB from the command line or similar.

Photo by Max Langelott on Unsplash

I recently converted a Lambda function (written in Go) from making direct Aurora Postgres SQL calls to using the AWS Data API. The idea is that this avoids database connection management and allows moving your Lambda function outside the VPC. The advantage is that you then don't need a NAT gateway in the VPC for the Lambda to make outside network calls (e.g. this Lambda also calls out to a third-party HTTP API). NAT gateways have an approximately $30/month base cost (plus the cost of data transfer). Furthermore, this reduces the complexity of your configuration (assuming you don't need the NAT gateway, etc. for other things); for example, it avoids around 140 lines of CloudFormation in my serverless.yml …
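The command-line convenience mentioned above comes from the AWS CLI's `rds-data` commands. A sketch, with placeholder ARNs and database/query names:

```shell
# Query an Aurora Serverless cluster over HTTP via the Data API.
# No VPC access or DB connection pool needed -- just AWS credentials.
aws rds-data execute-statement \
  --resource-arn "arn:aws:rds:us-east-1:123456789012:cluster:my-cluster" \
  --secret-arn "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-db-secret" \
  --database "mydb" \
  --sql "SELECT count(*) FROM users"
```

The `--secret-arn` points at a Secrets Manager secret holding the DB credentials, which is how the Data API authenticates to the cluster.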

GitHub Actions are a nice option for CI or other automated actions on your repo. In setting up an action to run tests on some Go code, which requires a Postgres/PostGIS database, I ran into a few hiccups. This post covers those issues and my solutions, in hopes of saving time for others with a similar need. I also tried this setup with CircleCI, but didn't get it working (notes at the end).

Below I’ll cover the setup and solutions, and provide the final (working) configuration I used.


Postgres/PostGIS Setup

First, my production case uses AWS Aurora Postgres Serverless, which is Postgres 10.7-compatible underneath. This is key in choosing the right PostGIS Docker image to use. You can find a list of all the available images on their Docker Hub page. Additionally, the tests require seeding the database with some GIS data in order to pass. …
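A sketch of how such a service container can look in a GitHub Actions workflow; the image tag, credentials, and test command are illustrative assumptions (check the postgis/postgis tags on Docker Hub for the one matching your server version):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        # PostGIS 2.5 on Postgres 10, roughly matching Aurora's 10.7
        image: postgis/postgis:10-2.5
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: testdb
        ports:
          - 5432:5432
        # Wait until Postgres is accepting connections before tests run
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v2
      - name: Run tests
        run: go test ./...
```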


Chris Bailey

NatureQuant CTO/Co-founder. HotelTonight Co-founder. Cyclist, trail runner, espresso & coffee lover, geek, traveler, foodie.
