Recording and Saving Audio — Pt.2

React/Rails Stack + S3

August Giles
3 min read · Oct 22, 2018

Hello!

In Part 1, we set up recording audio with the MediaRecorder API and sent that data to our backend. In this post, we’ll take that information and send it to S3 using Rails Active Storage.

S3

S3 is one of the data storage services that AWS provides. This is where we’ll store the recordings. You will want to…

  • Set up an AWS account
  • Set up a bucket in S3 — this page in AWS documentation was very helpful
  • AWS will present you with an ‘access key id’ and a ‘secret access key’ for your bucket. Keep those safe; we’ll use them in the next bit.

Active Storage

Active Storage takes a large file (like our recording), attaches a record of it to an Active Record object, and sends the file itself off to our S3 storage. The Active Storage docs are wonderful, definitely check those out.

To do boilerplate…

  • Add gem "aws-sdk-s3", require: false to your Gemfile so Rails can talk to S3
  • Active Storage creates two tables in your database. Run rails active_storage:install to create the migration then rails db:migrate to run the migration.
  • Now we need to tell our application where to store these recordings and how to get there. According to the docs, in your config/storage.yml file, provide the following:
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

amazon:
  service: S3
  access_key_id: ""
  secret_access_key: ""
  bucket: ""
  region: "" # e.g. 'us-east-1'
  • This is where we get to use that access key id and secret access key, but we don’t want to put them straight in the file, since that isn’t secure. Instead we use the encrypted config/credentials.yml.enc file. This is a great post on how to get that set up. If using VIM, which I found worked best, go ahead and check out this resource for VIM commands. Then reference the encrypted credentials like so:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: "" # e.g. 'us-east-1'
  bucket: "" # 'bucket-name'
  • Now in config/environments/production.rb, place the following few lines of code. This tells Rails two things: you are using S3 in production and it should route to your bucket, and you have a master key that it should use to decrypt your credentials file.
config.active_storage.service = :amazon
config.require_master_key = true
  • Take similar action in config/environments/development.rb. Here we’re just saving the recordings locally; feel free to set it to :amazon if that’s your preference in development.
config.active_storage.service = :local
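To see how storage.yml and the credentials file fit together, here’s a minimal sketch in plain Ruby of what Rails does when it reads that config: it runs the file through ERB (resolving the credential lookups) and then parses the YAML. The credentials_dig helper and the values in CREDENTIALS below are hypothetical stand-ins for Rails.application.credentials, just so the sketch runs outside a Rails app.

```ruby
require "erb"
require "yaml"

# Hypothetical stand-in for Rails.application.credentials, so the ERB
# tags have something to read outside a real Rails app.
CREDENTIALS = { aws: { access_key_id: "AKIAEXAMPLE", secret_access_key: "sekret" } }

def credentials_dig(*keys)
  CREDENTIALS.dig(*keys)
end

# A storage.yml-style template: ERB tags pull values from the credentials.
storage_yml = <<~YAML
  amazon:
    service: S3
    access_key_id: <%= credentials_dig(:aws, :access_key_id) %>
    secret_access_key: <%= credentials_dig(:aws, :secret_access_key) %>
    region: us-east-1
    bucket: my-bucket
YAML

# Render the ERB first, then parse the resulting YAML -- the same order
# Rails uses, which is why the secrets never sit in the file as plain text.
config = YAML.safe_load(ERB.new(storage_yml).result(binding))
puts config["amazon"]["access_key_id"] # => AKIAEXAMPLE
```

The point of the two-step render-then-parse is that the committed file only ever contains the ERB lookup, never the secret itself.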
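And if you’re curious why config.require_master_key matters: Rails encrypts credentials.yml.enc with AES-128-GCM, and master.key holds the hex-encoded key that decrypts it. Here’s a rough sketch of that round-trip using Ruby’s OpenSSL bindings; it’s a conceptual illustration, not the exact Rails implementation (the key and plaintext below are made up).

```ruby
require "openssl"
require "securerandom"

# Hypothetical stand-ins for config/master.key and the plaintext credentials.
master_key = SecureRandom.hex(16) # 32 hex chars, like RAILS_MASTER_KEY
plaintext  = "aws:\n  access_key_id: AKIAEXAMPLE\n"

key = [master_key].pack("H*") # decode hex into 16 raw bytes (AES-128)

# Encrypt: this is roughly what happens when you save from `rails credentials:edit`.
cipher = OpenSSL::Cipher.new("aes-128-gcm").encrypt
cipher.key = key
iv = cipher.random_iv
encrypted = cipher.update(plaintext) + cipher.final
tag = cipher.auth_tag

# Decrypt: this is roughly what Rails does at boot using master.key.
decipher = OpenSSL::Cipher.new("aes-128-gcm").decrypt
decipher.key = key
decipher.iv = iv
decipher.auth_tag = tag
decrypted = decipher.update(encrypted) + decipher.final

puts decrypted == plaintext # => true
```

Without the master key (locally in config/master.key, or the RAILS_MASTER_KEY env var in production), the decrypt step fails, which is exactly what config.require_master_key = true guards against at boot.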

Attaching a Resource

Finally!

  • Decide which model you want to attach the recording to. At the top of that model, declare the relationship with has_one_attached :recording or has_many_attached :recordings
  • Go to your controller and make a method called attach_recording or something along those lines. Below is the method I made.
def attach_recording
  # Find the record we want to attach the uploaded recording to
  assignment = Assignment.find(params[:id])
  # Attach the uploaded file sent from the front end
  r = assignment.recordings.attach(params[:recording])
  # Build a shareable URL for the newly attached blob
  url = Rails.application.routes.url_helpers.rails_blob_url(r.first, only_path: true)
  render json: { message: "Attached to File", url: url }
end
  • In my case, I wanted to attach my recording to an existing ‘assignment’. So (1) I found the assignment, (2) attached the recording to it using assignment.recordings.attach(params[:recording]), (3) built a shareable URL out of Active Storage with the rails_blob_url helper, and (4) sent that shareable URL back to the front end in a JSON response!
  • If you want to pull that shareable URL back out in any other part of the application, just find the instance and call rails_blob_url on its attachment the same way.
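To build intuition for what attach does under the hood with the :local Disk service, here’s a toy illustration in plain Ruby. This is not Active Storage’s real API or schema; ToyDiskService and everything in it is invented for the sketch. The real thing additionally records blob metadata (filename, content type, checksum) in those two database tables the migration created.

```ruby
require "securerandom"
require "fileutils"
require "tmpdir"
require "stringio"

# Toy stand-in for Active Storage's Disk service: store the uploaded
# bytes under a random key, and derive a path from that key later.
class ToyDiskService
  def initialize(root)
    @root = root
    FileUtils.mkdir_p(root)
  end

  # "Attach" an IO: write its bytes under a freshly generated key.
  # Active Storage similarly keys each blob with a random token.
  def attach(io)
    key = SecureRandom.hex(16)
    File.binwrite(File.join(@root, key), io.read)
    key
  end

  # Given a key, recover where the file lives -- the moral equivalent
  # of asking rails_blob_url for a link to the stored blob.
  def path_for(key)
    File.join(@root, key)
  end
end

service = ToyDiskService.new(File.join(Dir.tmpdir, "toy_storage"))
key = service.attach(StringIO.new("fake audio bytes"))
puts File.read(service.path_for(key)) # => fake audio bytes
```

Swapping :local for :amazon just swaps where those bytes land (your S3 bucket instead of the storage/ directory); the attach-and-look-up-by-key flow in your controller stays the same.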

Troubleshooting

Again, check out the docs: AWS and Active Storage. The Active Storage docs in particular have a lot of information about hooking S3 up to your backend.

I ran into an issue with permissions for accessing my AWS bucket during this project — specifically when trying to call my information back. If you find yourself in that situation, look into your bucket’s S3 permissions (bucket policies and public access settings).
