Recording and Playing Audio Using AVFoundation

Rob Deans
4 min read · Dec 13, 2016


Hey y’all! It’s been a minute, but I wanted to share part of an app I’ve been working on that makes use of Apple’s AVFoundation framework.

The app will function like an MPC player, more commonly known as a drum machine. Buttons in the app will correspond to different recordings, which are captured using the iPhone’s microphone. The end result is a digital beatbox capable of recording and playing back vocals as well as a variety of other sounds!

Please note the floppy disk drive for added memory 💾

To accomplish this goal, I started to familiarize myself with Apple’s AVFoundation. As the name implies, this is the framework where Apple lays the foundation for an app’s audio and visual components. This includes AVPlayer for video playback, as well as classes like AVAudioPlayer and AVAudioRecorder for everything related to audio.

AVAudioPlayer can play all kinds of audio files, one at a time or simultaneously, and exposes parameters such as playback level, stereo positioning, and playback rate for rewinding and fast-forwarding; it can even report playback-level metering data. Beyond that, AVFoundation offers a litany of audio effects, including reverb, delay, distortion, EQ, and pitch control, along with Core MIDI capabilities. Given Apple’s experience developing powerful Digital Audio Workstations (DAWs) such as GarageBand and Logic, I am very excited to see what these built-in frameworks have to offer!
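To make those parameters concrete, here’s a quick playground-style sketch of the knobs AVAudioPlayer exposes (the file URL is a placeholder, and the force-try is for brevity only):

```swift
import AVFoundation

// Placeholder URL: point this at any audio file you have on hand
let url = URL(fileURLWithPath: "/path/to/sound.m4a")
let player = try! AVAudioPlayer(contentsOf: url)  // force-try for brevity only

player.volume = 0.8            // playback level (0.0 to 1.0)
player.pan = -0.5              // stereo positioning (-1.0 full left to 1.0 full right)
player.enableRate = true
player.rate = 1.5              // speed up playback; values below 1.0 slow it down
player.currentTime = 0         // "rewind" by seeking to any point, in seconds
player.isMeteringEnabled = true

player.play()
player.updateMeters()
let level = player.averagePower(forChannel: 0)  // playback-level metering, in decibels
```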

But let’s not get ahead of ourselves. First step: record audio. To do so we’ll need to import AVFoundation into our Xcode project, have our ViewController conform to a couple of protocols, and create the following variables:
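A minimal sketch, assuming the variable names audioRecorder and audioPlayer from the prose plus a hypothetical fileName property; the two delegate protocols are the standard pair for recording and playback:

```swift
import UIKit
import AVFoundation
// import CoreMIDI  // not needed here; that's for another project

class ViewController: UIViewController, AVAudioRecorderDelegate, AVAudioPlayerDelegate {

    // Initialized later, in the audio-configuration function below
    var audioRecorder: AVAudioRecorder!
    var audioPlayer: AVAudioPlayer!

    // Hypothetical property: the file name our recording will be saved under
    let fileName = "audioFile.m4a"
}
```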

(Never mind the CoreMIDI, that’s for another project)

Don’t worry about initializing just yet; we’ll do that in the following function that configures audio. Essentially, it will create a file path and name to store the audio, create a dictionary of audio settings, and execute a do-try-catch to initialize our audioRecorder variable. You can call this function in viewDidLoad(), and the code is as follows:
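A sketch of what that configuration might look like; the name configureAudio() and the specific settings values (AAC, 44.1 kHz, stereo) are my assumptions, not gospel:

```swift
func configureAudio() {
    // Build a URL for the recording inside the app's Documents directory
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let audioURL = documents.appendingPathComponent(fileName)

    // A dictionary of audio settings; these are typical AAC defaults (assumed values)
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 2,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

    // do-try-catch to initialize the audioRecorder
    do {
        // The session category lets us both record and play back.
        // Remember to add NSMicrophoneUsageDescription to Info.plist.
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)

        audioRecorder = try AVAudioRecorder(url: audioURL, settings: settings)
        audioRecorder.delegate = self
        audioRecorder.prepareToRecord()
    } catch {
        print("Audio configuration failed: \(error)")
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    configureAudio()
}
```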

Typing the code out yourself rather than copy-pasting is great practice and even better for learning!

Next we need to make a function that retrieves the file at a specific URL path, and another that plays that file through our audioPlayer. Let’s call the first one getFileURL() -> URL, and pass its result as an argument to the second, which sets up our audioPlayer. That second function is the one we will call each time we want our audioPlayer to play.
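A sketch under the same assumptions as before. The caption below mentions a disableMPCButtons() helper, so a plausible stub is included:

```swift
func getFileURL() -> URL {
    // Same path the recorder saved to
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent(fileName)
}

func setupAudioPlayer(_ url: URL) {
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer.delegate = self
        audioPlayer.prepareToPlay()
    } catch {
        print("Audio player setup failed: \(error)")
    }
}

// Hypothetical stub of the helper named in the caption below:
// stops any active playback so it can't overlap with a new recording.
func disableMPCButtons() {
    audioPlayer?.stop()
}
```

Each time we want playback, we call setupAudioPlayer(getFileURL()) followed by audioPlayer.play().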

disableMPCButtons() makes sure that no audio is playing at the time of recording

Lastly, we need to set up a pair of buttons to call these functions when tapped: one to record and the other to play. To ensure the two functions don’t overlap, I set the buttons’ labels so that if audio is already recording or playing, the next tap stops it.
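A sketch of those button actions; the titles and method names are assumptions on my part, but the toggle logic follows the description above:

```swift
@IBAction func recordTapped(_ sender: UIButton) {
    if sender.currentTitle == "Record" {
        disableMPCButtons()          // no playback while recording
        audioRecorder.record()
        sender.setTitle("Stop", for: .normal)
    } else {
        audioRecorder.stop()
        sender.setTitle("Record", for: .normal)
    }
}

@IBAction func playTapped(_ sender: UIButton) {
    if sender.currentTitle == "Play" {
        setupAudioPlayer(getFileURL())
        audioPlayer.play()
        sender.setTitle("Stop", for: .normal)
    } else {
        audioPlayer.stop()
        sender.setTitle("Play", for: .normal)
    }
}
```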

And that’s it! Try it out in the simulator (which uses your computer’s microphone) or on a device: press the record button and the recording will be saved to the audio path; press the play button and that same file will be loaded and played back.

The next step for me is to add multiple file paths so that each button records its own audio, and to add effects such as reverb, delay, and pitch variation, so that simple recordings can be sculpted and enriched to mimic a deep bass kick or a crisp snare drum in an open hall. Ray Wenderlich also offers a fascinating AudioKit tutorial, which illuminates AudioKit’s oscillator and sound-synthesis capabilities.
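That effects work would likely run through AVAudioEngine rather than AVAudioPlayer; here’s a rough sketch of the kind of node chain involved (the presets and parameter values are purely illustrative, and getFileURL() is the helper from earlier):

```swift
// Inside the view controller; kept as properties so the engine
// isn't deallocated while audio is still playing
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let pitch = AVAudioUnitTimePitch()
let delay = AVAudioUnitDelay()
let reverb = AVAudioUnitReverb()

func playWithEffects() throws {
    pitch.pitch = -500               // cents; drop toward bass-kick territory
    delay.delayTime = 0.25           // seconds
    delay.feedback = 30              // percent
    reverb.loadFactoryPreset(.largeHall)
    reverb.wetDryMix = 40            // percent wet (the "open hall" sound)

    [playerNode, pitch, delay, reverb].forEach { engine.attach($0) }

    // Chain: player -> pitch -> delay -> reverb -> speakers
    let file = try AVAudioFile(forReading: getFileURL())
    engine.connect(playerNode, to: pitch, format: file.processingFormat)
    engine.connect(pitch, to: delay, format: file.processingFormat)
    engine.connect(delay, to: reverb, format: file.processingFormat)
    engine.connect(reverb, to: engine.mainMixerNode, format: file.processingFormat)

    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    try engine.start()
    playerNode.play()
}
```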

Apparently iOS comes with a whole suite of audio plug-ins and frameworks built in for developers who are musically inclined, and I look forward to exploring these tools and sharing them with the community.

Happy coding!
