TPI: Analysing Audio on iOS
High-performance, real-time audio analysis
At the core of the TPI iPhone app are its sound tools and perhaps most useful are the SPL Meter (Sound Pressure Level) and the RTA (Real-time Analyser). These were the first two tools that were discussed when the initial brief was given to Cocoon and are by far the most complex sections of the app.
To understand why, it’s worth breaking down what the app needed to do in order to achieve the required analysis.
- Capture the audio from the microphone on the user’s device.
- Perform fast Fourier transform (FFT) calculations on the audio captured.
- Convert the values from the FFT to the correct linear and dB-A or dB-C values.
- Display that data in custom views at 60 fps.
Apple thankfully provides frameworks to achieve all of this. The AVFoundation framework can be used to capture the audio. Accelerate can be used to quickly perform those FFT calculations. Core Graphics/Core Animation are available to rapidly draw those gauges and charts Paul designed.
These frameworks are fairly low-level and would require a lot of code to get the end result, but there is an elegant solution available.
Introducing AudioKit
Prior to working on the TPI app itself we started experimenting to see just how feasible it would be to achieve the desired result. During those experiments I played with using Apple’s frameworks directly and achieved some respectable results. However, I also found a fantastic open-source framework called AudioKit.
As with almost anything in programming, someone has usually done it first (or better), and handling audio in real time on iOS is no exception. AudioKit provides a comprehensive toolkit for analysing and generating audio in Swift. It builds on those low-level frameworks (AVFoundation and Accelerate) and wraps them up in an easy-to-understand node-based solution.
What that means is that the amplitude or frequency of the mic input can be captured with the same performance as using Apple’s frameworks directly (bloody quick) but with only a few lines of code.
Experiments
These are some of the earliest experiments we did prior to Paul’s final design. At this point we were unsure exactly what the app would look like or how the data would be displayed, but doing these outside of the app project was a good idea, as it meant we could quickly scrap an experiment and start a new one if it wasn’t working.
Moving to the TPI App
Once we were happy with the approach to take in building the tools and Paul had finalised his design with the client, it was time to create the SPL Meter and RTA within the TPI app project.
My initial approach was to initialise AudioKit in each view separately and then shut it down when the user exited the view or the application. This, however, is not the correct approach at all, and I quickly found that it resulted in crashes that seemed fairly sporadic.
I ended up creating an “AudioEngine” singleton that would handle all of the tools (SPL Meter, RTA, and Sound Generator) within the TPI app. It gave me a central place to manage the nodes, taps, and trackers required for the tools, as well as somewhere to put helper methods for converting to dB-A/dB-C and linear values for the various graphs/gauges.
Here’s a very cut-down example of what was created.
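Since I can only gesture at it here, the following is a heavily simplified sketch of that kind of singleton, written against AudioKit 4-style API (exact signatures vary between AudioKit versions); the structure, property names, and helper method are illustrative rather than the shipped TPI code:

```swift
import AudioKit

// A cut-down sketch of an AudioEngine singleton using AudioKit 4-style
// API. Names and structure are illustrative, not the shipped app's code.
final class AudioEngine {
    static let shared = AudioEngine()

    let mic = AKMicrophone()
    let amplitudeTracker: AKAmplitudeTracker
    let fftTap: AKFFTTap

    private init() {
        amplitudeTracker = AKAmplitudeTracker(mic)
        fftTap = AKFFTTap(mic)

        // Route the chain through a zero-gain booster so the mic input
        // is analysed but never played back through the speaker.
        AudioKit.output = AKBooster(amplitudeTracker, gain: 0)
    }

    func start() throws { try AudioKit.start() }
    func stop() throws { try AudioKit.stop() }

    /// Naïve linear-amplitude-to-dB helper; the real app layers
    /// calibration and dB-A/dB-C weighting on top of this.
    func decibels(fromAmplitude amplitude: Double) -> Double {
        guard amplitude > 0 else { return -180 }
        return 20 * log10(amplitude)
    }
}
```

Because every tool shares the one engine, views only ever start and stop the singleton rather than tearing AudioKit down themselves, which avoids the sporadic crashes described above.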
I knew as soon as I saw the designs that the UI would need to be built from scratch but I still tried to use an open-source graph for the RTA. Within 10 minutes it was clear that it wouldn’t work. I needed it to update rapidly to deliver a smooth 60 fps whereas the graphs I tried expected static data and didn’t concentrate on performance.

The SPL Gauge
I knew that the gauge on the left would be the trickier of the two to create, so I started tackling that one first. I’d never created a circular progress bar before, so I was slightly anxious.
Creating it turned out to be easier than anticipated, and the construction breaks down into two sections: the outer ring and the ticks on the inside. I created the ticks directly in the view’s draw(_ rect: CGRect) method using a UIBezierPath. As I knew the number of ticks required, I could use Pi to stroke at each specific point, repeating the process four times for the small and large ticks in red and grey. As the circle has a section cut out at the bottom, I multiplied each calculated angle by a fraction of 0.86.
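To make the Pi arithmetic concrete, here is the tick-angle calculation in isolation. The tick count of 40 is an assumption for illustration, not the number from Paul’s design:

```swift
import Foundation

// Sketch of the tick-angle maths: the gauge sweeps 0.86 of a full
// circle, leaving a gap at the bottom. tickCount is an assumed value.
let sweepFraction = 0.86
let tickCount = 40

/// Angle in radians for tick `index`, measured from the start of the arc.
func tickAngle(_ index: Int) -> Double {
    // A full circle is 2π; multiplying by 0.86 stops the arc short of a
    // complete revolution, creating the cut-out at the bottom.
    return 2 * Double.pi * sweepFraction * Double(index) / Double(tickCount - 1)
}
```

In the real view, each angle would then be turned into start/end points for a short UIBezierPath stroke at the gauge’s radius.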
To speed up the process I was prototyping this in Swift playgrounds. I highly recommend this approach as it allowed me to check things without needing to compile the whole app and navigate to the correct view every time.
I had initially drawn the outer rings using the same approach, but because of the need for a gradient it became apparent that it would be easier to use some CALayers.
I used a CAShapeLayer to create a full circle for each of the rings, built from a UIBezierPath. It was then simply a case of setting strokeStart and strokeEnd to 0 and 0.86 respectively and stroking it grey for the first ring. The progress ring was similar, but the current progress (i.e. 0.5 for 50%) was multiplied by 0.86 to get the end of the stroke. Alongside this, a CAGradientLayer was used to create the desired design, with the path used as a mask on that layer.
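A minimal sketch of that layer setup, assuming UIKit/Core Animation; the class name, colours, and dimensions are illustrative, and the gap placement is simplified:

```swift
import UIKit

// Sketch of the two-ring construction: a grey background arc and a
// gradient progress arc, both cut short at 0.86 of the full circle.
final class SPLRingView: UIView {
    private let backgroundRing = CAShapeLayer()
    private let progressRing = CAShapeLayer()
    private let gradient = CAGradientLayer()

    /// 0.0...1.0; scaled by 0.86 so the stroke matches the gauge's arc.
    var progress: CGFloat = 0 {
        didSet { progressRing.strokeEnd = max(0, min(progress, 1)) * 0.86 }
    }

    override func layoutSubviews() {
        super.layoutSubviews()

        // One full-circle path shared by both rings, starting at the
        // bottom (π/2 in UIKit's flipped coordinate space).
        let radius = min(bounds.width, bounds.height) / 2 - 10
        let path = UIBezierPath(arcCenter: CGPoint(x: bounds.midX, y: bounds.midY),
                                radius: radius,
                                startAngle: .pi / 2,
                                endAngle: .pi / 2 + 2 * .pi,
                                clockwise: true)

        for ring in [backgroundRing, progressRing] {
            ring.path = path.cgPath
            ring.fillColor = UIColor.clear.cgColor
            ring.lineWidth = 8
            ring.strokeStart = 0
        }

        backgroundRing.strokeColor = UIColor.gray.cgColor
        backgroundRing.strokeEnd = 0.86

        // The progress ring's stroke acts as a mask over the gradient,
        // so only the stroked arc of the gradient is visible.
        progressRing.strokeColor = UIColor.black.cgColor
        gradient.frame = bounds
        gradient.colors = [UIColor.green.cgColor, UIColor.red.cgColor]
        gradient.mask = progressRing

        if backgroundRing.superlayer == nil {
            layer.addSublayer(backgroundRing)
            layer.addSublayer(gradient)
        }
    }
}
```

Using strokeEnd for progress also means updates animate cheaply on the render server rather than requiring a redraw every frame.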
The RTA Chart
After the SPL gauge, this chart was significantly more straightforward. As I said previously, I did try to build it with open-source charts, and tried two or three before giving up and rolling my own.
The chart is built from four core CALayers, with sublayers or paths added to those layers: barLayer, maxLayer, gridLayer, and gradientLayer. I created a render method that I could call whenever I needed the values to update. Originally this was called automatically whenever the data or maxData values were set, using Swift’s fantastic didSet observers, but that caused issues with values changing after a user had paused them, so I opted to call it manually when required.
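As an illustration of the value mapping behind such a render method, here is how a per-band dB value could be turned into a bar height. The -60 dB floor and 0 dB ceiling are assumed values for the sketch, not the app’s actual range:

```swift
import Foundation

// Sketch of mapping an FFT band's dB value onto a bar height for the
// RTA chart. The -60...0 dB range is an assumption for illustration.
let minDB = -60.0
let maxDB = 0.0

/// Fraction of the chart height (0...1) for a value in dB.
func barFraction(forDecibels db: Double) -> Double {
    let clamped = min(max(db, minDB), maxDB)
    return (clamped - minDB) / (maxDB - minDB)
}
```

A render method would compute this fraction for each band, then rebuild the barLayer’s path (and maxLayer’s peak markers) in one pass per frame.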
Let’s Recap
Recapping, there were a number of issues we faced when building the SPL Meter and RTA for TPI.
On the UI side, there were performance issues using pre-made charts, which resulted in visible stuttering when updating the values. I also had to refresh my GCSE maths when it came to Pi!
However, possibly the biggest issue that held up development was the calibration of the tools. Initially I assumed that the value delivered by the device could be converted into dB using a basic 20 * log10(amplitude) calculation, but that was extremely naïve.
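For reference, that naïve conversion looks like the following. The calibrationOffset is purely illustrative of where a measured correction would go, and proper SPL measurement would still need per-band frequency weighting on top:

```swift
import Foundation

// The naïve linear-amplitude-to-dB conversion mentioned above.
// A real SPL meter adds a calibration offset measured against reference
// equipment; 0.0 here is a placeholder, not a meaningful value.
let calibrationOffset = 0.0

func decibels(fromAmplitude amplitude: Double) -> Double {
    // Guard against log10(0); -180 dB is effectively silence.
    guard amplitude > 0 else { return -180 }
    return 20 * log10(amplitude) + calibrationOffset
}
```
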
Over the course of the project I learned about dB-A, dB-B, and dB-C weighting, got to play with audio analysis equipment that costs thousands, and was challenged extensively. This is the best kind of project. There’s nothing more exciting for me than learning something new, or being given a challenge that makes you initially balk and think “whoa, how the hell am I going to do that?”, and then having to solve it. For me, that’s what I love about programming: solving puzzles.
You might also want to check out TPI: Building the iPhone App and TPI: A Design Process.