Simple Modular Classes for some common AudioKit v5 Controls

Bruce Cichowlas
Published in Geek Culture · May 25, 2021

AudioKit is an excellent package for developing audio applications for iOS and related environments. It is comprehensive enough to be overwhelming at times. Sometimes it helps to break it down into small modular classes, which you can then use in your view models or in whatever architecture you are using for Swift or SwiftUI. I won’t describe installing AudioKit here, since that is covered well elsewhere.

Here are some simple classes that I made for using an oscillator, playing back a sample, and recognizing a pitch. You may want to parameterize them further for your own purposes.

I also use strings starting with #@nnn for diagnostic messages and #!nnn for error messages, where nnn is an arbitrary number that makes the originating line easy to find. You may prefer other conventions.
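As a sketch of that convention, here are two hypothetical helpers (names of my own choosing, not part of AudioKit or the classes below) that could wrap the prefixes:

```swift
// Hypothetical helpers for the tagging convention. The numbers are arbitrary
// but unique, which makes a log line easy to trace back to its call site.
func diag(_ tag: Int, _ message: String) {
    print("#@\(tag) \(message)") // diagnostic
}

func fail(_ tag: Int, _ message: String) {
    print("#!\(tag) \(message)") // error
}
```

Searching the project for a tag such as “#!23” then jumps straight to the failing call site.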

Here’s an Oscillator class:

import AudioKit

class Osc: NSObject {
    let engine = AudioEngine()
    let osc = PWMOscillator()

    override init() {
        super.init()
        engine.output = osc
    }

    func tune(_ freq: Float) {
        osc.frequency = freq
        start() // adding this was necessary with PWMOscillator, but not with Oscillator
    }

    func start() {
        osc.start()
        do {
            try engine.start()
        } catch let err {
            print("#@25 engine did not start \(err.localizedDescription)")
        }
    }

    func stop() {
        osc.stop()
        engine.stop()
    }
}
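A sketch of how Osc might be driven from a view model or UI event handler. The MIDI-note conversion below is my own illustration, not part of the class or of AudioKit:

```swift
import Foundation

let osc = Osc()

// Standard equal-temperament conversion: A4 (MIDI note 69) = 440 Hz.
func frequency(forMIDINote note: Int) -> Float {
    440.0 * pow(2.0, Float(note - 69) / 12.0)
}

osc.tune(frequency(forMIDINote: 60)) // middle C; tune() also starts the engine
// ...later, on key release:
osc.stop()
```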

Sample Player class:

import AudioKit

class SamplePlayer: NSObject {
    let engine = AudioEngine()
    let player = AudioPlayer()

    var soundName = "Synth"
    var isPlaying = false
    var isRunning = false

    func setSoundName(_ s: String) {
        soundName = s
    }

    func start() {
        if !isPlaying {
            let reverb = Reverb(player)
            reverb.dryWetMix = 0.25
            engine.output = reverb
            do {
                if !isRunning {
                    try engine.start()
                }
                isRunning = true
                player.start()
                let audioFile = Bundle.main.url(forResource: soundName, withExtension: "mp3")
                if let file = audioFile {
                    do {
                        try player.load(url: file)
                    } catch let err {
                        print("#!23 player problem err: \(err.localizedDescription)")
                    }
                    player.volume = 10.0
                    // reverb is added above; balance the volume against other apps without clipping noticeably
                    player.play() // not sure if it will loop
                    isPlaying = true
                } else {
                    print("#!18 problem with audioFile")
                }
            } catch let err {
                print("#!35 problem with engine err: \(err.localizedDescription)")
            }
        } else {
            print("#@41 was already playing")
        }
    }

    func stop() {
        player.stop()
        isPlaying = false
        // engine.stop()
    }
}
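Usage is a sketch along these lines, assuming a file named Piano.mp3 (a hypothetical resource name of my own) is bundled with the app:

```swift
let samplePlayer = SamplePlayer()

samplePlayer.setSoundName("Piano") // looks for Piano.mp3 in the main bundle
samplePlayer.start()               // starts the engine (once) and plays through the reverb

// ...when the sound should end:
samplePlayer.stop()                // stops the player but leaves the engine running
```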

Pitch Detection class:

import AudioKit

struct PitcherData {
    var pitch: Float = 0.0
    var amplitude: Float = 0.0
    var serial: Int = 0
}

protocol PitcherProtocol {
    var pitcherData: PitcherData { get }
}

class Pitcher: NSObject, ObservableObject {
    @Published var pitcherData = PitcherData()
    var engine = AudioEngine()
    var silence: Fader // keeps the engine running without echoing the mic back out
    var mic: AudioEngine.InputNode

    var engineRunning = false
    var trackerRunning = false

    var tracker: PitchTap!
    var serial = 0

    func update(_ pitch: AUValue, _ amp: AUValue) {
        serial += 1
        pitcherData = PitcherData(pitch: pitch, amplitude: amp, serial: serial)
    }

    override init() {
        guard let input = engine.input else {
            fatalError("#!50 no audio input available")
        }
        mic = input
        silence = Fader(mic, gain: 0)

        super.init()
        tracker = PitchTap(mic) { [weak self] pitch, amp in
            guard let self = self, self.trackerRunning else { return }
            DispatchQueue.main.async {
                self.update(pitch[0], amp[0])
            }
        }
    }

    func start() {
        do {
            if !engineRunning {
                engine.output = silence
                try engine.start()
            }
            engineRunning = true
            if !trackerRunning {
                tracker.start()
                trackerRunning = true
            }
        } catch let ex {
            print("#@83 ex: \(ex.localizedDescription)")
        }
    }

    func stop() {
        trackerRunning = false
    }
}
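Since Pitcher is an ObservableObject that publishes pitcherData, a SwiftUI view can observe it directly. A minimal sketch, assuming SwiftUI; the view and its formatting are my own, not from any particular app:

```swift
import SwiftUI

struct PitchView: View {
    @StateObject private var pitcher = Pitcher()

    var body: some View {
        VStack {
            // pitcherData is @Published, so the view refreshes on each update.
            Text(String(format: "%.1f Hz", pitcher.pitcherData.pitch))
            Text(String(format: "amplitude %.3f", pitcher.pitcherData.amplitude))
        }
        .onAppear { pitcher.start() }
        .onDisappear { pitcher.stop() }
    }
}
```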

Then just instantiate them in your main code to use them.

let osc = Osc()

I’m using Combine with SwiftUI here, particularly with the pitch detector “Pitcher”. I’ve also used a delegate pattern when I’m not using Combine.
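For the non-Combine case, a delegate version might look like this sketch; the protocol name and method signature are my own invention:

```swift
// Hypothetical delegate protocol for pushing pitch updates without Combine.
protocol PitcherDelegate: AnyObject {
    func pitcher(_ pitcher: Pitcher, didUpdate data: PitcherData)
}

// Inside Pitcher, you would add a weak reference to avoid a retain cycle:
//     weak var delegate: PitcherDelegate?
// and in update(_:_:), after setting pitcherData:
//     delegate?.pitcher(self, didUpdate: pitcherData)
```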
