How I Built an Audio Player App — Part 2

** Precursor — Part 1
** Next Post — Part 3

Intro

I recently worked on a small music player app for churches, where they can play, control, and modulate the different components of a song: drums, vocals, bass, guitar, and so on. They can also change the pitch as well as the BPM (beats per minute, i.e. the playback rate). This goes several steps beyond a simple audio player app where you only play one song, with no control over its components, no BPM change, and no pitch change.

Continuing on, and realising AVAudioPlayer is not enough

After successfully changing the BPM (rate) of the song, I tried changing the pitch. After digging through the AVAudioPlayer class files, Apple’s documentation, and Stack Overflow, I realised you can’t change the pitch of a song using AVAudioPlayer. You must use AVAudioEngine to do that! Urgh! Now I have to completely rewrite my AudioManager.
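For context, here is roughly what the Part 1 approach could already do: AVAudioPlayer does let you adjust the playback rate, but it simply has no pitch API. The file path in this sketch is a placeholder.

import AVFoundation

// Roughly what Part 1 could already do: rate is adjustable on AVAudioPlayer,
// but there is no pitch property on it at all.
let songURL = URL(fileURLWithPath: "/path/to/song.mp3") // placeholder path
if let avPlayer = try? AVAudioPlayer(contentsOf: songURL) {
    avPlayer.enableRate = true
    avPlayer.rate = 1.2      // play 20% faster
    avPlayer.play()
    // avPlayer.pitch = ...  // no such API, hence the move to AVAudioEngine
}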

Sidenote rant: There are two types of clients: those who understand that these things happen and that you make mistakes, and those who don’t get it. @clients, if you are the former, I would most likely enjoy working with you. If you are the latter, move along.

Moving on..

The benefit of using AVAudioEngine is that I now have much more control over the audio and its components. I can modulate rate and pitch, play/pause individual components, and so on.

Now, I’m initializing the AudioEngine so that I can play just one song. I would also like to be able to change its pitch, so I’ll take care of that here as well:

import AVFoundation

class AudioManager {

    var engine = AVAudioEngine()
    fileprivate var playerNode = AVAudioPlayerNode()
    fileprivate var timePitch = AVAudioUnitTimePitch()

    init() {
        // A common format for the connections in the node graph
        let format = engine.inputNode.inputFormat(forBus: 0)

        // Attach the nodes to the engine before connecting them
        engine.attach(playerNode)
        engine.attach(timePitch)

        // playerNode -> timePitch -> outputNode
        engine.connect(playerNode,
                       to: timePitch,
                       format: format)
        engine.connect(timePitch,
                       to: engine.outputNode,
                       format: format)

        do {
            try engine.start()
        } catch {
            print("couldn't start the engine")
        }
    }
    ...
}

The way AVAudioEngine works is that you attach “nodes” to it. Each node is responsible for its own task: one can play a song, another can change the time/pitch, and so on. In this case, I’ve connected playerNode to timePitch, and timePitch to outputNode. So now, if I change the pitch on the timePitch node, the engine will output the song with the changed pitch. Here’s how we do it:

// pitch is measured in cents; rate is a playback-speed multiplier (1.0 = normal)
func changePitch(value: Float) {
    timePitch.pitch = value
}

func changeRate(value: Float) {
    timePitch.rate = value
}
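For reference, AVAudioUnitTimePitch measures pitch in cents (100 cents is one semitone, and the accepted range is -2400 to 2400) and rate as a multiplier (default 1.0, range 1/32 to 32). So a caller might do something like this; the audioManager instance name is just for illustration:

let audioManager = AudioManager()

// Shift the key up two semitones and play at 90% of the original tempo
audioManager.changePitch(value: 200)
audioManager.changeRate(value: 0.9)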

We’ve got it playing audio now, and we can also change the rate and pitch. But there is one problem. Earlier we conformed to AudioPlayerDelegate, which gave us the duration, the current playback time, and the state of the audio. There is no such thing in AVAudioEngine. Well, that is just sad! So with AVAudioPlayer we can show the correct playback time, duration, etc., but can’t change the pitch. To solve this, we are going to play the audio in both AVAudioPlayer and AVAudioEngine, with the AVAudioPlayer kept on mute.

To play a song now, we just have to do this — play both at the same time:

fileprivate var player = AudioPlayer()

func play(atURL songURL: URL) {
    // The muted AudioPlayer instance mirrors playback so we still get
    // duration / progress / state callbacks from its delegate.
    let audioItem = AudioItem(mediumQualitySoundURL: songURL)!
    let audioFile = try! AVAudioFile(forReading: songURL)

    // Schedule the file on the engine's player node and start both at the same time
    playerNode.scheduleFile(audioFile,
                            at: nil,
                            completionHandler: nil)
    playerNode.play()
    player.play(item: audioItem)
}
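Calling it is then a one-liner. Here is a sketch that assumes the song ships in the app bundle; the file name is hypothetical:

let audioManager = AudioManager()

if let songURL = Bundle.main.url(forResource: "amazing-grace", withExtension: "mp3") {
    audioManager.play(atURL: songURL)
}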

Tracking progress

I created a wrapper protocol over the player in our AudioManager. It essentially passes on whatever information the AudioManager receives as the AudioPlayer’s delegate.

protocol AudioManagerDelegate: NSObjectProtocol {
    func audioManager(_ audioManager: AudioManager,
                      didUpdateProgressionTo time: TimeInterval,
                      percentageRead: Float)
    func audioManager(_ audioManager: AudioManager,
                      didFindDuration duration: TimeInterval)
    func audioManager(_ audioManager: AudioManager,
                      didChangeStateFrom from: AudioPlayerState,
                      to state: AudioPlayerState)
}

..and this is how we are calling these methods:

// `delegate` lives on AudioManager itself
weak var delegate: AudioManagerDelegate?

extension AudioManager: AudioPlayerDelegate {

    func audioPlayer(_ audioPlayer: AudioPlayer,
                     didUpdateProgressionTo time: TimeInterval,
                     percentageRead: Float) {
        songPlayedSeconds = time
        delegate?.audioManager(self,
                               didUpdateProgressionTo: time,
                               percentageRead: percentageRead)
    }

    func audioPlayer(_ audioPlayer: AudioPlayer,
                     didFindDuration duration: TimeInterval,
                     for item: AudioItem) {
        songDurationSeconds = duration
        delegate?.audioManager(self, didFindDuration: duration)
    }

    func audioPlayer(_ audioPlayer: AudioPlayer,
                     didChangeStateFrom from: AudioPlayerState,
                     to state: AudioPlayerState) {
        delegate?.audioManager(self,
                               didChangeStateFrom: from,
                               to: state)
    }
}
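On the consuming side, a view controller can conform to AudioManagerDelegate and update its UI from these callbacks. Here is a minimal sketch with hypothetical outlets, assuming percentageRead is reported in the 0–100 range:

import UIKit

class PlayerViewController: UIViewController, AudioManagerDelegate {

    // Hypothetical UI elements, for illustration only
    @IBOutlet weak var progressSlider: UISlider!
    @IBOutlet weak var durationLabel: UILabel!

    let audioManager = AudioManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        audioManager.delegate = self
    }

    func audioManager(_ audioManager: AudioManager,
                      didUpdateProgressionTo time: TimeInterval,
                      percentageRead: Float) {
        // assuming percentageRead is 0–100
        progressSlider.value = percentageRead / 100
    }

    func audioManager(_ audioManager: AudioManager,
                      didFindDuration duration: TimeInterval) {
        durationLabel.text = "\(Int(duration)) s"
    }

    func audioManager(_ audioManager: AudioManager,
                      didChangeStateFrom from: AudioPlayerState,
                      to state: AudioPlayerState) {
        // e.g. toggle the play/pause button based on the new state
    }
}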

Wrapping up:

We have rewritten our AudioManager and got AVAudioEngine working. We can now change the pitch, change the rate, and track the current playback time of the song. We are doing well so far.

Next up, we want to have control over the different components of the song. That will be handled in the next part.


Tanmay is a full-time freelance iOS developer. You can learn more about him here

If you liked this, please go ahead and recommend it. 💚