How to Create an Audio Unit Extension from Scratch in Xcode

A step-by-step guide


There aren’t many resources on Audio Unit Extensions out there. There’s one Apple sample project, some documentation, and a WWDC talk about it:

https://developer.apple.com/videos/play/wwdc2015/508/

The basic idea behind Audio Unit V3 is that, by default, the audio unit runs as a separate process (the Extension Service Process) and communicates with the host application using IPC (interprocess communication). There are pros and cons to this approach: it’s safer, but it adds overhead of roughly 40 microseconds per call, which can be problematic when working with very low latencies.

For easier development, and for cases in which the IPC overhead is too high, on macOS there’s an option to run the Audio Unit in-process, provided both the host and the Audio Unit opt in. I’ll use this approach and show how to do it later.

Our plugin will only change the volume (of course, in real life you wouldn’t need an AU plugin for that; let’s keep it simple!).

We’ll have a host app, an extension, and a framework. The framework will contain the audio unit code, so we can link it against both the extension and the host app for easy debugging during development (the extension’s main binary cannot be loaded into another process).

So let’s create the host application first. This will also be the containing app for the extension.

Create a new macOS Cocoa App and call it AUHost.

Drag an audio file into the project. (I’m using a file called “z.wav.”)

Inside the AUHost group, create AudioPlayer.swift.

Update ViewController.swift with the following:

Make sure you can hear your file being played!

Now, let’s create the extension.

Go to File / New / Target…, choose Audio Unit Extension, and set the Product Name (e.g., “VolumePlugin”), the Subtype Code (use “demo”), and the Manufacturer Code (also use “demo”).

Now create a shared framework where we’ll put the audio unit plugin’s files.

Go to File / New / Target…

This is our project structure before refactoring extension and framework:

Because we are loading the plugin in-process, we need to move all of the extension’s code into the framework and update the extension’s Info.plist.

Move files from VolumePlugin into AUFramework and link them as necessary by updating their target memberships.

AudioUnitViewController.xib should be a member of both the extension and the framework targets, like so:

We need to expose VolumePluginAudioUnit.h so we can use it in AudioUnitViewController.swift and outside the framework. To do that, set its Target Membership to AUFramework, mark it Public, and import it inside AUFramework.h.

The updated “umbrella” public header, exposing the framework and including the plugin header:

Remove the bridging header that was created inside the extension; you can’t use bridging headers inside frameworks.

The refactored project structure looks like this:

The project should build and play audio files as it did before.

Because we will run the plugin in-process inside the host, we need to update the audio unit’s Info.plist by adding the key AudioComponentBundle, with its value set to the framework’s bundle identifier, inside the NSExtensionAttributes dictionary.

Inside VolumePlugin, open Info.plist as source code and add:
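For reference, the resulting NSExtensionAttributes entry might look something like this (com.example.AUFramework is a placeholder; use your AUFramework target’s actual bundle identifier):

```xml
<key>NSExtensionAttributes</key>
<dict>
    <!-- existing entries such as AudioComponents stay as they are -->
    <key>AudioComponentBundle</key>
    <string>com.example.AUFramework</string>
</dict>
```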

We are officially ready to start working on the Audio Unit plugin :)

Update AUHost / ViewController.swift

Add a custom view in the storyboard and connect it to the view controller with an outlet: auContainer: NSView!

First, add an Audio Unit view controller to the host app’s view controller.

Inside AUHost/ViewController.swift, import AUFramework, add an audio unit view controller property, and create a function that loads the audio unit’s view. The updated AUHost/ViewController:

We are using VolumePlugin.appex, which is our plugin extension.

Run the app; you should see the “Your AudioUnit UI goes here!” from AudioUnitViewController.xib.

Update AUFramework / AudioUnitViewController.swift

Now update AudioUnitViewController. Create a horizontal slider in AudioUnitViewController.xib and a corresponding action inside AudioUnitViewController.swift. Make sure to change the slider’s maximum value from 100 to 1. You don’t want to blast your speakers.

Change var audioUnit: AUAudioUnit? to var audioUnit: VolumePluginAudioUnit?, make it public so we can access it from outside the framework (from AUHost), and add an optional property volumeParam of type AUParameter.

Open AudioUnitViewController.xib and add horizontal slider; connect it with AudioUnitViewController.swift.

Create a function connectWithAU(), inside which we’ll connect volumeParam to the AUParameter from VolumePluginAudioUnit. We need to use the parameter’s identifier as the key (for simplicity, I’m not changing the default param1 identifier).

Updated AudioUnitViewController.swift:

Update AUFramework / VolumePluginAudioUnit.m

This will require some more work than previous updates.

We need to manually add AVFoundation to our AUFramework target; otherwise we’ll get linker errors like:

Undefined symbols for architecture x86_64:
  "_OBJC_CLASS_$_AVAudioFormat", referenced from:
      objc-class-ref in VolumePluginAudioUnit.o
  "_OBJC_CLASS_$_AVAudioPCMBuffer", referenced from:
      objc-class-ref in VolumePluginAudioUnit.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Manually add AVFoundation.framework to the AUFramework target.

Rename VolumePluginAudioUnit.m to VolumePluginAudioUnit.mm (so we can use C++).

Create a C++ struct Buffer (inside Buffer.hpp), which will hold audio samples, and add it as an instance variable _buffer.
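A minimal sketch of what Buffer.hpp could contain at this stage (the volume member is the only part the rest of this tutorial relies on; the name and layout are otherwise assumptions, not Apple template code):

```cpp
// Buffer.hpp (sketch): shared state between the parameter observer,
// which writes the volume, and the render block, which reads it.
struct Buffer {
    float volume = 1.0f;  // linear gain set from the slider (0.0 to 1.0)
};
```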

Inside VolumePluginAudioUnit.mm import Buffer:

#import "Buffer.hpp"

and add properties for the input and output buses (AUAudioUnitBus) and the input and output bus arrays (AUAudioUnitBusArray).

Inside initWithComponentDescription, initialize the Buffer and the input and output buses. The template contains comments and placeholder code that we basically need to replace with the properties we created. For example, there’s already a _parameterTree.implementorValueObserver block through which we receive the value passed by the audio unit’s parameterTree (connected either from the host’s view controller or, by default, from the plugin’s own view controller). In our case it carries the slider value, so let’s update the volume with it:

__block Buffer *buffer = &_buffer;
_parameterTree.implementorValueObserver = ^(AUParameter *param, AUValue value) {
    buffer->volume = value;
};
The updated initWithComponentDescription:

Next, update the AUAudioUnit overrides, simply returning the input and output bus arrays.

- (AUAudioUnitBusArray *)inputBusses {
    return _inputBusArray;
}

- (AUAudioUnitBusArray *)outputBusses {
    return _outputBusArray;
}

Update our Buffer struct inside the allocateRenderResourcesAndReturnError function.
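As a rough sketch of what that update might involve: once render resources are allocated we know the channel count and the maximum frames per render cycle, so the struct can size its storage accordingly (the prepare method and member names here are mine, not from the sample code):

```cpp
#include <vector>

// Sketch: extend the Buffer struct so allocateRenderResourcesAndReturnError
// can size per-channel sample storage once the stream format is known.
struct Buffer {
    float volume = 1.0f;
    std::vector<std::vector<float>> channels;  // per-channel sample storage

    void prepare(int channelCount, int maxFrames) {
        channels.assign(channelCount,
                        std::vector<float>(maxFrames, 0.0f));
    }
};
```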

Lastly, update the internalRenderBlock function. This is where we process the audio samples in (almost) real time: we process a chunk of samples at once, with the chunk size depending on the size of the buffer.
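Conceptually, the inner loop of the render block reduces to multiplying every incoming sample by the current volume. A plain C++ sketch of that per-channel loop (the real code uses the AURenderBlock signature and an AudioBufferList, not raw arrays):

```cpp
// Sketch: copy frameCount samples from in to out, applying the volume
// last written by the parameter observer.
void applyVolume(const float *in, float *out, int frameCount, float volume) {
    for (int i = 0; i < frameCount; ++i) {
        out[i] = in[i] * volume;
    }
}
```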

Back to AUHost

Update AudioPlayer.swift to process audio through our plugin. Add two properties, audioUnit: AUAudioUnit? and audioUnitNode: AVAudioUnit?, and the function selectAudioUnitWithComponentDescription.

From the function startPlaying(), remove the engine connection:

engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)

This connection will instead be made when the audio unit is connected or disconnected.

Final update to AudioPlayer.swift

Finally, update AUHost/ViewController.swift to connect the audio unit plugin to the audio player. We need to convert the component subtype and component manufacturer strings we used at the beginning into 4-byte codes (the same codes as in Info.plist). I’m using https://codebeautify.org/string-hex-converter.
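If you’d rather skip the online converter, the conversion is just packing the four ASCII characters into a big-endian 32-bit integer. A small C++ sketch (the function name is mine):

```cpp
#include <cstdint>
#include <string>

// Pack a 4-character code such as "demo" into the UInt32 used for
// componentSubType / componentManufacturer ('d','e','m','o' -> 0x64656D6F).
uint32_t fourCharCode(const std::string &s) {
    uint32_t code = 0;
    for (char c : s) {
        code = (code << 8) | static_cast<uint8_t>(c);
    }
    return code;
}
```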

Call this function inside viewDidLoad() before audioPlayer.play().

That’s almost it! We need to disable sandboxing for the AUHost app in AUHost.entitlements. If we keep sandboxing on, AVAudioUnit.instantiate fails and the error is:

NSOSStatusErrorDomain Code=-3000 “invalidComponentID”

After turning off sandboxing, it might be necessary to Clean Build Folder; in my case I also had to reopen the project in Xcode.

In the WWDC talk about Audio Unit V3 components, they mention the need to add the Inter-App Audio capability, but that only applies to iOS. If we add it on macOS, the app will be instantly killed by the system, and Xcode will only tell us “Finished running” without any errors. The system console reveals the cause: the app is using restricted entitlements.

Resources