How to Create an Audio Unit Extension from Scratch in Xcode

A step-by-step guide

Borama Apps
May 4, 2019 · 8 min read

There aren’t many resources about Audio Unit Extensions out there: one Apple sample project, some scattered info, and a WWDC talk.

The basic idea of Audio Unit v3 is that, by default, an audio unit runs as a separate process (the Extension Service Process) and communicates with the host application using IPC (interprocess communication). There are pros and cons to this approach: it’s safer, but it adds overhead, around 40 microseconds per call, which can be problematic when working with very low latencies.

For easier development, and for cases in which the IPC overhead is too high, macOS offers the option to run an Audio Unit in-process, provided both the host and the Audio Unit opt in. I’ll use this approach and show how to set it up later.

Our plugin will only change the volume (of course, in real life you wouldn’t need an AU plugin for that; let’s keep it simple!).

We’ll have a host app, an extension, and a framework. The framework will contain the audio unit code, so we can link it into both the extension and the host app for easy debugging during development (the extension’s main binary cannot be loaded into another process).

So let’s create the host application first. This will also be the containing app for the extension.

Create a new macOS Cocoa App and call it AUHost.

Drag an audio file into the project. (I’m using a file called “z.wav.”)

Inside the AUHost group, create AudioPlayer.swift.
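The embedded gist isn’t reproduced here; a minimal sketch of what AudioPlayer.swift could contain, assuming an AVAudioEngine/AVAudioPlayerNode setup and the “z.wav” file from the previous step (names are illustrative):

```swift
import AVFoundation

// Minimal player: an AVAudioEngine with one player node reading the
// bundled "z.wav" file.
class AudioPlayer {
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    var file: AVAudioFile?

    init() {
        engine.attach(playerNode)
    }

    func startPlaying() {
        guard let url = Bundle.main.url(forResource: "z", withExtension: "wav"),
              let file = try? AVAudioFile(forReading: url) else { return }
        self.file = file
        // Connect the player straight to the mixer for now; later the
        // plugin will be spliced into this chain.
        engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)
        try? engine.start()
        playerNode.scheduleFile(file, at: nil, completionHandler: nil)
        playerNode.play()
    }

    func stopPlaying() {
        playerNode.stop()
        engine.stop()
    }
}
```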

Update ViewController.swift to create an AudioPlayer and start playback.

Make sure you can hear your file being played!

Now, let’s create the extension.

Go to File / New / Target…, choose Audio Unit Extension, and set the Product Name (e.g., “VolumePlugin”), the Subtype Code (use “demo”), and the Manufacturer Code (also “demo”).


Now create a shared framework where we’ll put the audio unit plugin’s files. Call it AUFramework.

Go to File / New / Target…


This is our project structure before refactoring the extension and the framework:


Because we are loading the plugin in-process, we need to move all of the extension’s code into the framework and update the extension’s Info.plist.

Move files from VolumePlugin into AUFramework and link them as necessary by updating their target memberships.

AudioUnitViewController.xib should have both the extension and the framework in its Target Membership:


We need to expose VolumePluginAudioUnit.h so we can use it in AudioUnitViewController.swift and outside the framework. To do that, set its Target Membership to AUFramework and make it Public, then import it inside AUFramework.h.

Updated “umbrella” public header, exposing framework and including plugin header.

Remove the bridging header that was created inside the extension, as you can’t use bridging headers inside frameworks.

The refactored project structure looks like this:


The project should still build and play audio files as before.

Because we will run the plugin in-process inside the host, we need to update the audio unit’s Info.plist by adding the key AudioComponentBundle, with its value set to the framework’s bundle identifier, inside the NSExtensionAttributes dictionary.

Inside VolumePlugin, open Info.plist as source code and add:
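The addition might look like this (the framework bundle identifier shown is an assumption; use your framework target’s actual identifier):

```xml
<key>NSExtensionAttributes</key>
<dict>
    <!-- ...existing attributes created by the template... -->
    <key>AudioComponentBundle</key>
    <string>co.borama.AUFramework</string>
</dict>
```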

We are officially ready to start working on the Audio Unit plugin :)

Update AUHost / ViewController.swift

Add a custom view in the storyboard and connect it to the view controller as an outlet: auContainer: NSView!

First, add an Audio Unit view controller to the host app’s view controller.

Inside AUHost/ViewController.swift, import AUFramework, add an audio unit view controller property, and create a function that will load the audio unit’s view.

We are using VolumePlugin.appex, which is our plugin extension.
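The embedded gist isn’t reproduced here; the updated host ViewController might look roughly like this (the appex bundle lookup and names are assumptions):

```swift
import Cocoa
import AUFramework

class ViewController: NSViewController {
    @IBOutlet weak var auContainer: NSView!

    let audioPlayer = AudioPlayer()
    var auViewController: AudioUnitViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        loadAudioUnitView()
        audioPlayer.startPlaying()
    }

    // Load the plugin's view controller using the embedded VolumePlugin.appex
    // bundle and embed its view inside the storyboard container view.
    func loadAudioUnitView() {
        guard let appexURL = Bundle.main.builtInPlugInsURL?
                .appendingPathComponent("VolumePlugin.appex"),
              let appexBundle = Bundle(url: appexURL) else { return }
        let vc = AudioUnitViewController(nibName: "AudioUnitViewController",
                                         bundle: appexBundle)
        auViewController = vc
        vc.view.frame = auContainer.bounds
        auContainer.addSubview(vc.view)
        addChild(vc)
    }
}
```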

Run the app; you should see the “Your AudioUnit UI goes here!” text from AudioUnitViewController.xib.

Update AUFramework / AudioUnitViewController.swift

Now update AudioUnitViewController in AUFramework. Create a horizontal slider in AudioUnitViewController.xib and a corresponding action inside AudioUnitViewController.swift. Make sure to change the slider’s maximum value from 100 to 1; you don’t want to blast your speakers.

Change var audioUnit: AudioUnit? to var audioUnit: VolumePluginAudioUnit?, make it public so we can access it from outside the framework (from AUHost), and add an optional property volumeParam of type AUParameter.

Open AudioUnitViewController.xib, add the horizontal slider, and connect it to AudioUnitViewController.swift.

Create a function connectWithAU(), inside which we’ll connect volumeParam to the corresponding AUParameter from VolumePluginAudioUnit. We look the parameter up using its identifier as the key (for simplicity, I’m not changing the default param1 identifier).

Updated AudioUnitViewController.swift:
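The original embed isn’t shown here; a sketch of how the updated file might look, assuming the outlet and action names below (the template’s createAudioUnit factory method is kept):

```swift
import Cocoa
import CoreAudioKit

public class AudioUnitViewController: AUViewController, AUAudioUnitFactory {
    @IBOutlet weak var volumeSlider: NSSlider!

    public var audioUnit: VolumePluginAudioUnit? {
        didSet {
            DispatchQueue.main.async { self.connectWithAU() }
        }
    }
    var volumeParam: AUParameter?

    public func createAudioUnit(with componentDescription: AudioComponentDescription) throws -> AUAudioUnit {
        audioUnit = try VolumePluginAudioUnit(componentDescription: componentDescription,
                                              options: [])
        return audioUnit!
    }

    // Look up the volume parameter by its identifier
    // (the template default "param1").
    func connectWithAU() {
        volumeParam = audioUnit?.parameterTree?.value(forKey: "param1") as? AUParameter
    }

    @IBAction func volumeChanged(_ sender: NSSlider) {
        volumeParam?.value = sender.floatValue
    }
}
```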

Update AUFramework / VolumePluginAudioUnit.m

This will require some more work than previous updates.

We need to manually add AVFoundation to our AUFramework target, otherwise we’ll get linker errors like:

Undefined symbols for architecture x86_64:
  "_OBJC_CLASS_$_AVAudioFormat", referenced from:
      objc-class-ref in VolumePluginAudioUnit.o
  "_OBJC_CLASS_$_AVAudioPCMBuffer", referenced from:
      objc-class-ref in VolumePluginAudioUnit.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Manually add AVFoundation.framework for AUFramework target

Rename VolumePluginAudioUnit.m to VolumePluginAudioUnit.mm (to be able to use C++).

Create a C++ struct Buffer (inside Buffer.hpp), which will hold the audio samples, and add it as an instance variable _buffer.

Inside VolumePluginAudioUnit.mm, import Buffer:

#import "Buffer.hpp"

and add properties of type AUAudioUnitBus for the input and output buses, and of type AUAudioUnitBusArray for the input and output bus arrays.

Inside initWithComponentDescription, initialize the Buffer and the input and output buses. The template contains comments and stub code that we basically replace with the properties we just created. For example, there’s already a _parameterTree.implementorValueObserver block, inside which we get the value passed by the audio unit’s parameterTree (connected either from the host’s view controller or, by default, from the plugin’s own view controller). In our case it carries the slider value, so let’s update the volume with it:

__block Buffer *buffer = &_buffer;
_parameterTree.implementorValueObserver = ^(AUParameter *param, AUValue value) {
    buffer->volume = value;
};

The updated initWithComponentDescription function.

Next, update the AUAudioUnit overrides to simply return the input and output bus arrays:

- (AUAudioUnitBusArray *)inputBusses {
    return _inputBusArray;
}

- (AUAudioUnitBusArray *)outputBusses {
    return _outputBusArray;
}

Update our Buffer struct inside the allocateRenderResourcesAndReturnError function.

Lastly, update the internalRenderBlock function. This is where we process audio samples in real time; we process them in chunks whose size depends on the buffer size, so it’s near real time.
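Conceptually, the render block just multiplies every incoming sample by the current volume. A pure-Swift illustration of that per-sample operation (the real code lives in the Objective-C++ render block):

```swift
// Core of the volume effect: scale each sample by the volume factor.
func applyVolume(_ samples: [Float], volume: Float) -> [Float] {
    return samples.map { $0 * volume }
}

// For example, halving the volume:
// applyVolume([1.0, -2.0, 0.25], volume: 0.5) == [0.5, -1.0, 0.125]
```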

Back to AUHost

Update AudioPlayer.swift to process audio through our plugin. Add two properties, audioUnit: AUAudioUnit? and audioUnitNode: AVAudioUnit?, and the function selectAudioUnitWithComponentDescription.
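The final file isn’t reproduced here; the instantiation step might look roughly like this sketch (a free function for clarity; in the article it’s a method on AudioPlayer, and the parameter names are assumptions):

```swift
import AVFoundation

// Instantiate the plugin in-process and splice it between the player node
// and the engine's main mixer. `.loadInProcess` requires both the host and
// the audio unit (via its Info.plist) to opt in.
func selectAudioUnit(componentDescription: AudioComponentDescription,
                     engine: AVAudioEngine,
                     playerNode: AVAudioPlayerNode,
                     format: AVAudioFormat,
                     completion: @escaping (AVAudioUnit?) -> Void) {
    AVAudioUnit.instantiate(with: componentDescription,
                            options: .loadInProcess) { avAudioUnit, _ in
        guard let avAudioUnit = avAudioUnit else { completion(nil); return }
        engine.attach(avAudioUnit)
        // player -> plugin -> mixer (replacing the direct player -> mixer link).
        engine.connect(playerNode, to: avAudioUnit, format: format)
        engine.connect(avAudioUnit, to: engine.mainMixerNode, format: format)
        completion(avAudioUnit)
    }
}
```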

From the function startPlaying(), remove the line that connects the engine:

engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)

This connection will now be made when the audio unit is connected or disconnected.

Final update to AudioPlayer.swift

Finally, update AUHost/ViewController.swift to connect the audio unit plugin to the audio player. We need to convert the component subtype and manufacturer codes we used in the beginning into 4-byte codes (the same as in Info.plist).

Call this function inside viewDidLoad() before
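A small helper (illustrative, not from the original article) to turn a four-character string like “demo” into the UInt32 FourCharCode that AudioComponentDescription expects:

```swift
// Convert a 4-character ASCII string (e.g. "demo") into a FourCharCode:
// each byte contributes 8 bits of the resulting big-endian UInt32.
func fourCharCode(_ string: String) -> UInt32 {
    var result: UInt32 = 0
    for byte in string.utf8.prefix(4) {
        result = (result << 8) | UInt32(byte)
    }
    return result
}
```

With this helper, the description’s componentSubType and componentManufacturer would both be fourCharCode("demo"), matching the codes entered when the extension was created.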

That’s almost it! We need to disable sandboxing for the AUHost app in AUHost.entitlements. If we keep sandboxing on, AVAudioUnit.instantiate fails, and the error is:

NSOSStatusErrorDomain Code=-3000 “invalidComponentID”

After turning sandboxing off (leave it on for the extension), it might be necessary to Clean Build Folder. In my case I also had to reopen the project in Xcode.

In the WWDC talk about Audio Unit v3 components, they mention the need to add the Inter-App Audio capability, but that only applies to iOS. If we add it on macOS, the app is instantly killed by the system, and Xcode only reports Finished running without any errors. The system console shows that the error is due to using restricted entitlements.

Let’s test our plugin in GarageBand!

Audio Unit v3 plugins differ from the old .component format in how they are registered with the system: you just run the containing app once (it contains your extension .appex). I don’t know why, but I noticed that sometimes it’s enough to run your project from Xcode, or run the build from the DerivedData folder, BUT sometimes you need to Archive your project, choose Distribute, copy the .app, and run that copy.

Let’s do that: open GarageBand, select our plugin… and it doesn’t work :(


The plugin is registered, which we can verify by running:

$ pluginkit -mv | grep bora
!    co.borama.AUHost.VolumePlugin(1.0) FBCF4FEC-E388-40BC-8731-91F06A835EAD 2019-09-26 01:37:13 +0000 /Users/michal/Desktop/AUHost 2019-09-26 03-36-24/

However, when we run:

auval -a

we get a Cannot open component: 4099 error.

Let’s open our container app and check our extension:

Extension has no binary…
Extension has no binary…

The extension contains no binary; we need to fix that!

We need to add a dummy source file to the extension target; it’s needed for the extension binary to be created, loaded, and linked against the framework bundle.

Create an Objective-C file and add a function to it:

void dummy() {}

Now the extension will contain the binary, so let’s run auval -a again!

This time we get a Cannot open component: 4097 error.

The problem lies in our extension’s Info.plist. When Xcode created the extension for us, it set NSExtensionPrincipalClass to:

VolumePlugin.AudioUnitViewController
But we moved our principal class into the framework. We need to change the property to:

AUFramework.AudioUnitViewController
This is where the AudioUnitViewController is now located.

Archive, copy, and run the container app, then run auval -a yet again! No complaints this time. Restart GarageBand and load the plugin. It seems to work, BUT when we click on it, it doesn’t show our lovely UI :(


We need to go back to AudioUnitViewController.swift inside AUFramework and add an initializer without parameters:
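The snippet isn’t shown in this extraction; presumably it looks something like the following, making sure the nib is loaded from the framework bundle (the nib name is an assumption):

```swift
public init() {
    // Load the UI from the framework bundle, not the appex.
    super.init(nibName: "AudioUnitViewController",
               bundle: Bundle(for: AudioUnitViewController.self))
}

public required init?(coder: NSCoder) {
    super.init(coder: coder)
}
```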

Again, Archive, Distribute, and copy the app, then run the container app. Reopen GarageBand, load some lovely audio, add VolumePlugin, open it up, and it works!

One last minor update, to get rid of this auval warning:

dyld: warning, LC_RPATH @executable_path/../Frameworks in /Users/...../AUFramework.framework/AUFramework being ignored in restricted program because of @executable_path

We need to change AUFramework’s install path from the default “@executable_path” to “@loader_path” inside Build Settings / Linking.


Now this is really it!

One last note: it’s good to add lots of NSLog calls with your own tag and keep the console open, filtered by that tag.

Also, to validate just our component (sometimes necessary to clean things up), you can run auval -v x x x, substituting the type, subtype, and manufacturer codes.
