Tuesday, July 29, 2008

Cocoa AudioUnits

Hello,

In order to be useful, an audio processing graph must be made up of processing nodes. Without processing tools such as delays, equalizers, and bandpass filters, my audio processing graph is merely an audio I/O graph. So if I want to develop and test my system, I need to have some of these tools available. There is a wide assortment of processing libraries available online, including SoundObjects, STK, and CLAM. These libraries are mostly GPL licensed, but at least I can use them during my internal testing.

That being said, my computer already contains a wide assortment of such tools packaged as plugins, AudioUnit plugins to be precise. So, in order to test my engine and make it useful right out of the gate (on OS X), I have decided to include support for loading AudioUnits as nodes in my processing graph.

You can look on Apple's developer network for a broad and detailed explanation of what AudioUnits are. If you have experience with VST, MAS, or even DirectX plugins, then you already have the right idea.

AudioUnits can be used programmatically by means of the AudioUnit framework and the AudioToolbox framework. While the AudioUnit framework provides the bare essentials needed to instantiate and use AudioUnits (AU's for short; a sketch of those raw C calls appears below), AudioToolbox builds upon this core and adds some very useful utilities, including the ability to build AU graphs... Hey! Wait a minute!? I am trying to build an audio processing graph engine! No worries, the AU graph system and my own engine have very different goals, and as such, the overlap is minimal. The differences are as follows:
  • Portability: Obviously AU's are only available on OS X.
  • Realtime insulation: While the AU graph system does support connection management from a non-realtime context, its guarantees under tighter realtime constraints are not clearly defined.
  • Extensibility: The only way to extend the capabilities of an AU graph is to add more AU's.
The most important difference to me is the second, realtime insulation. To be honest, I am quite happy to limit myself to developing on OS X, as it is by far the best platform I have worked on. Even so, I find enough justification to develop my own audio processing framework.
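
To give a feel for those bare essentials, here is roughly what instantiating just the default output unit looks like with the plain C API (the Component Manager flavor); error checking omitted, and the render callback setup left out entirely:

#include <CoreServices/CoreServices.h>
#include <AudioUnit/AudioUnit.h>

// Describe the unit we want: Apple's default output unit.
ComponentDescription desc = { 0 };
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_DefaultOutput;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

// Find a matching component, open an instance, initialize it.
Component comp = FindNextComponent(NULL, &desc);
AudioUnit unit = NULL;
OpenAComponent(comp, &unit);
AudioUnitInitialize(unit);

// ... render audio ...

// Tear down in reverse order.
AudioUnitUninitialize(unit);
CloseComponent(unit);

Multiply this by every unit, connection, and property in a graph, and the appeal of a tidier wrapper becomes clear.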

Cocoa AudioUnits

I am a little surprised that AudioUnits have not been available as a Cocoa library. Perhaps there is little demand: many pro audio applications have their roots in pre-OS X systems, or are developed on a portability layer. Having been frustrated using the C interface to AudioUnits, I came up with a thin Cocoa wrapper, which allows me to treat AudioUnits as first-class Cocoa objects. This is very nice when you want to store your AU's in an NSArray, or archive them, or display them in an NSTableView, etc. The wrapper is thin because it tries to model the existing C API in Objective-C, with a few convenience methods thrown in. In addition to the ArkAudioUnit class, I have included ArkAudioUnitEditor and ArkAudioUnitManager. The former provides support for displaying AU views using one line of code, while the latter makes it easy to discover and instantiate AU's by category or human readable string.
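
As a small illustration of the first-class-objects point, using the mixer, reverb, and output units from the snippet below (and assuming, per the archiving remark above, that ArkAudioUnit adopts NSCoding):

// Any Cocoa collection holds the wrapper objects directly.
NSArray * chain = [NSArray arrayWithObjects:mixer, reverb, output, nil];

// Freeze the chain to data. This assumes ArkAudioUnit
// conforms to NSCoding, per the archiving support mentioned above.
NSData * frozen = [NSKeyedArchiver archivedDataWithRootObject:chain];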

The Cocoa AudioUnit source code may be found at http://code.google.com/p/audiodeveloper/ and is BSD licensed. There is no documentation yet (you may recall that one of my goals for this project was comprehensive documentation!) but the header files are fairly neat, and the interface so closely mimics the AudioUnit C API that developers should have little trouble understanding it.

Here is a snippet to get you started. Error checking is omitted for brevity.

#import <AudioUnit/AudioUnit.h> // for the kAudioUnit... constants
// ... plus the ArkAudioUnit headers from the project.

// Declared elsewhere...
// ArkAudioUnit * output, * mixer, * reverb;

// Set up the manager and discover the installed AU's.
[ArkAudioUnitManager createDefaultAUGraph];
ArkAudioUnitManager * auManager = [ArkAudioUnitManager defaultManager];
[auManager createAllAudioUnitLists];

// The default output unit.
output = [auManager createDefaultOutput];
[output initialize];

// The first mixer the manager discovered.
NSString * defaultMixerName = [[auManager mixerNames] objectAtIndex:0];
mixer = [auManager createMixerWithName:defaultMixerName];
[mixer initialize];

// Apple's matrix reverb, requested directly by its
// type/subtype/manufacturer triple.
reverb = [ArkAudioUnit audioUnitWithType:kAudioUnitType_Effect
                                 subType:kAudioUnitSubType_MatrixReverb
                            manufacturer:kAudioUnitManufacturer_Apple];
[reverb initialize];
[reverb setBypassing:YES]; // Start with the reverb bypassed.

// Wire the chain: mixer -> reverb -> output.
[output connectInput:0 fromAudioUnit:reverb element:0];
[reverb connectInput:0 fromAudioUnit:mixer element:0];

// Hold on to the units past this scope.
[output retain];
[mixer retain];
[reverb retain];

// ...
// Later, in response to an action, this one line will
// display a window with the AudioUnit view inside.
[reverb showEditor];

Not bad, huh? I have used this code for about four years in a few different projects. No major bugs have popped up, but as the BSD license says, don't blame me if things go awry.
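
One housekeeping note: the snippet retains all three units, so under Cocoa's manual reference counting rules they need matching releases, typically in the owning object's dealloc. Whether releasing an ArkAudioUnit also uninitializes and closes the underlying AU is a question for the headers; the sketch below only balances the retains.

- (void)dealloc
{
    // Balance the retains from the snippet above.
    [output release];
    [mixer release];
    [reverb release];
    [super dealloc];
}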

Anyways, that's enough for now. I suppose I will revise and expand at some point; until then, goodbye!
