Inter-App Audio. The Audio Unit API allows for Inter-App Audio on iOS: audio streams (and MIDI commands) can be sent between apps. For example, an app can provide an audio effect or filter. Another app can then send its audio to the first app to have the effect applied, and the filtered audio is sent back to the originating app in real time.
Version 3 audio unit components (AUv3) are registered by packaging the component into an app extension containing an AudioComponents Info.plist entry. The principal class must conform to the AUAudioUnitFactory protocol, which will typically instantiate an AUAudioUnit subclass.
Host sync has been available for ages on the desktop, and has been part of the Inter-App Audio technology since it arrived in iOS 7. It's also how the new Audio Unit extensions introduced in iOS 9 sync to their hosts. The code examples in this document show the Inter-App Audio API for host sync, which is very similar to the new AUv3 API used by Audio Unit extensions.
Ableton Link is a new technology by Ableton that allows sharing a common tempo and beat grid between apps and devices. It synchronizes beat, phase and tempo but leaves the transport controls (play/stop/rewind) up to each individual app. Each app needs to be started and stopped on its own, but will play in sync with the other connected apps once started.
From a user's point of view, these two technologies serve quite different purposes. While Ableton Link is great for live jamming, especially involving multiple users and devices, it lacks a global transport control. There is also no master-slave relationship with Link. Host sync, on the other hand, is a master (host) to slave (IAA node app or AU plugin) relationship, and also carries transport state. IAA nodes can also remote-control the transport of the host, and switch to the host app or even to any of the other IAA node apps hosted by the same host.
Implementing the basic sync mechanism for these two is actually quite similar, however. The idea is that in each render buffer, you check where you should be at the end of this buffer, or more precisely where you should be at the first sample of the next buffer.
At the top of your render callback, you call ABLLinkBeatTimeAtHostTime(), passing it the mHostTime from your render callback's timestamp argument. But first, you must add the buffer duration in host ticks as well as the device output latency, plus any additional delay you're adding to your audio.
This way, you get the precise beat time for the actual time corresponding to the end of the buffer (or actually the start of the next buffer).
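As a rough sketch of that calculation, assuming the older LinkKit API named above (ABLLinkBeatTimeAtHostTime() taking the Link reference and a host time; the exact signature and the ABLLink.h header name should be checked against your LinkKit version). linkRef, sampleRate, outputLatencySeconds (for example AVAudioSession's outputLatency) and extraLatencySeconds are illustrative:

#import <AudioToolbox/AudioToolbox.h>
#include <mach/mach_time.h>
#include "ABLLink.h"   // LinkKit header; assumed name

// Beat time at the first sample of the next buffer, per the recipe above.
static Float64 BeatTimeAtBufferEnd(ABLLinkRef linkRef,
                                   const AudioTimeStamp *timestamp,
                                   UInt32 inNumberFrames,
                                   Float64 sampleRate,
                                   Float64 outputLatencySeconds,
                                   Float64 extraLatencySeconds)
{
    // Conversion factor from seconds to host ticks (mach_absolute_time units).
    static mach_timebase_info_data_t timebase;
    if (timebase.denom == 0) mach_timebase_info(&timebase);
    Float64 secondsToTicks = 1.0e9 * (Float64)timebase.denom / (Float64)timebase.numer;

    Float64 bufferSeconds = (Float64)inNumberFrames / sampleRate;
    UInt64 hostTimeAtBufferEnd = timestamp->mHostTime +
        (UInt64)((bufferSeconds + outputLatencySeconds + extraLatencySeconds) * secondsToTicks);

    // Where should we be at the start of the next buffer?
    return ABLLinkBeatTimeAtHostTime(linkRef, hostTimeAtBufferEnd);
}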
The beat time will not advance in exact equal increments, even when the tempo is fixed. This is because of small adjustments Link does to keep all the peers in sync.
If you need to, you can calculate the exact tempo for the current buffer by checking how many beats fit in the buffer, using the ABLLinkBpmInRange() function from ABLLinkUtils.h.
Link also provides functions to get and set the session tempo, setting the sync quantum, and more. See ableton.github.io/linkkit for full API documentation.
Host sync is done through a set of callbacks contained in a HostCallbackInfo struct, which you retrieve from the host when connected and cache in a variable that is safe to access from the audio thread. The two important callbacks in the HostCallbackInfo struct are beatAndTempoProc and transportStateProc2.
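For example, something along these lines (a sketch assuming outputUnit is your app's remote IO unit and that you call this on the main thread once the IAA connection is established; kAudioUnitProperty_HostCallbacks is the property used to obtain the struct):

#import <AudioToolbox/AudioToolbox.h>

static HostCallbackInfo hostCallbacks;   // cached copy, read from the audio thread
static BOOL hasHostCallbacks = NO;

static void CacheHostCallbacks(AudioUnit outputUnit)
{
    HostCallbackInfo info = {0};
    UInt32 size = sizeof(info);
    if (AudioUnitGetProperty(outputUnit,
                             kAudioUnitProperty_HostCallbacks,
                             kAudioUnitScope_Global, 0,
                             &info, &size) == noErr) {
        hostCallbacks = info;
        hasHostCallbacks = YES;
    }
}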
Beat time clock
At the start of each render cycle, call the beatAndTempoProc() callback. You must call this in the render callback (audio thread), not the main thread!
Take the beat time it gives and add the number of beats that fit in the current buffer according to the tempo you got from the callback. This gives you the expected beat time at the end of the buffer.
The given tempo should be used as an indication of where the beat time will be at the end of the buffer; there's no guarantee that this will be precisely correct. At the next render cycle, you know where you are and again estimate where you should be at the end of the buffer.
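Inside the render callback, that boils down to something like this sketch (hostCallbacks is the struct cached earlier; sampleRate and inNumberFrames are assumed to come from the surrounding render code):

// Ask the host where we are, then estimate the beat time at the end of the buffer.
Float64 currentBeat = 0.0, currentTempo = 0.0;
if (hostCallbacks.beatAndTempoProc &&
    hostCallbacks.beatAndTempoProc(hostCallbacks.hostUserData,
                                   &currentBeat, &currentTempo) == noErr) {
    // Beats that fit in this buffer at the reported tempo (beats per minute).
    Float64 beatsInBuffer = currentTempo * (Float64)inNumberFrames / (sampleRate * 60.0);
    Float64 beatAtBufferEnd = currentBeat + beatsInBuffer;
    // Render towards beatAtBufferEnd, and expect some jitter at the next cycle.
}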
Note that the tempo might fluctuate and be different for each render cycle, or it might be stable and smoothed. The fluctuations might come because the host is in turn syncing to something else, like Ableton Link or MIDI clock.
My mixer and host app AUM sends, as of version 1.0, a fluctuating tempo that is the precise tempo for the current buffer, but since many IAA node apps might not be prepared for this, it will instead send the stable session tempo beginning with version 1.1.
However, even with a stable tempo, nodes must be able to handle jitter in the beat time. The guessed beat time for end of buffer will not always align with the reported beat time you get during the next render cycle. More on this later in this document.
Transport state
Host sync has transport state that lets the node/plugin know if the transport is playing and if it’s recording, as well as some more esoteric information that we won’t focus on here.
To get the current transport state, you use the corresponding callback in the same way as the beat and tempo callback: call the transportStateProc2() function at the top of your render callback, and detect if the play state has changed. You pass NULL for any values that you are not interested in. Start playing your audio in the same buffer in which the host changes state to playing.
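A sketch of that play-state check, with wasPlaying kept as an instance variable (or static) that persists between render cycles:

Boolean isPlaying = false;
if (hostCallbacks.transportStateProc2 &&
    hostCallbacks.transportStateProc2(hostCallbacks.hostUserData,
                                      &isPlaying, NULL, NULL,
                                      NULL, NULL, NULL, NULL) == noErr) {
    if (isPlaying && !wasPlaying) {
        // Host started playing: begin playback in this very buffer.
    } else if (!isPlaying && wasPlaying) {
        // Host stopped: stop your own playback as well.
    }
    wasPlaying = isPlaying;
}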
Starting at negative beat time
When the buffer comes where state changes to playing, the beat time for the start of the buffer might very well be negative! This could happen if the host has pre-roll, or if it’s syncing to Link and waiting for sync quantum phase. In those situations, the exact beat time 0 will not be at the start of the buffer, but somewhere in the middle of it.
A node or plugin must be careful and calculate the frame offset where the first beat actually is within the buffer.
There might also be multiple buffers with negative beat time before the buffer containing the start-beat frame arrives. So it's important that a node treats negative beat time as meaning "just sit and wait until time 0".
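As an illustration, the frame offset can be computed roughly like this (beatAtBufferStart and tempo come from the beat and tempo callback as above; all names are illustrative):

if (beatAtBufferStart < 0.0) {
    Float64 framesPerBeat = sampleRate * 60.0 / tempo;
    Float64 framesUntilBeatZero = -beatAtBufferStart * framesPerBeat;
    if (framesUntilBeatZero < (Float64)inNumberFrames) {
        UInt32 startFrame = (UInt32)framesUntilBeatZero;
        // Output silence up to startFrame, then start playing from beat 0 there.
    } else {
        // Beat 0 is not in this buffer yet: output silence and keep waiting.
    }
}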
Transport panel
It's nice for an IAA node app to display a transport control panel and the current time for its host. To get the current state, just use the transportStateProc2 callback again, now passing in some more of the variables that we are interested in:
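For example, a sketch that also requests the recording flag and the current sample position on the host's timeline (hostSampleRate is an illustrative variable holding the host's sample rate):

Boolean isPlaying = false, isRecording = false;
Float64 sampleInTimeline = 0.0;
if (hostCallbacks.transportStateProc2 &&
    hostCallbacks.transportStateProc2(hostCallbacks.hostUserData,
                                      &isPlaying, &isRecording, NULL,
                                      &sampleInTimeline, NULL, NULL, NULL) == noErr) {
    // Convert the sample position to seconds using the host's sample rate.
    double seconds = sampleInTimeline / hostSampleRate;
    // Update the play/record indicators and the time display with these values.
}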
Note that the current sample time is given in the host sample rate, so make sure to use that when converting it to seconds.
You can call the above function periodically to update your UI, or if you are already calling it in your audio thread, you can store the variables from there and read them in a UI timer to update your transport panel.
Note that this usage is only for displaying the transport state and time location to the user; responding to the actual playing state should be done in your audio thread.
For remote controlling the host transport, you send AudioUnitRemoteControlEvents:
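A sketch of sending such an event, assuming outputUnit is your remote IO unit and using the IAA remote-control property and event constants (kAudioOutputUnitProperty_RemoteControlToHost and the AudioUnitRemoteControlEvent values):

static void TogglePlayPauseInHost(AudioUnit outputUnit)
{
    // Other events include kAudioUnitRemoteControlEvent_ToggleRecord and _Rewind.
    UInt32 event = kAudioUnitRemoteControlEvent_TogglePlayPause;
    AudioUnitSetProperty(outputUnit,
                         kAudioOutputUnitProperty_RemoteControlToHost,
                         kAudioUnitScope_Global, 0,
                         &event, sizeof(event));
}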
Now that you know the beat time at the end of the current buffer, you need a way to get there smoothly.
It might be enough to just change the playback rate/speed. More advanced techniques might involve time stretching. If your app is trigger-based, for example a drum machine, just calculate the frame offsets for each event to see where in the buffer they should start playing.
If the jump from your current beat time is too big, or going backwards, then you should handle this by relocating/skipping to the correct beat position. The beat time can and will go backwards, when the host rewinds or when Link waits for the phase to reach the next sync quantum.
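A minimal rate-adjustment sketch along those lines (myCurrentBeat is where your own playback currently is, beatAtBufferEnd is the host's target from above, and relocateThreshold is an arbitrary app-specific limit; all names are illustrative):

Float64 beatsToCover = beatAtBufferEnd - myCurrentBeat;
Float64 nominalBeats = tempo * (Float64)inNumberFrames / (sampleRate * 60.0);
if (beatsToCover < 0.0 || fabs(beatsToCover - nominalBeats) > relocateThreshold) {
    // Going backwards or too far off: relocate/skip to the correct beat position.
} else {
    // Play this buffer at a slightly adjusted rate so we land on beatAtBufferEnd.
    Float64 rate = beatsToCover / nominalBeats;   // 1.0 means exactly on the grid
    // ... render inNumberFrames frames at `rate` times normal speed ...
}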
Jitter
On most 32-bit devices there's jitter between the mSampleTime and mHostTime of the timestamp passed to your render callback. Since Ableton Link is based on mHostTime, you'll see fluctuations in the increments of the beat time, and thus also in the calculated precise tempo for each buffer. If Link is also connected to other Link-enabled apps, the fluctuations might be larger and incorporate adjustments made by Link to keep all peers in sync.
One question I had regarding Link: Does the jittery beat time average out in the long run to stay in sync with a theoretical ideal clock source? Ableton responded that yes, this should be true but with the caveat that the hosttime clock of one of the devices in a session will be used as reference. Clocks can have slightly different frequency from each other, so it’s not true that it will match up with a theoretical “ideal” clock - it will match up with the actual physical clock of one of the devices on the network and the others will make slight adjustments to stay in sync with that.
If your app is sensitive to jitter, you might want to smooth the rate of change, but in such a way that no change is lost, only accumulated and spread over a longer time period. For example something like this:
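For example, one way to do it is to keep an accumulator of the not-yet-applied correction and release only a fraction of it each buffer (the 0.1 factor is arbitrary; all names are illustrative):

static Float64 pendingAdjustment = 0.0;   // correction backlog, never discarded

Float64 idealBeatsInBuffer = tempo * (Float64)inNumberFrames / (sampleRate * 60.0);
Float64 reportedBeatsInBuffer = beatAtBufferEnd - beatAtBufferStart;

// Add this buffer's deviation to the backlog, then apply only a part of it now.
pendingAdjustment += reportedBeatsInBuffer - idealBeatsInBuffer;
Float64 applyNow = pendingAdjustment * 0.1;
pendingAdjustment -= applyNow;

Float64 smoothedBeatsInBuffer = idealBeatsInBuffer + applyNow;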
I think it makes a lot of sense for an IAA app to support both host sync and Ableton Link. For an AU extension, host sync is the only reasonable option.
In my opinion, an app should use IAA sync if available, and otherwise fall back to Link if enabled. This is because a user expects a hosted node app to sync with the host. So the app should use ABLLinkSetActive() to deactivate Link while syncing to IAA, and disable or hide the Link button in the app. You can also provide an "IAA host sync" toggle just in case the user wants to use Link instead and is not hosting the app inside a Link-enabled host such as AUM.
Detection of IAA sync
To detect whether the host provides IAA sync or not, use the following code. You could call this directly after being connected to the host, on the main thread, after you've read and cached the HostCallbackInfo to a variable:
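A minimal version of such a check, treating a non-NULL beatAndTempoProc in the cached HostCallbackInfo as "host provides IAA sync":

static BOOL HostProvidesIAASync(void)
{
    // hasHostCallbacks and hostCallbacks are the values cached after connecting.
    return hasHostCallbacks && hostCallbacks.beatAndTempoProc != NULL;
}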
Audio Unit
An Audio Unit app extension gives users a convenient way to create or modify audio in any iOS or macOS app that uses sound, including music production apps such as GarageBand or Logic Pro X.
An Audio Unit app extension has two main elements: an audio unit proper and a user interface (UI) to control the audio unit. The audio unit itself is a custom plug-in in which you implement an audio creation or an audio processing algorithm. You build the audio unit using the Audio Unit framework, whose APIs are described in Audio Unit Framework Reference. (Designing and building audio units is not covered in the current document, which instead explains how to incorporate an audio unit into an Audio Unit app extension target.) When creating an Audio Unit app extension, you design and create its UI in the storyboard file that is included in the extension’s Xcode template.
Figure 6-1 shows the architectural relationships between an audio unit proper and its user interface, which are both contained in the Audio Unit app extension, and between the extension and the host app that is using it.
If you need to provide an Audio Unit app extension without a UI, you can do so by excluding the audio unit user interface, as suggested by the dashed outline in the figure.
Before you begin

Make sure the Audio Unit extension point is appropriate for your purpose. An Audio Unit app extension provides real-time audio processing that generates an audio stream to send to a host app, or that modifies an audio stream from a host app and sends it back to the host app. Audio Unit app extensions cannot perform recording.
To find out about other types of app extensions you can create, see Table 1-1.
Figure 6-2 shows an example UI for a custom filter Audio Unit app extension. The “draggable point” in the figure is a control that lets the user modify audio unit parameters for resonance and cutoff frequency.
For a video introduction to Audio Unit app extensions, watch the WWDC 2015 presentation Audio Unit Extensions. For a code example that shows how to create an Audio Unit app extension, see AudioUnitV3Example: A Basic AudioUnit Extension and Host Implementation. For more information on the audio unit API, see Audio Unit Framework Reference.
Audio Unit app extensions are supported in iOS 9.0 and later, and in macOS v10.11 and later. The Audio Unit app extension template is available starting in Xcode 7.
How Audio Unit App Extensions Work
In a host app that supports audio units in its audio processing pipeline, a user can choose to use an Audio Unit app extension to add the app extension’s features to the host app.
Each Audio Unit app extension contains exactly one audio unit. There are four audio unit types you can choose from, according to the role for your app extension:

For audio creation: A generator unit creates audio according to a digital signal processing (DSP) algorithm that you provide. An instrument unit creates audio, typically using a voice bank, in response to MIDI events.
For audio modification: An effect unit modifies audio according to a DSP algorithm. A music effect unit modifies audio, using DSP, in response to MIDI events.
Table 6-1 summarizes the four variants of Audio Unit app extension you can create, according to the contained audio unit type:
Category | Employs DSP | Employs DSP and responds to MIDI events
---|---|---
Audio creation | Generator unit | Instrument unit
Audio modification | Effect unit | Music effect unit
For the principal class in an Audio Unit app extension, subclass the AUViewController class. (If you need to provide an Audio Unit app extension with no UI, subclass the NSObject class instead.) The principal class for your extension must conform to the AUAudioUnitFactory protocol.
Using the Xcode Audio Unit App Extension Template

The Xcode Audio Unit app extension template includes default source files for an AUAudioUnit subclass for the audio unit itself, an Info.plist file, an AUViewController subclass, and a MainInterface.storyboard file.
Listing 6-1 shows the Info.plist keys and values for an iOS Audio Unit app extension for the Effect variant. The type key in this property list specifies the audio unit type that determines the variant, in this case with a value of aufx. For explanations of all the available keys and values, see Table 6-2.
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>AudioComponents</key>
        <array>
            <dict>
                <key>description</key>
                <string>TremoloUnit</string>
                <key>manufacturer</key>
                <string>Aaud</string>
                <key>name</key>
                <string>Aaud: TremoloUnit</string>
                <key>sandboxSafe</key>
                <true/>
                <key>subtype</key>
                <string>tmlo</string>
                <key>tags</key>
                <array>
                    <string>Effects</string>
                </array>
                <key>type</key>
                <string>aufx</string>
                <key>version</key>
                <integer>0001</integer>
            </dict>
        </array>
    </dict>
    <key>NSExtensionMainStoryboard</key>
    <string>MainInterface</string>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.AudioUnit-UI</string>
</dict>
The Audio Unit app extension template includes an Audio Unit Type option that lets you pick among four variants: Instrument, Generator, Effect, and Music Effect. Each of these variants includes a storyboard file for a user interface. If you need to create an app extension without a UI, with any of these variants, perform the following steps after you have created the app extension target:
1. Replace the AUViewController subclass with an NSObject subclass.
2. Replace the NSExtensionMainStoryboard key with the NSExtensionPrincipalClass key.
An Audio Unit app extension has several customizable values in its Info.plist file, described in Table 6-2.
Key | Value description
---|---
description | A product name for the audio unit, such as TremoloUnit.
manufacturer | A manufacturer code for the audio unit, such as Aaud.
name | The full name of the audio unit. This is derived from the manufacturer and description keys, such as Aaud: TremoloUnit.
sandboxSafe | (macOS only) A Boolean value indicating whether the audio unit can be loaded directly into a sandboxed process. For more information on sandboxing, see App Sandboxing.
subtype | A subtype code for the audio unit, such as tmlo.
tags | An array of tags that describe the audio unit, such as Effects in Listing 6-1. A set of predefined tags is already localized for your convenience.
type | The specific variant of the Audio Unit app extension, as you choose it when setting up the Xcode template. The four possible types and their values are: Effect (aufx), Music Effect (aumf), Generator (augn), and Instrument (aumu).
version | A version number for the Audio Unit app extension, such as 0001.
NSExtensionMainStoryboard | The name of the main storyboard file for the Audio Unit app extension, such as MainInterface. This key is required unless you are specifically creating an Audio Unit app extension without a user interface. In that unusual case, use the NSExtensionPrincipalClass key instead.
NSExtensionPointIdentifier | The extension point identifier for the Audio Unit app extension. This value is com.apple.AudioUnit-UI for an extension with a user interface, as in Listing 6-1.
Designing the User Interface
In iOS, a host app defines the size and position of a container view that embeds the remote view controller from the Audio Unit app extension.
For more information on designing app extension user interfaces for iOS, see iOS Human Interface Guidelines.
For macOS, consider the size and position of the selected content in the host app when specifying the size and position of the Audio Unit app extension’s main view.
For macOS, use the preferredContentSize property of the NSViewController class to specify the Audio Unit app extension main view's preferred size, based on the size of the selected content. (You can also specify minimum and maximum sizes for the extension's view, to ensure that a host app doesn't make unreasonable adjustments to the view.) To specify a preferred position for the app extension main view, set the preferredScreenOrigin property to the lower-left corner of the extension's view.
For more information on designing app extensions for macOS, see macOS Human Interface Guidelines.
Important
For best results on either platform, use Auto Layout to implement the user interface of an Audio Unit app extension.
Connecting the App Extension UI to the Audio Unit
You must connect your app extension UI (specifically, the audio unit user interface code) to the audio unit proper. Critically, you cannot assume the order in which the extension UI and its associated audio unit are loaded when a host app requests the app extension. The AUViewController subclass must attempt to connect its UI controls to its audio unit parameters either when the UI has been loaded or when the audio unit has been loaded, whichever happens first. Listing 6-2 shows code that attempts to connect the extension UI to its audio unit in both cases.
@implementation AudioUnitViewController {
    AUAudioUnit *audioUnit;
}

- (AUAudioUnit *)createAudioUnitWithComponentDescription:(AudioComponentDescription)desc error:(NSError **)error {
    audioUnit = [[MyAudioUnit alloc] initWithComponentDescription:desc error:error];
    // Check if the UI has been loaded
    if (self.isViewLoaded) {
        [self connectUIToAudioUnit];
    }
    return audioUnit;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Check if the Audio Unit has been loaded
    if (audioUnit) {
        [self connectUIToAudioUnit];
    }
}

- (void)connectUIToAudioUnit {
    // Get the parameter tree and add observers for any parameters that the UI needs to keep in sync with the Audio Unit
}

@end
Overriding AUAudioUnit Properties and Methods
You must override the following properties in your AUAudioUnit subclass:

- Override the inputBusses getter method to return the app extension's audio input connection points.
- Override the outputBusses getter method to return the app extension's audio output connection points.
- Override the internalRenderBlock getter method to return the block that implements the app extension's audio rendering loop.
Also override the allocateRenderResourcesAndReturnError: method, which the host app calls before it starts to render audio, and override the deallocateRenderResources method, which the host app calls after it has finished rendering audio. Within each override, call the AUAudioUnit superclass implementation.
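A skeleton of those overrides might look like the following sketch. The bus arrays are assumed to be created in the initializer (not shown); everything except the overridden AUAudioUnit API names is illustrative:

#import <AudioToolbox/AudioToolbox.h>

@interface MyAudioUnit : AUAudioUnit
@end

@implementation MyAudioUnit {
    AUAudioUnitBusArray *_inputBusArray;
    AUAudioUnitBusArray *_outputBusArray;
}

- (AUAudioUnitBusArray *)inputBusses  { return _inputBusArray; }
- (AUAudioUnitBusArray *)outputBusses { return _outputBusArray; }

- (BOOL)allocateRenderResourcesAndReturnError:(NSError **)outError {
    if (![super allocateRenderResourcesAndReturnError:outError]) {
        return NO;
    }
    // Allocate DSP buffers here, based on the bus formats and maximumFramesToRender.
    return YES;
}

- (void)deallocateRenderResources {
    // Release the DSP buffers allocated above.
    [super deallocateRenderResources];
}

- (AUInternalRenderBlock)internalRenderBlock {
    // Capture only what the real-time thread needs; avoid touching self in the block.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AUAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        // Pull input (for an effect), run the DSP, and write the result to outputData.
        return noErr;
    };
}

@end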