Playing audio in time using Remote IO

I got an email today with a question about how to handle playback of audio in time, synchronised with a clock. My ‘musical notepad’ app Loopy does this, and I thought I’d briefly explain how.

Any app that makes use of the Remote IO audio unit framework (which is generally necessary for the kind of responsiveness required in a realtime musical app) provides audio to the hardware via a callback, which is periodically called when the hardware is ready for more.

The trick here is to provide the right chunk of samples in this callback for the current time position.

Loopy achieves this by:

1. Keeping track of where in the timeline we are at the time the callback is called

This is easily accomplished by keeping a record of the time the clock was started, subtracting this from the current time, and possibly performing a modulus with the tempo. For example:

  • (now - startTime) % timePerBar gives the number of time units into the current bar (let's call it timeIntoBar).
  • timeIntoBar / (timePerBar/beatsPerBar) gives the number of beats into the current bar, and
  • timeIntoBar % (timePerBar/beatsPerBar) gives us the time into the current beat.

2. Determining first if we should be playing audio at this time, and if so, which samples should be playing

This involves first converting our time units from step 1 into samples. For instance, you can convert microseconds to samples by dividing your time by 1000000/yourSampleRate. (Conversely, you can convert samples back to time by multiplying instead of dividing.)

Next, in the case of Loopy's metronome, for example, we test whether samplesIntoBeat < sound.lengthInSamples. If so, that means we should be playing audio. If the sound were a loop, of course, we would always be playing.

The offset into the sound, in samples, is just samplesIntoBeat in the case of the simple metronome. In the case of a loop, you'll probably be more interested in the number of samples into your loop, so instead of determining (now - startTime) % timePerBar, you'd use (now - startTime) % timePerLoop.

So, we want to return the requested number of samples starting from this offset into the sample array representing our audio.

3. Returning smooth audio in time

Note that if you just return any old set of samples, willy-nilly, you're going to get nasty clicks and pops from the discontinuities caused by not matching the start of your next buffer to the end of the last one.

To ensure smoothness, Loopy keeps track of the offset of the last samples returned, and just returns the immediately following batch of samples, unless we're more than some threshold number of samples out of time, in which case we suffer the pop in order to stay synchronised. Actually, you can generally avoid the pop entirely by smoothly blending buffers over a short time, removing the discontinuity.
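Both ideas can be sketched in C. This is a simplification, not Loopy's actual code; the threshold value and names are hypothetical:

```c
#include <stddef.h>
#include <stdint.h>

#define DRIFT_THRESHOLD 256  /* max tolerated drift, in samples (arbitrary) */

/* Prefer continuity with the previous buffer; only snap back to the
 * timeline position if playback has drifted past the threshold. */
uint64_t next_read_offset(uint64_t timelineOffset,
                          uint64_t lastOffsetReturned,
                          uint64_t framesLastReturned) {
    uint64_t contiguous = lastOffsetReturned + framesLastReturned;
    uint64_t drift = (contiguous > timelineOffset)
                         ? contiguous - timelineOffset
                         : timelineOffset - contiguous;
    return (drift <= DRIFT_THRESHOLD) ? contiguous : timelineOffset;
}

/* When a jump is unavoidable, a short linear crossfade from the old
 * signal into the new one removes the discontinuity. */
void crossfade(const float *oldSamples, const float *newSamples,
               float *out, size_t length) {
    for (size_t i = 0; i < length; i++) {
        float t = (float)i / (float)length;   /* ramps 0 -> 1 */
        out[i] = oldSamples[i] * (1.0f - t) + newSamples[i] * t;
    }
}
```

For example, if the last callback returned 512 frames starting at offset 512, the contiguous position is 1024; a timeline position of 1100 is within the threshold, so we keep reading contiguously, whereas a timeline position of 2000 forces a resync.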

Final words

The example above was a relatively trivial one, for a metronome sound. For longer audio that may span multiple bars, you’ll probably want to perform a modulus by the length of your audio clip, possibly quantised to your time signature, and possibly using a per-loop time base, so you can start the loop at any point in your timeline and have it begin from the start. This is something Loopy doesn’t currently do — Loopy will keep your loops synchronised so when you start a loop playing, it’ll play whatever part corresponds to the current timeline, not from the start of the loop. Maybe it’ll be an option in the future?

I wrote a little about the timing of loops in my second article on Loopy’s implementation.

4 Comments

  1. Espen
    Posted January 11, 2011 at 9:24 pm | Permalink

    Hey, this is exactly what I have been looking for :)

    static OSStatus inputRenderCallback ( void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData ) {

    Though how exactly do you get the current time in the render callback? Any chance to get a slightly more technical explanation on this?

    Cheers

    • Posted January 11, 2011 at 10:35 pm | Permalink

      Hey Espen,

      There are a number of ways; I use mach_absolute_time(), which seems to function pretty well. Just remember the time when you started, and subtract that from the current time to get the time into playback.

  2. Espen
    Posted January 13, 2011 at 9:59 pm | Permalink

    Thanks Michael :)

    So I create an instance of the input callback and do my calculations in there? How exactly would I do that?

    How do I get the values from the timestamp in the callback and use them in another thread? Please excuse my noobness on these basic questions. I'd love to see a description of this that's more basic and easy to understand for learners. Thanks again.

  3. Ash
    Posted May 21, 2012 at 4:40 pm | Permalink

    Any chance you could post a simple example of a metronome using this technique. I am working on an app that would greatly make use of this – http://stackoverflow.com/questions/10688286/recording-against-a-metronome-of-set-length-using-remote-io