Tomorrow, Monday December 10, my friend and partner-in-crime Sebastian Dittmann and I are launching a project over twelve months in the making: Audiobus. We’re very proud of what we’ve managed to do, and we both firmly believe that Audiobus is going to fundamentally alter the way people create music on the iPad and iPhone.
You can find out more about Audiobus itself at audiob.us, but I wanted to take a moment to breathe, look back, and explain why the hell I’ve been so quiet over the last year.
The Tale of Audiobus
Thirteen months ago, while my partner Katherine and I were mid-roadtrip, Copenhagen to the South of France (now that’s a drive), I was working on an update to Loopy HD.
For those not familiar with our somewhat unconventional lifestyle, my home and office is a 6 metre by 3 metre 1993 Hymer motorhome named Nettle (an unconventional name for a German gal like she is, but we liked it; we found her in the Cotswolds, after all). In mid 2009, we left our lives and loved ones in Melbourne, Australia, and began taking a few years off from the Real World to live our dream, travelling Europe. Along the way, I’ve found my calling: developing creative apps.
So, I was updating Loopy HD, with the beautiful countryside of northern France out the window, red squirrels burying nuts in the leaf litter outside, tweaking Loopy’s MIDI support (me, not the squirrels).
For those not in the know, MIDI is a quite simple, very mature standard that lets music hardware interact — there are devices made in the ’80s (that’s before I was even born) which work just fine over MIDI with devices made today, even iPads. In Loopy’s case, MIDI is used to synchronise the tempo with other devices.
MIDI on the iPad and iPhone has quite a nice little feature called “Virtual MIDI”, which lets apps send MIDI messages to each other. Not techy? Trust me, it’s super-cool.
At the time, I was ruminating on the coolness, and had a thought. You see, the MIDI standard defines support for “System Exclusive” messages (SysEx), which lets devices send anything they want over the MIDI channel. Usually, it’s used for sending manufacturer-specific information like patch configurations or firmware updates.
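To make that concrete: a SysEx message is just a framed byte stream — a 0xF0 start byte, a manufacturer ID, any number of data bytes (each with the high bit clear, since high-bit-set bytes are reserved for MIDI status messages), and a 0xF7 end byte. Here’s a minimal C sketch of that framing; the function name is mine, and 0x7D is the manufacturer ID the MIDI spec reserves for non-commercial/educational use:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Build a System Exclusive message: 0xF0, a manufacturer ID, the
 * payload (data bytes must stay below 0x80), then 0xF7.
 * Returns the total message length, or 0 if the payload is invalid. */
size_t sysex_build(uint8_t *out, uint8_t manufacturer_id,
                   const uint8_t *payload, size_t len) {
    for (size_t i = 0; i < len; i++)
        if (payload[i] & 0x80)
            return 0;            /* high bit is reserved for status bytes */
    out[0] = 0xF0;               /* SysEx start */
    out[1] = manufacturer_id;    /* 0x7D = non-commercial/educational */
    memcpy(&out[2], payload, len);
    out[2 + len] = 0xF7;         /* SysEx end */
    return len + 3;
}
```

Everything between those two framing bytes is yours to define — which is exactly what makes it interesting.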
But it occurred to me that a wide-open standard like this, essentially a “dumb pipe” — one of my favourite terms which means it doesn’t know or care about what you pump through it — a dumb pipe with the ability to communicate app-to-app has enormous potential.
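To illustrate the dumb-pipe idea: since SysEx data bytes must keep their high bit clear, tunnelling arbitrary 8-bit data through the pipe means re-packing it into 7-bit-clean bytes first. A common scheme prefixes each run of up to seven bytes with a header byte carrying their stripped high bits. This is my own illustrative sketch of that encoding, not the actual Audiobus protocol:

```c
#include <stddef.h>
#include <stdint.h>

/* Pack arbitrary 8-bit data into 7-bit-clean bytes suitable for a
 * SysEx payload: each group of up to 7 input bytes is preceded by one
 * header byte holding their stripped high bits. Returns encoded length. */
size_t pack7(const uint8_t *in, size_t len, uint8_t *out) {
    size_t o = 0;
    for (size_t i = 0; i < len; i += 7) {
        size_t group = (len - i < 7) ? len - i : 7;
        size_t hdr = o++;                 /* reserve the header slot */
        out[hdr] = 0;
        for (size_t j = 0; j < group; j++) {
            out[hdr] |= (uint8_t)(((in[i + j] >> 7) & 1) << j); /* MSBs */
            out[o++] = in[i + j] & 0x7F;  /* 7-bit remainder */
        }
    }
    return o;
}

/* Reverse of pack7. Returns the decoded length. */
size_t unpack7(const uint8_t *in, size_t len, uint8_t *out) {
    size_t o = 0;
    for (size_t i = 0; i < len; ) {
        uint8_t hdr = in[i++];
        for (size_t j = 0; j < 7 && i < len; j++)
            out[o++] = (uint8_t)(in[i++] | (((hdr >> j) & 1) << 7));
    }
    return o;
}
```

The pipe itself never needs to understand what’s inside; the two apps at either end agree on the meaning.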
Up until then, if you wanted to use more than one app as part of a music creation workflow, you had to use Audio Copy — just like copying and pasting text, but with chunks of audio. Audio Copy was huge, and meant that apps could be used together — recording here, manipulating there — but it all happens offline.
We’re talking editing, not performing.
Direct app-to-app communication… Well, that’s another ball game entirely. If you can do that, you can send data — that’s audio, or anything you like — live. It’s like suddenly discovering cables. Or discovering the telephone, after passing letters around.
I poked around on the Interwebs to see if anything like this had been done before, and I noticed a brief discussion on the Open Music App Collaboration (OMAC) mailing list — a group of talented iOS music developers who got together with the aim of working together to improve iOS music. The conversation hadn’t really gone anywhere, but I noticed one bright spark had said he’d been playing with the idea.
I emailed him.
Name look familiar? I didn’t know him then, but that’s our friend Rolf Wöhrmann of Tempo Rubato — he makes NLog, one of the first apps to support Audiobus.
It turns out, no one had really made any progress on this so I, always easily-distracted, thrust aside Loopy HD with the intention of exploring this exciting new idea for a little while.
A month or two, tops.
Two weeks later — and only about 130km further into our road trip, I might add — I had a working prototype, “iOS Audio Pipeline”, finished up during the annual crane migration over the fields of southern Champagne. Every evening, the cranes would fly overhead in long meandering V’s, calling out to each other in lovely mournful-sounding cadences. I like cranes.
But hey — the thing worked, guys. I was successfully sending realtime audio, app-to-app.
I excitedly updated the OMAC guys with my findings — the original email message is still there, warts and all.
The responses were enthusiastic, but the question of what Apple might feel about this sort of thing arose. I was pointed in the direction of a guy who has had a fair bit of experience being tossed around by App Review. I didn’t know him, but his name was Sebastian, and he chimed in on OMAC soon after.
Little did I know that I’d just met my future friend and business partner who would be instrumental in making the whole project come together.
Sebastian used his contacts at Apple to try to take their temperature, and received a pretty vague and noncommittal response, but, in Sebastian’s words — “It looks like this could actually work and be approved. No guarantee though but definitely not a clear NO.”
In the meantime, Katherine and I had gradually migrated further south, through some truly beautiful French countryside, all tumbledown villages and autumn colours, cyan-coloured rivers flowing over light grey riverbeds.
We eventually found ourselves in the Provençal seaside town of Istres. It wasn’t nearly as nice a spot to spend the winter as it sounds, and as we’d hoped — actually, it rather unpleasantly reminded us of our three months in Tunisia! — but we were paid up and committed for the month, and I had work to do.
iOS Audio Pipeline steadily evolved: the prototype was thrown out, and a more efficient version began to take shape, built on a lower-level technology than Virtual MIDI. Mach ports, the inter-process messaging mechanism Virtual MIDI itself is built upon, turned out to be something I could use directly.
As happens so frequently in those golden early days of a project, the code poured out of me. It always reminds me of that anecdote about Michelangelo, revealing the sculpture that’s already there within the marble block — all that’s to be done is to remove the superfluous material. Not that I’m so grandiose as to compare myself to Michelangelo, but I can kinda see what he meant, in the way the code just reveals itself, everything falling into place. It’s almost hypnotic.
Here’s an early mockup of the interface — it actually existed like this for a short time, but I don’t have any live screenshots. Originally, I’d envisioned it as a control panel that was built into each app, accessible via whatever means the developer saw fit:
It was somewhat modelled on the de facto standard method for making Virtual MIDI connections available, and was, to be honest, quite clunky.
Along the way, I had more and more conversations with Sebastian, and we fast became friends.
Then he dropped the bomb, with the idea that turned iOS Audio Pipeline from an interesting piece of technology into an actual concrete, viable product, and solidified our partnership.
In Part 2 of this article, I’ll write about how Sebastian and I took iOS Audio Pipeline, a slightly nerdy-but-cool audio transport protocol, and turned it into Audiobus, the frankly awesome product it is today. There’ll be half a year spent in a beautiful village in the foothills of the Pyrenees, and an encounter in Barcelona. There’ll be battles, victories, the odd crisis, and many, many hours of programming. But don’t worry. It has a happy ending.