A Tasty Pixel » Blog
http://atastypixel.com/blog

Thirteen Months of Audiobus: Part 2
http://atastypixel.com/blog/thirteen-months-of-audiobus-part-2/
Thu, 24 Jul 2014

This is the long-awaited sequel to the tale of Audiobus' development. I'm completing this article now, on the day we say an emotional farewell to our motorhome Nettle, who has today been sold to a new family in the UK. It seems like a fitting time to tie off some loose ends as we start the next chapter of our lives in our new home in Australia.

In Part 1 of this article, I wrote about the early stages of the technology which was to become Audiobus, our inter-app audio platform, now supported by over 500 great music apps. Part 1 ended just as Sebastian had one of his genius moments, which I obnoxiously left as a cliffhanger. So, onwards:

It was winter in the south of France, and I was buried in the best kind of work: A new project, and one that brought together a bunch of different interests into a challenging, exciting heap.

But first, it was time to move on and find a more satisfying place to spend the rest of the winter.

I’d spent a little time researching places we could stay which were open all year, and I found one that looked promising: A little caravan park, wedged between the Aude river and the ruins of a 12th century cathedral, in the foothills of the Pyrenees, which sprawl across the French/Spanish border, in an old Roman spa village called Alet-les-Bains. So we packed up and took off across the countryside, taking a couple of days about it.

When we finally arrived, following the snaking, cyan-coloured Aude up a valley in the foothills, we were thrilled: Alet-les-Bains was the kind of French village that we would fantasise about before we left Australia, all delightfully wonky half-timber buildings and narrow winding lanes, and right out our door, too.



[Photo: Alet kitty]

It even snowed for us a couple times, which really was rather sporting. Snow in a medieval French village! Hard to beat that.

[Photos: snow in Alet, and Nettle in the snow]

Anyway, it was a tremendous spot to spend a few months programming.

So far, I’d put together an interface for my inter-app audio system that was built into each app. Users would bring up the interface (somehow, depending on the app), then select sources/destinations from there.

[Screenshot: the early in-app connection interface]

I’d been conversing with my friend Sebastian more and more frequently, sometimes spending hours each day talking. I was struggling with two things: Usability, and a way to turn this idea into a viable product. And then, Sebastian solved both: We make a separate app which does all of the configuration.

That solves the usability problem, as it means there’s one, very accessible and simple way to connect stuff together. The problem with the traditional technique for connecting apps via Virtual MIDI is that every app has a different interface, accessed in a unique way for each app, and with app-specific behaviour. There’s no consistency at all, and it’s never clear what one should expect. Having it all done in a centralised location makes everything absolutely clear, and gives users a nice top-down overview to boot.

But even more elegantly, having a centralised app solves the product problem. My original thought was finding a way for developers to fund the development. That’s not a great idea, firstly because there are a very limited number of developers, and secondly because it actually disincentivises adoption! By having an actual app we can sell, our users can help fund development, which is a vastly more sustainable model.

So, Sebastian’s proposal turned what would be a mainly self-funded, unsustainable project into something that we can truly throw our weight behind. That’s when we became a partnership for real, and dived into designing the workflows.

“iOS Audio Pipeline”, the in-progress name for the project, soon became “Audio Bus”, then “AudioBus”, and finally (with Sebastian’s German influence!), “Audiobus”. It felt right.

We went through a number of iterations of icon concepts, some more lewd than others:

[Image: icon concept iterations]

…until we settled on the final version:

Audiobus Icon

We spent months and months carefully going over details, talking for hours each day about how to best represent the connection screen, how to handle multi-app workflows, tearing through ideas until we found something that worked. It was out of these discussions that the concept of the three-group “Input-Filter-Output” pipeline came into being.

[Screenshot: the Input-Filter-Output connection screen]

This three-way division between types of apps let us (and our community of developers and users) communicate more easily about what apps could do and how they interacted. It also led to a very simple and easy to operate connection mechanism that required just a short sequence of taps.

Another thing that fell out of these discussions was the Audiobus connection panel, a vital ingredient for a multi-app workflow, where you might want to control an app in the background from a different app in the foreground. Crucially, this gave us the ability to trigger recording and playback in a timely fashion, without needing to switch between apps to do it. It also gave us a way to navigate between the currently-active apps of a session, pulling them together.

[Screenshot: the Audiobus connection panel]

A concept we struggled with was how to initiate an Audiobus session: due to the limitations of the platform at the start, Audiobus could only detect apps once they were running and active, after they’d logged into our registration system. That felt wrong, and we needed a way to avoid that awkward additional step.

The solution came in the form of a separate registration system that allowed developers to register their apps with our database, which the Audiobus app then downloaded and used to identify which apps were installed, without those apps needing to be launched in advance. The same mechanism that let us determine whether an app was installed (the use of a custom URL) could also be used to kick-start those apps, giving us both an easy-to-use app launcher and an on-device registration system.

That meant diving into some web development, which resulted in the Audiobus Developer Center we have today.

Developer site

This registry gave us something more valuable, though: a fully automated (from our point of view, anyway!) dynamic list of apps that work with Audiobus, categorised by their capabilities. That meant we could make this list available on our website, allowing people to discover new apps. We tied it together with Apple's affiliate marketing system, which gave us, aside from an additional revenue stream, some fantastic tracking abilities.

It’s that tracking that let us see just what implementing Audiobus has done for those compatible apps: just in the first few months of 2014, we sold almost a third of a million dollars worth of third-party apps through our compatible apps directory, which is staggering.

In the meantime, Sebastian and I met for the first time in Barcelona! He flew down to join us, while we drove over the Pyrenees down into Spain, passing through snow and down to the 35°C coastal climate.


After nearly a year of working together, despite never having met face-to-face, it seemed quite natural to hang out in person, although our productivity admittedly dropped somewhat during a week of food-, beer- and vermouth-induced merriment.



We made contact with a number of developers we're close to, and began the process of trying out Audiobus in environments other than our own apps. It was a pleasure working with the likes of Rolf Wöhrmann, Jaroslaw Jacek, Hamilton Feltman, Kalle Paulsson and the Propellerhead team, the Positive Grid team, and a number of other talented folks, and together we discovered and ironed out the remaining issues.

It was a thrill seeing these great apps passing live audio between themselves, and we were excited to see public interest growing as we got closer to having a launch-ready product.


Meanwhile, we’d taken some time to head northwards through the beautiful region of Provence, France, where we spent some time hiking over glorious Mediterranean fjords and rowing and swimming in turquoise water near Cassis, cycling through fields of purple lavender, and driving through beautiful fairytale landscapes.


We ended up back in the UK, in the beautiful, quiet village of Wootton, where we continued to refine Audiobus and get it ready for launch, amidst the rolling green fields, bluebells, cooing pigeons and bleating sheep.

We’d taken Audiobus to the point where it was ready for a test submission to Apple App Review. We were very nervous about this, as Apple don’t have a pre-approval facility. That means that there can be a staggering amount of risk involved in developing a groundbreaking new product, especially one that treads dangerously close to “system” functionality. This was the one point of failure for us, and we uploaded a build to Apple with great trepidation.

The wait was excruciating. If Apple said no at this point, we would have no recourse: the last year of work would be wasted.

Then, the news arrived: we had the green light! We breathed a massive sigh of relief and continued with our refinements, building the website, writing the developer documentation, addressing bugs.

Winter began its approach, and we moved closer to London as frosty mornings became ever more common.


Finally, launch time arrived: we had our initial group of apps, we’d done our due diligence, everything was tested and ready to go. We uploaded the final public build to Apple, along with Audiobus-compatible builds of our other apps Loopy and SoundPrism, and sat back to wait for approval.

Then, disaster struck as we were contacted by a representative of App Review who told us they had a problem with our app and were taking some time to consider whether it would be rejected. It’s hard to express our feelings at this point, after having come all this way. We were terrified.

And then: SoundPrism was approved. Then Loopy. And then, to our enormous relief, we received notice that Audiobus had been approved. It was time for a drink.

Our work was done, and with great excitement in the frosty dawn of an early winter’s day we watched as the biggest, most complex project we’d ever worked on went out into the world.

Help! iOS 7 broke my microphone input!
http://atastypixel.com/blog/help-ios-7-broke-my-microphone-input/
Wed, 23 Oct 2013

We hear from a lot of people having problems with their music apps no longer receiving audio on iOS 7. I thought it was time I posted an article describing why this is happening, and how to fix it.

iOS 7 introduced a bunch of new security and privacy features and restrictions. In particular, when an app wants to record audio, iOS 7 will block the app from doing so until the user gives permission. Usually this happens via an alert dialog in the app:

[Screenshot: the microphone permission alert]

However, if one taps “Don’t Allow”, the system won’t ask again — ever. That can spell confusion and frustration (and support emails, and 1-star App Store reviews!) for users who tapped the dialog away without reading it, and then discovered they’re unable to record audio.

Alas, there’s not much that can be done about that from our end, except for explaining how to fix the issue once this happens.

The trick is to open Privacy Settings for the device, and enable Microphone access for the app. The controls can be found in the system Settings app:

[Screenshots: the Microphone switch under Settings → Privacy]

Once you’ve turned this on, the app should begin receiving audio. Depending on certain factors, you may need to quit and restart the app.

Want to be able to downgrade your apps? Save 'em before updating.
http://atastypixel.com/blog/want-to-be-able-to-downgrade-your-apps-save-em-before-updating/
Thu, 30 May 2013

Regrettably, the App Store doesn't really make it easy to downgrade apps if an update goes awry. This can be pretty problematic if you use your apps for critical stuff like live music, and it all goes horribly wrong the day before a gig.

That problem’s pretty easy to solve though. Just back up your apps before upgrading. That way you can try out new updates without the risk. Here’s how:

Open iTunes, then select the “Apps” section from the drop-down box on the top right:


Next, find the app you want to back up, right-click on it, and select “Show in Finder” (or whatever the Windows equivalent is!).


Finally, grab the “ipa” file, and copy it somewhere safe.


If you want to be really safe, grab a piece of software like Macroplant iExplorer which lets you access the files on your device. Then hook up your iDevice via USB, and back up the Documents and Library folders from within the app. That’ll save all your files and config just in case the update applies some non-backwards-compatible changes.


Now, you may update your app with impunity.

If you change your mind and want to go back to how it was before, drag that backup you made back into iTunes, and tell iTunes to replace the current version. Sync your device, and if you backed up your Documents/Library folder, drag your backup back into the original app folder within iExplorer.

By the way: If it’s too late to make a manual backup, but you use Time Machine or another backup utility, then you’ll find the older version of the app in your backup, within your iTunes music folder. For me, it’s in ~/Music/iTunes/Mobile Applications.

Loopy update, now fully accessible via VoiceOver!
http://atastypixel.com/blog/loopy-update-now-fully-accessible-via-voiceover/
Sun, 12 May 2013

"Loopy HD. Double-tap to open."

Loopy’s now VoiceOver-accessible for blind users!

Here’s what else is new:

  • Added “Toggle Reverse” MIDI Action
  • Revised behaviour of “Toggle record, then record next track”, “Toggle record, mute, then record next track” and “Mute and unmute next muted track” controller functions
  • Auto-select tracks when recording while using MIDI/Bluetooth controllers
  • Added “Simultaneous Recording” setting
  • Re-record action now cancels in-progress recordings
  • Accuracy improvements in MIDI sync
  • Reduced loop boundary crossfade duration
  • Assorted bug fixes

Please note: As of this version, Loopy’s no longer iOS 4 compatible. If you are one of those users still on iOS 4, I recommend you back up the existing version of Loopy prior to updating, by right-clicking Loopy in your apps list in iTunes, and clicking “Show in Finder”, then copying that file somewhere safe.

Loopy HD 1.4 and Loopy 2.5 Bring Reverse and Decay; Loopy HD on 50% sale
http://atastypixel.com/blog/loopy-hd-1-4-and-loopy-2-5-bring-reverse-and-decay-loopy-hd-on-50-sale/
Mon, 01 Apr 2013

Loopy News!

I’m happy to announce Loopy HD 1.4 and Loopy 2.5 – a significant update that brings the features most frequently requested by users: Reverse and decay.

Also – Loopy HD is 50% off!

You can access the new effects via the track menu, which can now be rotated to access the new menu items.

Reverse will play the track back in the reverse direction – you’ll see the position indicator move in the opposite direction. You can still do everything you usually can with reversed tracks, including position offsetting, overdubbing and merging.

Decay works while you are overdubbing a track: While it’s enabled, it will eat away at your track audio as you overdub new audio on top of it, fading away old layers as you make new ones.

There’s also a change to the way you finish track recordings: Now, when you punch out, Loopy will count out to the next cycle. That means you can tap the track at any time, and it’ll automatically end on the next cycle. To punch out immediately, tap twice. The first tap begins the count-out, and the second tap ends straight away.

New actions that can be triggered via MIDI or Bluetooth:

  • Toggle fading
  • Mute and play next muted track
  • Toggle record, mute, then start recording next track

More changes:

  • New tracks recorded at multiples of the clock length now always start at 12 o'clock
  • Audio engine tweak to now have sample-level latency compensation accuracy
  • Fixed bug with count-in quantize slider resetting
  • Fixed timing problem with blank loops from saved/autosaved sessions
  • Fixed an issue with recognizing MIDI note controls
  • Fixed some problems with re-record feature
  • Redesigned tutorial system
  • Make panel display selector always move in one direction when tapped on iPad

Loopy HD and Loopy are available on the App Store right now.

Searching iOS header files with Xcode
http://atastypixel.com/blog/searching-ios-header-files-with-xcode/
Mon, 01 Apr 2013

I often find myself grepping through various iOS frameworks in search of error codes ("What the bloody hell does -10867 mean?"). This can be a bit annoying – especially while working with Core Audio – so I put together an Alfred workflow that does it for me.
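(Incidentally, -10867 is kAudioUnitErr_Uninitialized. Many other Core Audio statuses are four printable characters packed into a 32-bit integer, so a little helper like this one makes them readable. It's a common debugging trick, nothing to do with the workflow itself:)

```c
#include <ctype.h>
#include <stdio.h>

/* Some Core Audio statuses are four printable characters packed into a
 * 32-bit integer (e.g. 'fmt?'); others, like -10867, are plain numbers.
 * Format a status either way, so it's readable in a log. */
static void format_status(int code, char *out, int outlen) {
    unsigned char c[4] = {
        (unsigned char)((code >> 24) & 0xFF),
        (unsigned char)((code >> 16) & 0xFF),
        (unsigned char)((code >> 8)  & 0xFF),
        (unsigned char)( code        & 0xFF)
    };
    if (isprint(c[0]) && isprint(c[1]) && isprint(c[2]) && isprint(c[3]))
        snprintf(out, (size_t)outlen, "'%c%c%c%c'", c[0], c[1], c[2], c[3]);
    else
        snprintf(out, (size_t)outlen, "%d", code);
}
```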

Here it is – type “hs” (short for “header search”) then the text you want to search for, and it’ll give you matching results. Hit enter to open that file:line combination in Sublime Text, or edit the action script to work with the editor of your choice.

Search Xcode Header Files.alfredworkflow


Encrypting and decrypting text with Alfred 2
http://atastypixel.com/blog/encrypting-and-decrypting-text-with-alfred-2/
Mon, 25 Mar 2013

Here are a couple of Alfred 2 workflows that implement encryption and decryption via AES-256, useful for things like sharing passwords.

Select some text (or copy it to the clipboard) and hit the encryption hotkey; you'll be prompted for a password, and the encrypted contents will be copied to the clipboard.

Then when the recipient has the encrypted text, select or copy it, hit the decryption hotkey, and the original password will be requested. Then, the original text will be displayed and copied to the clipboard.




The Amazing Audio Engine is here, and it's open source and Audiobus-ready
http://atastypixel.com/blog/the-amazing-audio-engine-is-here-and-its-open-source-and-audiobus-ready/
Tue, 19 Mar 2013

I’m very pleased to announce that The Amazing Audio Engine has pulled into the station. It’s been a long time in the making, and there have been one or two minor distractions along the way, but I’m proud of the result:

A sophisticated and feature-packed but very developer-friendly audio engine, bringing you the very best iOS audio has to offer. We’re talking audio units, block or object-based creation and processing, filter chains, recording and monitoring anything, multichannel input support, brilliant lock-free synchronization and rich Audiobus support.

You’ll find The Engine, a bunch of documentation and the brand-new community forum at theamazingaudioengine.com

It’s also open source. And it’s ready for Audiobus.

Thirteen Months of Audiobus
http://atastypixel.com/blog/thirteen-months-of-audiobus/
Sun, 09 Dec 2012

Tomorrow, Monday December 10, my friend and partner-in-crime Sebastian Dittmann and I are launching a project over twelve months in the making: Audiobus. We're very proud of what we've managed to do, and we both firmly believe that Audiobus is going to fundamentally alter the way people create music on the iPad and iPhone.

You can find out more about Audiobus itself at audiob.us, but I wanted to take a moment to breathe, look back, and explain why the hell I’ve been so quiet over the last year.

The Tale of Audiobus

Thirteen months ago, while my partner Katherine and I were mid-roadtrip, Copenhagen to the South of France (now that’s a drive), I was working on an update to Loopy HD.


For those not familiar with our somewhat unconventional lifestyle, my home and office is a 6 metre by 3 metre 1993 Hymer motorhome named Nettle (an unconventional name for a German gal like her, but we liked it; we found her in the Cotswolds, after all). In mid-2009, we left our lives and loved ones in Melbourne, Australia, and began taking a few years off from the Real World to live our dream, travelling Europe. Along the way, I've found my calling: developing creative apps.

So, I was updating Loopy HD, with the beautiful countryside of northern France out the window, red squirrels burying nuts in the leaf litter outside, tweaking Loopy’s MIDI support (me, not the squirrels).


'MIDI is blasé'

For those not in the know, MIDI is a quite simple, very mature standard — there are devices made in the '80s (that's before I was even born) which work just fine over MIDI with devices made today, even iPads — which lets music hardware interact. In Loopy's case, MIDI is used to synchronise the tempo with other devices.
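(For the technically curious: MIDI tempo sync works via a stream of timing-clock messages, 24 ticks per quarter note. A real implementation like Loopy's needs jitter smoothing and drift handling on top, but the core arithmetic is just this; the function name is my own:)

```c
/* MIDI beat clock sends 24 timing ticks (status byte 0xF8) per quarter
 * note, so the sender's tempo can be recovered from the interval
 * between successive ticks. */
static double bpm_from_tick_interval(double seconds_per_tick) {
    return 60.0 / (24.0 * seconds_per_tick);
}
```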

MIDI on the iPad and iPhone has quite a nice little feature called “Virtual MIDI”, which lets apps send MIDI messages to each other. Not techy? Trust me, it’s super-cool.

At the time, I was ruminating on the coolness, and had a thought. You see, the MIDI standard defines support for “System Exclusive” messages (SysEx), which lets devices send anything they want over the MIDI channel. Usually, it’s used for sending manufacturer-specific information like patch configurations or firmware updates.

But it occurred to me that a wide-open standard like this, essentially a “dumb pipe” — one of my favourite terms which means it doesn’t know or care about what you pump through it — a dumb pipe with the ability to communicate app-to-app has enormous potential.
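To make the idea concrete, here's a rough sketch of one common way to smuggle arbitrary 8-bit data through SysEx, whose data bytes may only use 7 bits (the high bit marks status bytes). This is purely illustrative; it isn't Audiobus's actual wire format (and, as it turned out, SysEx wasn't the final transport anyway):

```c
#include <stddef.h>
#include <stdint.h>

/* Encode each group of up to 7 payload bytes as: one byte holding the
 * stripped high bits, then the 7-bit remainders, framed by the SysEx
 * start/end bytes. `out` must have room: len + len/7 + 3 bytes. */
size_t sysex_encode(const uint8_t *in, size_t len, uint8_t *out) {
    size_t o = 0;
    out[o++] = 0xF0;                          /* SysEx start */
    for (size_t i = 0; i < len; i += 7) {
        size_t chunk = (len - i < 7) ? len - i : 7;
        uint8_t high = 0;
        for (size_t j = 0; j < chunk; j++)
            if (in[i + j] & 0x80) high |= (uint8_t)(1u << j);
        out[o++] = high;
        for (size_t j = 0; j < chunk; j++)
            out[o++] = in[i + j] & 0x7F;
    }
    out[o++] = 0xF7;                          /* SysEx end */
    return o;
}

/* Reverse the packing, skipping the 0xF0 / 0xF7 framing bytes. */
size_t sysex_decode(const uint8_t *in, size_t len, uint8_t *out) {
    size_t o = 0;
    for (size_t i = 1; i + 1 < len; ) {
        uint8_t high = in[i++];
        for (size_t j = 0; j < 7 && i + 1 <= len - 1; j++, i++)
            out[o++] = (uint8_t)(in[i] | (((high >> j) & 1u) ? 0x80 : 0));
    }
    return o;
}
```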

Up until then, if you wanted to use more than one app as part of a music creation workflow, you had to use Audio Copy — just like copying and pasting text, but with chunks of audio. Audio Copy was huge, and meant that apps could be used together — recording here, manipulating there — but it worked offline.

We’re talking editing, not performing.

Direct app-to-app communication…Well, that’s another ball game entirely. If you can do that, you can send data — that’s audio, or anything you like — live. It’s like suddenly discovering cables. Or discovering the telephone, after passing letters around.

I poked around on the Interwebs to see if anything like this had been done before, and I noticed a brief discussion on the Open Music App Collaboration (OMAC) mailing list — a group of talented iOS music developers who got together with the aim of working together to improve iOS music. The conversation hadn’t really gone anywhere, but I noticed one bright spark had said he’d been playing with the idea.

I emailed him.

[Screenshot: the email]

Name look familiar? I didn’t know him then, but that’s our friend Rolf Wöhrmann of Tempo Rubato — he makes NLog, one of the first apps to support Audiobus.

It turned out no one had really made any progress on this, so I, always easily distracted, thrust aside Loopy HD with the intention of exploring this exciting new idea for a little while.

A month or two, tops.

Two weeks later — and only about 130km further into our road trip, I might add — I had a working prototype, “iOS Audio Pipeline”, finished up during the annual crane migration over the fields of southern Champagne. Every evening, the cranes would fly overhead in long meandering V’s, calling out to each other in lovely mournful-sounding cadences. I like cranes.


But hey — the thing worked, guys. I was successfully sending realtime audio, app-to-app.

I excitedly updated the OMAC guys with my findings — the original email message is still there, warts and all.

The responses were enthusiastic, but the question arose of what Apple might think about this sort of thing. I was pointed in the direction of a guy who'd had a fair bit of experience being tossed around by App Review. I didn't know him, but his name was Sebastian, and he chimed in on OMAC soon after.

Little did I know that I’d just met my future friend and business partner who would be instrumental in making the whole project come together.

Sebastian used his contacts at Apple to try to take their temperature, and received a pretty vague and noncommittal response, but, in Sebastian's words: "It looks like this could actually work and be approved. No guarantee though but definitely not a clear NO."

In the meantime, Katherine and I had gradually migrated further south, through some truly beautiful French countryside, all tumbledown villages and autumn colours, cyan-coloured rivers flowing over light grey riverbeds.


We eventually found ourselves in the Provençal seaside town of Istres. It wasn't nearly as nice a spot to spend the winter as it sounds, or as we'd hoped — actually, it rather unpleasantly reminded us of our three months in Tunisia! — but we were paid up and committed for the month, and I had work to do.

iOS Audio Pipeline steadily evolved: the prototype was thrown out, and a more efficient version began to take shape, based on a lower-level technology than Virtual MIDI. That technology was Mach ports, the mechanism Virtual MIDI itself is built upon, which I'd subsequently discovered I could use directly.

As happens so frequently in those golden early days of a project, the code poured out of me. It always reminds me of that anecdote about Michelangelo, revealing the sculpture that’s already there within the marble block — all that’s to be done is to remove the superfluous material. Not that I’m so grandiose as to compare myself to Michelangelo, but I can kinda see what he meant, in the way the code just reveals itself, everything falling into place. It’s almost hypnotic.

Here’s an early mockup of the interface — it actually existed like this for a short time, but I don’t have any live screenshots. Originally, I’d envisioned it as a control panel that was built into each app, accessible via whatever means the developer saw fit:

Audiobus early mockup

It was somewhat modelled on the de facto standard method for making Virtual MIDI connections available, and was, to be honest, quite clunky.

Along the way, I had more and more conversations with Sebastian, and we fast became friends.

Then he dropped the bomb, with the idea that turned iOS Audio Pipeline from an interesting piece of technology into an actual concrete, viable product, and solidified our partnership.

In Part 2 of this article, I’ll write about how Sebastian and I took iOS Audio Pipeline, a slightly nerdy-but-cool audio transport protocol, and turned it into Audiobus, the frankly awesome product it is today. There’ll be half a year spent in a beautiful village in the foothills of the Pyrenees, and an encounter in Barcelona. There’ll be battles, victories, the odd crisis, and many, many hours of programming. But don’t worry. It has a happy ending.

Loopy and Loopy HD, now with Bluetooth pedal/keyboard support, iPhone 5
http://atastypixel.com/blog/loopy-and-loopy-hd-now-with-bluetooth-pedalkeyboard-support-iphone-5/
Mon, 01 Oct 2012

Loopy HD 1.3 and Loopy 2.4 just hit the App Store.


The main new stuff is support for Bluetooth pedals like the AirTurn and Cicada (as well as any Bluetooth keyboard). This uses the same system as the MIDI control, so you can do all the same stuff. It’s pretty neat.

Also, iPhone 5 support, which introduces — you guessed it — another row of loops. How could I resist?

There's also some dramatically improved clock code in there, which offers improved behaviour when using the x/÷/+/- clock length controls, giving you new options for putting interesting rhythms against each other, and better support for non-4/4 time signatures.

There’s a bunch of other relatively minor improvements in there. Here’s a summary of all that’s new:

  • Added support for Bluetooth pedals like the AirTurn and Cicada, and Bluetooth keyboards
  • Added iPhone 5 support
  • Enhanced support for alternative time signatures
  • Improved clock length manipulation, with more flexible behaviour for “+” and “-” buttons
  • Rearranged Settings screen for easier understanding
  • Added “Cancel pending actions” MIDI action
  • Keep MIDI device connections over multiple sessions
  • Ask for a session name when saving for the first time
  • Assorted bug fixes and optimisations
Updates, updates for everybody: Loopy HD 1.2 and Loopy 2.3
http://atastypixel.com/blog/updates-updates-for-everybody-loopy-hd-1-2-and-loopy-2-3/
Sat, 28 Jul 2012

The update has landed!

Loopy’s now slicker and meatier than ever, with a brand-spanking new audio engine — with some nifty new audio processing smarts and just ~6-7ms latency, which sounds absolutely fantastic — greatly improved punch in/out controls, multi-channel audio interface support, and a colossal amount of other improvements.

You can read more about the update here, grab Loopy HD (for the iPad and iPhone) or Loopy (for the iPhone) right now on the App Store, and talk about it on the forum.

You can also check out the new introductory tutorial — there’ll be more soon.

Many thanks to the testing team for their hard work making sure the new update is house-trained.

Happy looping!

Three Years On The Road: The Story So Far http://atastypixel.com/blog/three-years-on-the-road-the-story-so-far/ http://atastypixel.com/blog/three-years-on-the-road-the-story-so-far/#comments Tue, 03 Jul 2012 09:35:24 +0000 http://atastypixel.com/blog/?p=2554 My partner Katherine and I just hit the three year point of our adventure abroad. Here’s the story so far, over on our travel blog:

Three Years On The Road

http://atastypixel.com/blog/three-years-on-the-road-the-story-so-far/feed/ 4
Brand New Loopy, Coming Real Soon http://atastypixel.com/blog/brand-new-loopy-coming-real-soon/ http://atastypixel.com/blog/brand-new-loopy-coming-real-soon/#comments Mon, 18 Jun 2012 09:42:52 +0000 http://atastypixel.com/blog/?p=2545 I’m very happy to say shiny new versions of Loopy and Loopy HD are on their way!

I’ve had my nose to the grindstone over the past months; I’ve taken Loopy’s insides out, given them a good, solid spit-and-polish, and put them back in. The result is a huge number of performance enhancements, much better quality audio processing, and a more robust engine (which, incidentally, is soon to start leading a life of its own). What it means for you: More stability, better audio quality, improved workflows, hugs, puppies.

It’s true that this is most significantly an internal-evolution release, but there’s also some new stuff in here.

Multi-Channel Audio Interface Support, Baby

The most exciting addition, for the more serious musicians and tinkerers among us, is new support for multi-channel audio inputs. When you have a stereo source plugged in, you’ll have a choice of whether to record stereo, mono left channel or mono right channel. If you have a device with more than 2 channels, then you’ll be able to select any stereo pair, or one particular channel.

Loopy HD multi-channel input support

Count-In Quantize Length

By popular demand, I’ve also added a “Count-In Quantize Length” setting, which lets you set how long you want Loopy to count in when recording, independently of the clock length. The default option syncs with the clock length, or you can set a specific duration from a quarter of a bar up to 16 bars.

Huge Punch In/Out Improvement

I’ve also made a fairly significant change to the punch in and punch out mechanism.

In prior versions of Loopy, the actual punch in/out command is fired when you release — when the touch ends. That means that if you’re a slow toucher (it’s okay, no one’s judging you), you could miss the punch in/out point by tenths of a second. I went back to the drawing board, and came up with a new system: Loopy begins recording, in the background, as soon as you touch a track. If that touch ends up being a punch in/out gesture (instead of, say, opening the menu), then recording continues, beautifully in time. It’s much more intuitive, and I think will end up making it far easier to get perfect loop timing.

Toggle Track Sync Via MIDI

I’ve added a MIDI-triggerable action to toggle track synchronisation with a foot switch, which makes it easy to record irregular-length tracks, hands-free.

SoundCloud Update

I’ve integrated the snazzy new(ish) SoundCloud interface, which looks fantastic and also takes care of all the social network sharing stuff.

SoundCloud interface in Loopy HD

And finally, I’ve added a Japanese localization (こんにちは!), and updated the Italian one.

The update’s in beta testing right now, and I’m expecting to submit it to Apple next week.

http://atastypixel.com/blog/brand-new-loopy-coming-real-soon/feed/ 11
Compiling Image Resources into a Static Library http://atastypixel.com/blog/compiling-image-resources-into-a-static-library/ http://atastypixel.com/blog/compiling-image-resources-into-a-static-library/#comments Tue, 15 May 2012 20:12:23 +0000 http://atastypixel.com/blog/?p=2523 I’ve recently been working on a static library for distribution to other developers — Audiobus — and I need to include a couple of graphical resources with the distribution. The usual solution to this is to include the resources separately in a bundle, and require the user to drop them in to their project along with the static library.

I thought I’d see if I could make the process just a little neater, and successfully devised a way to compile the images straight into the library, so the distribution remains nice and clean — just the library itself and a few header files.

Now, I can pop image resources into a folder, and after compiling, access them within the static library with:

UIImage *image = TPGetCompiledImage(@"Button.png");

It automatically handles “@2x” Retina images (although it doesn’t currently do “~ipad” versions).

Here’s how it’s done.

The magic is in a shell script which uses the xxd hex dump tool to create C code that represents the image data as a byte array, then creates around it a set of utilities to turn those arrays into UIImages on demand.
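To see what that generated code looks like, here’s a quick sketch using a throwaway stand-in file (not one of the project’s actual images): xxd -i emits a C byte array whose symbol name is the filename with every non-alphanumeric character replaced by an underscore, plus a length variable.

```shell
# Make a tiny stand-in "image" and dump it as C source
printf 'PNG' > Button.png
xxd -i Button.png
# unsigned char Button_png[] = {
#   0x50, 0x4e, 0x47
# };
# unsigned int Button_png_len = 3;
```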

Along with it are a couple of template files, a header and an implementation file, that describe the format of the derived code.

Finally, a little tweaking of the project in Xcode (with a brief foray into a text editor to work around some Xcode shortcomings) puts it all together.

Update: Fellow dev Cocoanetics pointed out that they’d solved a similar problem, and have a great write-up on how they create compiled resources using custom build rules on their blog.

The Image Resources

…Just a bunch of png files within a folder inside your project directory. The script assumes there are normal and retina (@2x) versions of each.

The Template

These are the template source files from which the end result will be derived. They contain a few tags that the accompanying shell script will process. I created them in Xcode, placing them within the same folder as the source png images, but removed them from the target’s compile phase, as we’ll be adding the derived source instead.

First the header, TPCompiledResources.h:

//  TPCompiledResources.h
//  Created by Michael Tyson on 13/05/2012.
//  Copyright (c) 2012 A Tasty Pixel. All rights reserved.

#import <UIKit/UIKit.h>

UIImage *TPGetCompiledImage(NSString* name);

And the implementation file, TPCompiledResources.m:

//  TPCompiledResources.m
//  Created by Michael Tyson on 13/05/2012.
//  Copyright (c) 2012 A Tasty Pixel. All rights reserved.

#import "TPCompiledResources.h"

// {%IMAGEDATA START%}
// {%IMAGEDATA END%}

UIImage *TPGetCompiledImage(NSString* name) {
    // {%LOADER START%}
    if ( [name isEqualToString:@"ORIGINAL_FILENAME"] ) {
        static UIImage *_SANITISED_FILENAME_image = nil;
        if ( _SANITISED_FILENAME_image ) return _SANITISED_FILENAME_image;
        if ( [[UIScreen mainScreen] scale] == 2.0 ) {
            _SANITISED_FILENAME_image = [[UIImage alloc] initWithCGImage:
                                                     [[UIImage imageWithData:[NSData dataWithBytesNoCopy:SANITISED_2X_FILENAME
                                                                      length:SANITISED_2X_FILENAME_len freeWhenDone:NO]] CGImage]
                                                                       scale:2.0
                                                                 orientation:UIImageOrientationUp];
        } else {
            _SANITISED_FILENAME_image = [[UIImage alloc] initWithData:[NSData dataWithBytesNoCopy:SANITISED_FILENAME
                                                                                           length:SANITISED_FILENAME_len freeWhenDone:NO]];
        }
        return _SANITISED_FILENAME_image;
    }
    // {%LOADER END%}
    return nil;
}

The Shell Script

Here’s the script that does all the work. The script looks for all “png” images in the given folder, then creates C code representing each image along with wrapper code to give access to the image byte arrays, with help from the template.

This script goes into a “Run Script” phase, placed at the beginning of the library’s build process.

# Where the images are (get this from the first "Input Files" entry)
cd "${SCRIPT_INPUT_FILE_0%/*}"

# The name of the source template, minus extension
TEMPLATE="TPCompiledResources"
tmp="$TMPDIR/$TEMPLATE"
rm -f "$tmp.1" "$tmp.2"

# Create C arrays, representing each image
for image in *.png; do
    xxd -i "$image" >> "$tmp.1"
done

# Read the code template's loader block, between the {%LOADER%} tags
loader=`sed -n '/{%LOADER START%}/,/{%LOADER END%}/p' "$TEMPLATE.m" | grep -v '{%LOADER'`

# Create loader code for each image
for image in *.png; do
    if echo "$image" | grep -q "@2x"; then continue; fi
    ORIGINAL_FILENAME="$image"
    SANITISED_FILENAME=`echo "$ORIGINAL_FILENAME" | sed 's/[^a-zA-Z0-9]/_/g'`
    SANITISED_2X_FILENAME=`echo "$SANITISED_FILENAME" | sed 's/_png/_2x_png/'`
    echo "$loader" | sed "s/ORIGINAL_FILENAME/$ORIGINAL_FILENAME/g; s/SANITISED_2X_FILENAME/$SANITISED_2X_FILENAME/g; s/SANITISED_FILENAME/$SANITISED_FILENAME/g" >> "$tmp.2"
done

# Create the source file from the template and our generated code
sed "/{%IMAGEDATA START%}/ r $tmp.1
/{%LOADER END%}/ r $tmp.2" "$TEMPLATE.m" | sed '/{%LOADER START%}/,/{%LOADER END%}/d' > "$DERIVED_FILE_DIR/$TEMPLATE.m"

# Copy the template header file in
cp "$TEMPLATE.h" "$DERIVED_FILE_DIR/$TEMPLATE.h"

rm -f "$tmp.1" "$tmp.2"
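The sanitisation step in the loop above exists to reproduce the symbol names xxd generates: every non-alphanumeric character in the filename becomes an underscore, and the retina variant’s symbol is derived by suffixing the extension. A quick check of the two sed expressions:

```shell
ORIGINAL_FILENAME="Button.png"
SANITISED_FILENAME=`echo "$ORIGINAL_FILENAME" | sed 's/[^a-zA-Z0-9]/_/g'`
SANITISED_2X_FILENAME=`echo "$SANITISED_FILENAME" | sed 's/_png/_2x_png/'`
echo "$SANITISED_FILENAME"     # Button_png    (xxd's symbol for Button.png)
echo "$SANITISED_2X_FILENAME"  # Button_2x_png (xxd's symbol for Button@2x.png)
```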

The “Run Script” phase that hosts this script also needs a couple of additions, to tell Xcode what the script’s inputs and outputs are: in the “Input Files” section, add the path to the image resource folder with a “*.png” wildcard at the end, plus the paths to the two template files, TPCompiledResources.{h,m}. Finally, the two output files go in the “Output Files” section:
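For illustration, assuming the images and templates live in a Resources/Images folder within the project (the actual paths depend on your own layout), those settings would look something like:

```
Input Files:
    $(PROJECT_DIR)/Resources/Images/*.png
    $(PROJECT_DIR)/Resources/Images/TPCompiledResources.h
    $(PROJECT_DIR)/Resources/Images/TPCompiledResources.m

Output Files:
    $(DERIVED_FILE_DIR)/TPCompiledResources.h
    $(DERIVED_FILE_DIR)/TPCompiledResources.m
```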

The Run Script phase’s “Input Files” and “Output Files” settings in Xcode

Project Setup

Now it was just a matter of setting up the project to include the derived source files in the build. This was a bit messy and took a little doing, but with guidance from this article by Ben Zado on file references relative to DERIVED_FILE_DIR, it wasn’t too painful:

  1. Build, in order to generate the derived source files.
  2. Navigate to the derived sources folder within the build products, and drag the TPCompiledResources.{m,h} into the project, placed within a new group (I called the group “Derived Sources”).
  3. The path type for those files (accessible from the properties viewer — Cmd-I) should be “Relative to Enclosing Group”, and consequently the “Path” field should just show the filename, with no path component. This was a bit touch-and-go for me; I had a hard time making it happen, so I left it as-is for now and fixed it manually later, in step 8.
  4. Under Xcode Preferences » Locations » Source Trees, add an entry with setting name “DERIVED_FILE_DIR”, display name “Derived Files” and path “$(DERIVED_FILE_DIR)”.
  5. Set the path type for the group containing the two derived sources to “Relative to Derived Files”.
  6. Quit Xcode, and open the project.pbxproj file from within your project’s bundle.
  7. Find the “Derived Sources” group (or whatever it was named in step 2), and delete the “path” property from the list of attributes.
  8. I had to also find the TPCompiledResources.{m,h} file sections and delete the “path” attribute for those, too.
  9. Reopen Xcode, and build — it should be good to go (don’t worry that the derived sources are shown in red in the project group — Xcode’ll find them).

Now that’s done, images can be accessed by TPGetCompiledImage(@"ImageName.png"). Yay!

http://atastypixel.com/blog/compiling-image-resources-into-a-static-library/feed/ 4
I ♥ Alfred: Code execution extensions http://atastypixel.com/blog/i-%e2%99%a5-alfred-code-execution-extensions/ http://atastypixel.com/blog/i-%e2%99%a5-alfred-code-execution-extensions/#comments Wed, 02 May 2012 13:01:12 +0000 http://atastypixel.com/blog/?p=2517 I’m a really big fan of Alfred, and lately I’ve found it very useful for running tiny little snippets of code. Whether it’s to quickly URL-decode a string or to remind myself how C integer-to-float conversion behaves, I find myself using these little extensions I put together quite frequently.

Here’re two workflows I use to run PHP code (one which just executes it and shows the result in Growl, and one which copies the result to the clipboard), and a workflow that runs a snippet of C code. Of course, it wouldn’t take much to make workflows for many other languages, too.

Alfred 2 workflows

Now with live results! Hit enter to copy result to clipboard.

Run C Code.alfredworkflow

Run PHP Code.alfredworkflow

Older, Alfred 1 extensions:

Execute PHP Code.alfredextension

Execute PHP Code, Copy Result.alfredextension

Run C code.alfredextension


http://atastypixel.com/blog/i-%e2%99%a5-alfred-code-execution-extensions/feed/ 0