Just a short one this week (I’ve been Audiobussing/Xmassing), but I want to ask you about what annoys you right now in music tech.
I’m trying something new this week: I’ve been designing Loopy’s code modules, and I’m talking about how I plan to implement the actions/sequence stuff we’ve been discussing in past weeks.
It’s gonna get a little technical – quite the learning curve for me, figuring out how to talk about software development!
Objective-C, or Swift?
I’ve been pondering which language to use to write Masterpiece, and have been vacillating over the last few days.
Swift is the future!
Swift – Apple’s fancy new programming language destined to replace Objective-C – is designed to make some of the more heinous errors one can make with Objective-C totally impossible. That’s very, very nice, and with the clear message from Apple that this is the future of iOS/Mac development, it seems to make sense to start any new projects in Swift, rather than the decades-old Objective-C.
So: Swift is the natural choice, right?
Hold up. Swift is still very young.
But – and this is a massive “but” – Swift is very new, and although Apple reports they’ve been using it to build their own products, I contend it’s as yet largely untested. Reports of bugs in the developer tools – nonsensical error messages, crashes, and assorted other compiler and editor bugs – abound. Just a quick search turns up all kinds of problems:
- http://footle.org/2014/11/04/misleading-swift-compiler-errors/
- https://twitter.com/SteveStreza/status/542830670301515776
- https://twitter.com/jnpdx/status/542493939152859138
- https://twitter.com/tewha/status/542150178476548097
- https://twitter.com/hyperjeff/status/542096016762494976
- https://twitter.com/DamienPetrilli/status/541964343420944384
- http://stackoverflow.com/questions/27371000/xcode6-swift-type-inference-bug
- https://discussions.apple.com/thread/6526114?start=0&tstart=0
- http://finalize.com/2014/10/08/xcode-simulator-bug-with-swift-to-objective-c-call-passing-cmtime-structure/
Good god! No doubt these problems will sort themselves out over the next few years, but who wants to be the guinea pig responsible for finding and reporting them?
Not me. I wanna make stuff.
There’s one other big factor that I think has settled the matter in Objective-C’s favour, once and for all.
Swift and realtime audio is going to be hard.
I’m writing an audio app, and a pretty damn complex one. See, when you’re writing code for realtime audio, it’s absolutely critical that your code is fast, never allocates or releases memory, and never waits on a lock. Get that wrong and you open yourself up to a nasty little thing called priority inversion which, with audio, means glitches, because your code can’t deliver audio to the hardware fast enough.
At the heart of the Objective-C runtime is all kinds of stuff that can’t take place on an audio thread without risking glitches: locks, memory allocations and the like. That means you really shouldn’t write audio code in Objective-C, but should drop down to plain C where it’s safe.
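To make this concrete, here’s a minimal sketch (not Loopy’s actual code – the names are illustrative) of the standard workaround: a lock-free single-producer/single-consumer ring buffer in plain C11. The main (Objective-C) thread pushes data in; the audio thread pops it out with no locks and no allocation, because all memory is set aside up front.

```c
#include <stdatomic.h>
#include <stdbool.h>
#include <stddef.h>

#define RB_CAPACITY 1024  /* power of two, fixed at init time */

typedef struct {
    float buffer[RB_CAPACITY];
    _Atomic size_t head;  /* advanced by the producer only */
    _Atomic size_t tail;  /* advanced by the consumer only */
} RingBuffer;

/* Main thread: enqueue one sample. Returns false if full – never blocks. */
static bool rb_push(RingBuffer *rb, float value) {
    size_t head = atomic_load_explicit(&rb->head, memory_order_relaxed);
    size_t tail = atomic_load_explicit(&rb->tail, memory_order_acquire);
    if (head - tail == RB_CAPACITY) return false;  /* full */
    rb->buffer[head % RB_CAPACITY] = value;
    atomic_store_explicit(&rb->head, head + 1, memory_order_release);
    return true;
}

/* Audio thread: dequeue one sample. Returns false if empty – never blocks. */
static bool rb_pop(RingBuffer *rb, float *out) {
    size_t tail = atomic_load_explicit(&rb->tail, memory_order_relaxed);
    size_t head = atomic_load_explicit(&rb->head, memory_order_acquire);
    if (head == tail) return false;  /* empty */
    *out = rb->buffer[tail % RB_CAPACITY];
    atomic_store_explicit(&rb->tail, tail + 1, memory_order_release);
    return true;
}
```

The key property: neither function ever touches the Objective-C runtime, malloc, or a mutex, so the audio thread can call `rb_pop` without any risk of priority inversion.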
And you know what? The same thing almost certainly goes for Swift. At the very least, its realtime behaviour is untried and totally undocumented, which isn’t a good sign.
But the lovely thing about Objective-C is that you can write C code right alongside Objective-C code, and directly access properties of Objective-C classes, with no additional cost. As far as I can tell, that’s really, really messy to do in Swift.
So, that’s that. Objective-C it is.
This week, I’ve been playing with custom on-workspace controls, and sequence tracks (which are going to be really neat).
This week, I’m talking about track layers and track editing: importing, exporting, comping, loop slicing, per-waveform muting, and a bunch of other stuff.
Whoa. Ringleader over on the Loopy forum made an incredible suggestion I wanted to share with you verbatim:
Maybe a sequencer track type that could be added as an option on the design screen just like any other track? Maybe include both a “live” and more manual “step” way of configuring the sequence.
It would be cool if the sequencer track(s) could be pre-configured and saved in a song’s session template, then live record the audio to the same track locations as set in the sequence, so the main song format is all pre-set but all of the audio is still recorded live per performance.
Or choose to put the entire song on rails where you use one midi button to step through all of the pre-set sequenced steps which would take care of all the track managing for you! Or maybe this is all already possible in the new binding configuration you’ve already developed…
…and then the pièce de résistance:
If the sequence track behaved like any other track type and is located on the workspace, then it could use other track options (such as having a group of sequencer tracks where only one plays at a time) and be easily accessible and provide some live automation control for people who don’t use midi bindings.
Since Loopy MPE is now essentially a multi-channel/track DAW, and tracks aren’t all being combined and dumped into a stereo output, having the tracks sequenced would allow you to build song parts separately on different tracks and have them play back as added/removed musical layers. Add the shaker and layered vocals to the last chorus only. Add that quirky ambient intro guitar solo behind the second verse but not the first. A sequence could make this type of stuff easy to do in a live setting without tap dancing and greatly remove the number of buttons required on a midi controller, while still providing a level of spontaneity that baked pre-recorded tracks do not.
Regarding the other control options (faders) you are planning to add to the workspace, maybe those could be automated too so that effects could also be controlled by the sequence. The sequencer track could be the solution that takes live looping to an entirely new level – allowing one to focus more on the performance and audience instead of managing the looper. Powerful stuff!
How cool would that be? A gold freakin’ star to Ringleader.
Here’s a look at my many failed attempts to figure out audio routing in Masterpiece. Tricky business! Feedback welcome.
I mentioned Sebastian Dittmann; he’s @dittsn on Twitter.
Here are some of my favourite new suggestions from people:
- Tristan Maduro proposed clip-based automation (which is stored and plays back with the loop, as opposed to on the global timeline).
- Lushr proposed iterative, looping takes, where you continually record over and over, and then pick the best takes from those recordings.
- Oliver proposed on-screen controls to send MIDI CCs out to other devices and apps.
- Cimon and CPRophy suggested effects that feed back into track input, for some twisted track degeneration.
I’ve also had heaps of requests for loop slicing and manipulation, which sounds cool. Might need to be a separate app, though (although with iOS 8’s extensions, it’d integrate really nicely).
Finally, my favourite entreaties have been to maintain Loopy’s usability and simplicity in Masterpiece. People have complimented Loopy’s ease of use, and a few have emailed me to urge caution: don’t overcomplicate. That’s priority number one for me: there’s no point having a tool with a billion widgets, wodgets and thromdimulators if it’s a chore to use.
I’ve asked those voices of restraint to help keep me in check during this design stage, a request which goes for you guys too! I do have a tendency to run headlong into complexity, as you’ll see from my rather catastrophic audio routing mockups next week, so I’d appreciate the feedback in the weeks to come. Be brutal (lord knows, Sebastian will be ;-)).