Audiobus action on Tumblr
I’m blogging about Audiobus’s development and other bits and pieces over on the [Audiobus Tumblr blog](http://audiobus.tumblr.com).
If you’re interested to see what I’m up to, do join me over there.
It’s called “Audiobus”, and — yep, them’s big words — it’s going to change the way people create music on iOS.
Here’re some mockups of the main interface…
[Subscribe here](http://audiob.us) for more news about Audiobus as it happens.
Google App Engine is a fantastic platform for hosting webapps, and a great resource for iOS developers who need an online component to their products. It’s hard to believe the service is essentially free! I’m using it with [The Cartographer](http://cartographer-app.com), but I found myself coming up against a hard limit with the datastore.
You see, the [datastore](http://code.google.com/appengine/docs/python/datastore/overview.html) limits entities to 1 MB. I’m trying to store XML data in there, and sometimes that can exceed the 1 MB limit.
XML, verbose creature that it is, compresses very nicely, so it occurred to me that if I selectively compress the larger blocks, I should be able to squeeze in underneath the limit quite easily. Sure enough, a 1.6 MB XML block compressed to about 200 KB.
App Engine makes it very easy to define custom properties on data models, so I’ve written a CompressibleTextProperty class that automatically compresses and decompresses properties above a certain size. This means there’s no performance loss for entities that are small enough to fit easily, but it still enables the storage of bigger blocks of content.
The alternative was to break the content up into several different datastore entities, but that sounded like much more work, and far less elegant.
So here’s what I came up with — it’s used the same way the other [Property types](http://code.google.com/appengine/docs/python/datastore/datamodeling.html#Property_Classes_and_Types) are used.
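The heart of such a property is the selective compress/decompress step. Here’s a minimal, SDK-independent sketch of that logic — the threshold, marker, and helper names are illustrative, not taken from the original class:

```python
import zlib

# Hypothetical cut-off: only compress values bigger than this.
COMPRESSION_THRESHOLD = 512 * 1024

# Marker prepended to compressed values so we can tell them apart on read.
MARKER = b"\x00zlib\x00"

def pack(text):
    """Compress the text only if it exceeds the threshold."""
    data = text.encode("utf-8")
    if len(data) > COMPRESSION_THRESHOLD:
        return MARKER + zlib.compress(data)
    return data

def unpack(stored):
    """Reverse pack(): decompress only if the marker is present."""
    if stored.startswith(MARKER):
        return zlib.decompress(stored[len(MARKER):]).decode("utf-8")
    return stored.decode("utf-8")
```

In a custom App Engine property class, the `pack()` step would run when the value is written to the datastore and `unpack()` when it’s read back, so callers never see the compressed form.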
Oddly, iOS doesn’t provide any OpenSSL implementation at all: if you want to do anything with crypto (like checking signatures, checksumming, etc.), you have to build the library in yourself.
I came across a great [Xcode project wrapper](https://github.com/sjlombardo/openssl-xcode) for OpenSSL yesterday, by Stephen Lombardo. It’s an Xcode project file that contains a target to build OpenSSL from source, and it works with both Mac and iOS projects. I made some [modifications](https://github.com/michaeltyson/openssl-xcode) to it so that it works by just dropping in the OpenSSL source tarball, without having to dirty up your source tree with the extracted OpenSSL distribution.
Here’s how to use it:

1. Add openssl.xcodeproj to your project, along with the OpenSSL source tarball (I keep them at `Library/openssl` within my project tree).
2. On the Build tab for your project’s target, find the “Header Search Paths” option, and add the path: `$(SRCROOT)/Library/openssl/build/openssl.build/openssl/include` (assuming you’ve put openssl.xcodeproj at the path `Library/openssl`; adjust as necessary).
3. Import and use as normal (`#import <openssl/dsa.h>`, etc.).
[Download it here](https://github.com/michaeltyson/openssl-xcode/zipball/master)
On the iPhone, when you are doing anything that uses the network, you’re supposed to let the user know something’s going on, via -[UIApplication setNetworkActivityIndicatorVisible:]. This takes a boolean.
That’s all well and good, but if you have more than one object in your app that may use the network simultaneously, the calls will clobber each other: one object hides the indicator while another still needs it shown.
A nice and easy solution: maintain an activity counter in a category on UIApplication, which shows or hides the indicator as appropriate. Then, whenever you start doing something with the network:
[[UIApplication sharedApplication] showNetworkActivityIndicator];
…And when you’re done:
[[UIApplication sharedApplication] hideNetworkActivityIndicator];
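The show/hide pair just increments and decrements a counter, and only hides the indicator when the count reaches zero. Here’s that logic as a plain C sketch — the real thing is an Objective-C category, and setIndicatorVisible() here stands in for -[UIApplication setNetworkActivityIndicatorVisible:]:

```c
#include <stdbool.h>

int activityCount = 0;
bool indicatorVisible = false;

// Stand-in for -[UIApplication setNetworkActivityIndicatorVisible:]
void setIndicatorVisible(bool visible) {
    indicatorVisible = visible;
}

void showNetworkActivityIndicator(void) {
    activityCount++;
    setIndicatorVisible(true);
}

void hideNetworkActivityIndicator(void) {
    if (activityCount > 0) activityCount--;
    // Only hide once every caller that showed it has hidden it again
    setIndicatorVisible(activityCount > 0);
}
```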
Here’s a category that’ll do it:
I got an email today with a question about how to handle playback of audio in time, synchronised with a clock. My ‘musical notepad’ app [Loopy](http://atastypixel.com/products/loopy) does this, and I thought I’d briefly explain how.
Any app that makes use of the [Remote IO](http://atastypixel.com/using-remoteio-audio-unit/) audio unit (generally necessary for the kind of responsiveness required in a realtime musical app) provides audio to the hardware via a callback, which is invoked periodically whenever the hardware is ready for more.
The trick here is to provide the right chunk of samples in this callback for the current time position.
Loopy achieves this by:
1. Working out where we are in musical time. This is easily accomplished by keeping a record of the time the clock was started, subtracting it from the current time, and performing a modulus with the bar length. For example, `(now - startTime) % timePerBar` gives the number of time units into the current bar (let’s call it `timeIntoBar`). `timeIntoBar / (timePerBar/beatsPerBar)` gives the number of beats into the current bar, and `timeIntoBar % (timePerBar/beatsPerBar)` gives us the time into the current beat.
2. Converting that position to samples. This involves converting our time units from step 1 into samples. For instance, you can convert microseconds to samples by dividing your time by `1000000/yourSampleRate`. (Of course, you can convert back from samples to time by multiplying instead of dividing.)
3. Working out which part of the sound, if any, to play. In the case of Loopy’s metronome, for example, we test whether `samplesIntoBeat < sound.lengthInSamples`. If so, we should be playing audio. (If the sound were a loop, of course, we could always be playing.) The offset into the sound, in samples, is just `samplesIntoBeat` in the simple metronome case. In the case of a loop, you’ll be more interested in the number of samples into the loop, so instead of determining `(now - startTime) % timePerBar`, you may want `(now - startTime) % timePerLoop`.

So, we want to return the requested number of samples, starting from this offset into the sample array representing our audio.
Note that if you just go returning any old set of samples, willy-nilly, you’re going to get nasty clicks and pops from the discontinuities caused by not matching the start of your next buffer to the end of the last one.
To ensure smoothness, Loopy keeps track of the offset of the last samples returned, and just returns the immediately following bunch of samples — unless we’re more than some threshold number of samples out of time, in which case we’ll suffer the pop in order to stay synchronised. Actually, you can generally avoid the pop entirely by smoothly blending buffers over a short time, removing the discontinuity.
The example above was a relatively trivial one, for a metronome sound. For longer audio that may span multiple bars, you’ll probably want to perform a modulus by the length of your audio clip, possibly quantised to your time signature, and possibly using a per-loop time base, so you can start the loop at any point in your timeline and have it begin from the start. This is something Loopy doesn’t currently do — Loopy will keep your loops synchronised so when you start a loop playing, it’ll play whatever part corresponds to the current timeline, not from the start of the loop. Maybe it’ll be an option in the future?
I wrote a little about the timing of loops in my [second article on Loopy’s implementation](http://atastypixel.com/developing-loopy-part-2-implementation/).
Just a quick one: this may be obvious to many devs, but it’s worth noting. One common and useful [debugging technique](http://www.cocoadev.com/index.pl?DebuggingTechniques) is breaking on exceptions, so that you can see exactly where in your app’s flow an exception occurs.
This can be done by adding -[NSException raise] and objc_exception_throw to your breakpoints list.
Once an exception happens, you can then check out the exception itself to see what went wrong. The approach varies between platforms. If you’re in the simulator (or any Mac OS X app running on Intel), the exception will be stored in the $eax register. Take a look by typing:
po $eax
If you’re on the iPhone, it’ll be $r0, so:
po $r0
I recently wrote a custom view — a 3D vintage-looking pull lever — that provided a continuous property to control its state. I wanted to animate this smoothly, à la CABasicAnimation, but couldn’t find a built-in way to do so.
So, I wrote a class that provides similar functionality to CABasicAnimation, but works on any object. I thought I’d share it.
Features:

- Works with NSNumber and scalar numeric types, but easily extendable
- Supports chained animations (assign a second animation to chainedAnimation, and it’ll be fired once the first animation completes)
- Uses CADisplayLink if available, to update in sync with screen updates

Use it like this:
```objc
- (void)startMyAnimation {
    TPPropertyAnimation *animation = [TPPropertyAnimation propertyAnimationWithKeyPath:@"state"];
    animation.toValue = [NSNumber numberWithFloat:1.0]; // fromValue is taken from current value if not specified
    animation.duration = 1.0;
    animation.timing = TPPropertyAnimationTimingEaseIn;
    [animation beginWithTarget:self];
}
```
Make sure you also include the QuartzCore framework, which is used to access CADisplayLink, if it’s available.
It’s BSD-licensed.
Grab it here: TPPropertyAnimation.zip