
Development

Audiobus action on Tumblr

I’m blogging about Audiobus’s development and other bits and pieces over on the Audiobus Tumblr blog.

If you’re interested in seeing what I’m up to, do join me over there.

Read More

Some in-progress screenshots of my new project

It’s called “Audiobus”, and — yep, them’s big words — it’s going to change the way people create music on iOS.

Here’re some mockups of the main interface…

Audio Bus Mockup 1

Subscribe here for more news about Audiobus as it happens.

Read More

Breaking the limits: Storing data bigger than 1 MB in Google App Engine’s Datastore

Google App Engine is a fantastic platform for hosting webapps, and a great resource for iOS developers who need an online component to their products. It’s hard to believe that the service is essentially free! I’m using it with The Cartographer, but I found myself coming up against a hard limit with the datastore.

You see, the datastore limits entities to 1 MB. I’m trying to store XML data in there, and sometimes that can exceed the 1 MB limit.

XML, being the verbose creature that it is, compresses very nicely, so it occurred to me that if I selectively compressed the larger blocks, I should be able to squeeze in underneath the limit quite easily. Sure enough, a 1.6 MB XML block compressed to about 200 KB.

App Engine makes it very easy to define custom properties on data models, so I’ve written a CompressibleTextProperty class that automatically compresses/decompresses properties above a certain size. This means there’s no performance cost for entities that are small enough to fit comfortably, while still allowing bigger blocks of content to be stored.

The alternative was to split the data up across several database entities, but that sounded like much more work, and much less elegant.

So here’s what I came up with — it’s used the same way the other Property types are used.
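The class itself is behind the link below. As a rough illustration of the idea (not the original code), a custom property can compress values on the way into the datastore and decompress them on the way out; the threshold, marker prefix and internals here are my own assumptions, using the Python db API:

import zlib
from google.appengine.ext import db

# Sketch only: store text verbatim when it's small, zlib-compress it above a threshold.
_COMPRESSION_THRESHOLD = 512 * 1024   # hypothetical cut-off, in bytes
_COMPRESSED_MARKER = '\x00zlib\x00'   # hypothetical prefix marking compressed values

class CompressibleTextProperty(db.Property):
    data_type = db.Text

    def get_value_for_datastore(self, model_instance):
        value = super(CompressibleTextProperty, self).get_value_for_datastore(model_instance)
        if value is None:
            return None
        raw = unicode(value).encode('utf-8')
        if len(raw) < _COMPRESSION_THRESHOLD:
            return db.Text(value)
        return db.Blob(_COMPRESSED_MARKER + zlib.compress(raw))

    def make_value_from_datastore(self, value):
        if value is None:
            return None
        if value.startswith(_COMPRESSED_MARKER):
            return db.Text(zlib.decompress(value[len(_COMPRESSED_MARKER):]), encoding='utf-8')
        return db.Text(value)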

Read More

Easy inclusion of OpenSSL into iOS projects

Oddly, iOS doesn’t provide any OpenSSL implementation at all — if you want to do anything with crypto (checking signatures, checksumming, etc.), you have to build the library in yourself.

I came across a great Xcode project wrapper for OpenSSL yesterday, by Stephen Lombardo. It’s an Xcode project file that contains a target to build OpenSSL from source, and it works with both Mac and iOS projects. I made some modifications to it so that it works by just dropping in the OpenSSL source tarball, without having to dirty up your source tree with the extracted OpenSSL distribution.

Here’s how to use it:

  1. Download the OpenSSL source.
  2. Put the downloaded OpenSSL source tar.gz into the same folder
    as openssl.xcodeproj (I put it in Library/openssl within my project tree).
  3. Drag the openssl.xcodeproj file into your main project tree in Xcode.
  4. Right-click on your project target, and add openssl.xcodeproj under “Direct
    Dependencies” on the General tab.
  5. On the Build tab for your project’s target, find the “Header Search Paths”
    option, and add the path:
    > $(SRCROOT)/Library/openssl/build/openssl.build/openssl/include

    (Assuming you’ve put openssl.xcodeproj at the path Library/openssl — adjust as necessary).

  6. Expand your target’s “Link Binary With Libraries” build stage, and drag
    libcrypto.a from the openssl.xcodeproj group.

Then you can just import the OpenSSL headers and use the library as normal (#import, etc.); there’s a small example below.
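For example, a quick sanity check that libcrypto is linked correctly is to hash something with OpenSSL’s SHA1 (this little helper is my own sketch, not part of the wrapper project):

#import <Foundation/Foundation.h>
#include <openssl/sha.h>

// Compute the SHA-1 digest of some data via OpenSSL's libcrypto, as a hex string.
static NSString *SHA1HexDigest(NSData *data) {
  unsigned char digest[SHA_DIGEST_LENGTH];
  SHA1(data.bytes, data.length, digest);

  NSMutableString *hex = [NSMutableString stringWithCapacity:SHA_DIGEST_LENGTH * 2];
  for ( int i = 0; i < SHA_DIGEST_LENGTH; i++ ) {
    [hex appendFormat:@"%02x", digest[i]];
  }
  return hex;
}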

Download it here

Read More

Avoid the clobber: Nest your network activity indicator updates

On the iPhone, when you’re doing anything that uses the network, you’re supposed to let the user know something’s going on, via -[UIApplication setNetworkActivityIndicatorVisible:], which takes a boolean.

That’s all well and good, but if you have more than one object in your app that may do things with the network simultaneously, you’re going to clobber yourself.

A nice and easy solution: keep an activity counter in a category on UIApplication, and show or hide the indicator as the counter rises above or falls back to zero. Then, whenever you start doing something with the network:

[[UIApplication sharedApplication] showNetworkActivityIndicator];

…And when you’re done:

[[UIApplication sharedApplication] hideNetworkActivityIndicator];

Here’s a category that’ll do it:
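The full listing is behind the link below; as a rough sketch of the idea (the category name and counter variable here are mine, the method names are the ones used above), it boils down to something like this:

#import <UIKit/UIKit.h>

@interface UIApplication (NetworkActivity)
- (void)showNetworkActivityIndicator;
- (void)hideNetworkActivityIndicator;
@end

@implementation UIApplication (NetworkActivity)

// Count of outstanding network operations; the indicator stays visible while it's above zero.
static NSInteger networkActivityCount = 0;

- (void)showNetworkActivityIndicator {
  @synchronized(self) {
    networkActivityCount++;
    self.networkActivityIndicatorVisible = YES;
  }
}

- (void)hideNetworkActivityIndicator {
  @synchronized(self) {
    if ( networkActivityCount > 0 ) networkActivityCount--;
    self.networkActivityIndicatorVisible = (networkActivityCount > 0);
  }
}

@end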

Read More

Playing audio in time using Remote IO

I got an email today with a question about how to handle playback of audio in time, synchronised with a clock. My ‘musical notepad’ app Loopy does this, and I thought I’d briefly explain how.

Any app that makes use of the Remote IO audio unit framework (which is generally necessary for the kind of responsiveness required in a realtime musical app) provides audio to the hardware via a callback, which is periodically called when the hardware is ready for more.

The trick here is to provide the right chunk of samples in this callback for the current time position.
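For reference, the render callback itself has a shape like this (a bare skeleton of my own, not Loopy’s actual callback; the steps below are about deciding which samples to copy into ioData):

#import <AudioUnit/AudioUnit.h>
#include <string.h>

// A Remote IO render callback: the hardware asks for inNumberFrames frames,
// and we fill ioData with whatever should be audible right now.
static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
  for ( UInt32 i = 0; i < ioData->mNumberBuffers; i++ ) {
    // Start with silence; copy in the appropriate samples for the current time position.
    memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
  }
  return noErr;
}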

Loopy achieves this by:

1. Keeping track of where in the timeline we are at the time the callback is called

This is easily accomplished by keeping a record of the time the clock was started, subtracting this from the current time, and possibly taking the result modulo the bar length. For example (there’s a code sketch of this arithmetic after the list):

  • (now - startTime) % timePerBar gives the number of time units into the current bar (let’s call it timeIntoBar).
  • timeIntoBar / (timePerBar/beatsPerBar) gives the number of beats into the current bar, and
  • timeIntoBar % (timePerBar/beatsPerBar) gives us the time into the current beat.
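Sketched as a small helper of my own, with times in microseconds and the variable names from the bullets above:

#include <stdint.h>

// Work out where we are in the timeline, given the current time and the time
// the clock was started (all values in microseconds).
static void timelinePosition(uint64_t now, uint64_t startTime,
                             uint64_t timePerBar, int beatsPerBar,
                             uint64_t *timeIntoBar, int *beatsIntoBar, uint64_t *timeIntoBeat) {
  uint64_t elapsed = now - startTime;
  uint64_t timePerBeat = timePerBar / beatsPerBar;
  *timeIntoBar = elapsed % timePerBar;               // position within the current bar
  *beatsIntoBar = (int)(*timeIntoBar / timePerBeat); // which beat of the bar we're on
  *timeIntoBeat = *timeIntoBar % timePerBeat;        // position within that beat
}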

2. Determining first if we should be playing audio at this time, and if so, which samples should be playing

This involves first converting our time units from step 1 into samples. For instance, you can convert microseconds to samples by dividing your time by 1000000/yourSampleRate. Aside: Of course, you can convert back from samples to time by multiplying instead of dividing.

Next, in the case of Loopy’s metronome, for example, we test whether samplesIntoBeat < sound.lengthInSamples. If so, that means we should be playing audio. If the sound were a loop, of course, it would always be playing.

The offset into the sound, in samples, is just samplesIntoBeat, in the case of the simple metronome. In the case of a loop, you probably will be more interested in the number of samples into your loop — so instead of determining (now - startTime) % timePerBar, you may be interested in (now - startTime) % timePerLoop.

So, we want to return the requested number of samples starting from this offset into the sample array representing our audio.
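In code, the conversion and the metronome test from this step come out something like this (my own sketch; soundLengthInSamples stands in for the real sound object’s length):

#import <Foundation/Foundation.h>
#include <stdint.h>

// Convert a time offset in microseconds to a sample offset, as described above.
static uint64_t microsecondsToSamples(uint64_t microseconds, double sampleRate) {
  return (uint64_t)(microseconds / (1000000.0 / sampleRate));
}

// For the metronome: should we be playing right now, and if so, from which
// offset into the sound's sample array?
static BOOL metronomeOffset(uint64_t timeIntoBeat, double sampleRate,
                            uint64_t soundLengthInSamples, uint64_t *offsetInSamples) {
  uint64_t samplesIntoBeat = microsecondsToSamples(timeIntoBeat, sampleRate);
  if ( samplesIntoBeat >= soundLengthInSamples ) return NO; // past the end of the sound: silence
  *offsetInSamples = samplesIntoBeat;                       // start reading from here
  return YES;
}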

3. Returning smooth audio in time

Note that if you just return any old set of samples, willy-nilly, you're going to get nasty clicks and pops from the discontinuities caused by not matching the start of your next buffer to the end of the last one.

To ensure smoothness, Loopy keeps track of the offset of the last samples we returned, and just returns the immediately following bunch of samples — unless we're more than some threshold number of samples out of time, in which case we'll suffer the pop in order to stay synchronised. Actually, you can even generally avoid the pop if you smoothly blend buffers over a short time, removing any discontinuity.
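That bookkeeping might look roughly like this (names and threshold handling are my own sketch, not Loopy's actual code):

#include <stdint.h>

// Decide where to read the next buffer from: normally just continue on from
// wherever the last callback left off, but if that has drifted too far from
// the 'correct' position, jump back into sync (and ideally crossfade over the join).
static uint64_t nextReadOffset(uint64_t idealOffset, uint64_t lastOffset,
                               uint32_t framesJustRendered, uint64_t driftThreshold) {
  uint64_t continuation = lastOffset + framesJustRendered;
  uint64_t drift = (continuation > idealOffset ? continuation - idealOffset
                                               : idealOffset - continuation);
  if ( drift <= driftThreshold ) {
    return continuation;  // stay continuous: no clicks or pops
  }
  return idealOffset;     // too far out of time: resynchronise (crossfade to hide the jump)
}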

Final words

The example above was a relatively trivial one, for a metronome sound. For longer audio that may span multiple bars, you'll probably want to perform a modulus by the length of your audio clip, possibly quantised to your time signature, and possibly using a per-loop time base, so you can start the loop at any point in your timeline and have it begin from the start. This is something Loopy doesn't currently do — Loopy will keep your loops synchronised so when you start a loop playing, it'll play whatever part corresponds to the current timeline, not from the start of the loop. Maybe it'll be an option in the future?

I wrote a little about the timing of loops in my second article on Loopy's implementation.

Read More

iPhone debugging tip: Breaking on exceptions and reading their content

Just a quick one: this may be obvious to many devs, but it’s worth noting. One common and useful debugging technique is breaking on exceptions, so that you can see exactly where in your app’s flow an exception occurs.

This can be done by adding -[NSException raise] and objc_exception_throw to your breakpoints list.
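If you’d rather set these from the gdb console than from Xcode’s breakpoints window, something like this should do it:

(gdb) break objc_exception_throw
(gdb) break -[NSException raise]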

Once an exception happens, you can then check out the exception itself to see what went wrong. The approach varies between platforms. If you’re in the simulator (or any Mac OS X app running on Intel), the exception will be stored in the $eax register. Take a look by typing:

po $eax

If you’re on the iPhone, it’ll be $r0, so:

po $r0

Read More

iPhone/Mac animation for custom classes: Property animation for more than just CALayer

I recently wrote a custom view — a 3D vintage-looking pull lever — that provides a continuous property to control its state. I wanted to animate this smoothly, à la CABasicAnimation, but couldn’t find a built-in way to do so.

So, I wrote a class that provides similar functionality to CABasicAnimation, but works on any object. I thought I’d share it.

Features:

  • From/to value settings (currently only supports NSNumber and scalar numeric types, but easily extendable)
  • Duration, delay settings
  • Timing functions: Linear, ease out, ease in, and ease in/ease out
  • Animation chaining (specify another configured animation for chainedAnimation, and it’ll be fired once the first animation completes; there’s a brief sketch of this below)
  • Delegate notification of animation completion
  • Uses a single timer for smooth lock-step animation
  • Uses CADisplayLink if available, to update in sync with screen updates

Use it like this:

- (void)startMyAnimation {
  TPPropertyAnimation *animation = [TPPropertyAnimation propertyAnimationWithKeyPath:@"state"];
  animation.toValue = [NSNumber numberWithFloat:1.0]; // fromValue is taken from current value if not specified
  animation.duration = 1.0;
  animation.timing = TPPropertyAnimationTimingEaseIn;
  [animation beginWithTarget:self];
}

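Chaining, from the feature list above, is just a matter of pointing one animation at the next. A brief sketch (the values here are arbitrary, and I'm assuming the chained animation runs against the same target; check the class itself for the details):

- (void)startMyChainedAnimation {
  TPPropertyAnimation *raise = [TPPropertyAnimation propertyAnimationWithKeyPath:@"state"];
  raise.toValue = [NSNumber numberWithFloat:1.0];
  raise.duration = 0.5;

  TPPropertyAnimation *lower = [TPPropertyAnimation propertyAnimationWithKeyPath:@"state"];
  lower.toValue = [NSNumber numberWithFloat:0.0];
  lower.duration = 0.5;

  raise.chainedAnimation = lower; // 'lower' is fired automatically once 'raise' completes
  [raise beginWithTarget:self];
}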
Make sure you also include the QuartzCore framework, which is used to access CADisplayLink where it’s available.

It’s BSD-licensed.

Grab it here:
TPPropertyAnimation.zip

Read More

Hi! I'm Michael Tyson, and I run A Tasty Pixel from our home in the hills of Melbourne, Australia. I occasionally write on a variety of technology and software development topics. I've also spent 3.5 years travelling around Europe in a motorhome.

I make Loopy, the live-looper for iOS, Audiobus, the app-to-app audio platform, and Samplebot, a sampler and sequencer app for iOS.

Follow me on Twitter.
