
Working with Audio on the iPhone

Cars make noise, and a '59 Cadillac certainly does not disappoint in that respect. So in this section, I show you how to add some sound to the RoadTrip app so that everyone can hear your car coming down the road.

In this section, you use two of the different ways to implement audio that are available in iOS. The first is an instance of the AVAudioPlayer class, called an audio player, which provides playback of audio data from a file or memory. You use this class unless you are playing audio captured from a network stream or require very low I/O latency. This class offers quite a lot of functionality, including playing sounds of any duration, looping sounds, and playing multiple sounds simultaneously (one sound per audio player) with precise synchronization among all the players in use. It also gives you control over the relative playback level, stereo positioning, and playback rate of each sound you are playing.

The AVAudioPlayer class lets you play sound in any audio format available in iOS. You implement a delegate to handle interruptions (such as an incoming phone call) and to update the user interface when a sound has finished playing. The delegate methods to use are described in AVAudioPlayerDelegate Protocol Reference.

The second way to play sound is by using System Sound Services, which provides a way to play short sounds and make the device vibrate.

You can use System Sound Services to play short (30 seconds or shorter) sounds. The interface does not provide level, positioning, looping, or timing control and does not support simultaneous playback: You can play only one sound at a time. You can use System Sound Services to provide audible alerts. On some iOS devices, alerts can include vibration.

To add sound to your app, you start by adding the frameworks. Just follow these steps:

  1. In the Project navigator, select the Project icon at the top of the Project Navigator area (RoadTrip) to display the Project editor.
  2. In the TARGETS section, select RoadTrip.
  3. In the Summary tab, scroll down to the Linked Frameworks and Libraries section.
  4. If the Linked Frameworks and Libraries section is not already expanded, expand it by clicking the disclosure triangle.
  5. Click the + (plus sign) button underneath the list of current project frameworks.
    A list of frameworks appears.
  6. Scroll down and select the AVFoundation.framework and AudioToolbox.framework from the list of frameworks.
  7. Click the Add button.
    You see the frameworks added to the Linked Frameworks and Libraries section.
  8. Close the Linked Frameworks and Libraries section.
  9. In the Project navigator (don't do this from the Linked Frameworks and Libraries section!), drag the AVFoundation.framework and AudioToolbox.framework files to the Frameworks group.

The sound files you need are already in the Resources folder that you added to your project.

Tip:
You can use Audacity, a free, open source software for recording and editing sounds, to create your own sound files. It is available for Mac OS X, Microsoft Windows, GNU/Linux, and other operating systems.

You start by importing the audio player and System Sound Services headers, and then you add the instance variables you'll be using. To accomplish all this, add the bolded code in Listing 10-7 to RTViewController.m.

Listing 10-7: Updating the RTViewController Implementation

#import "RTViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface RTViewController () {

  AVAudioPlayer *backgroundAudioPlayer;
  SystemSoundID burnRubberSoundID;

}
@end

@implementation RTViewController

As you can see, I have you take advantage of the ability to declare instance variables in the implementation file, which keeps them hidden from users of the class.

Next, you need to set up the audio player and system sound services. Add the bolded code in Listing 10-8 to viewDidLoad in RTViewController.m.

Listing 10-8: Updating viewDidLoad

- (void)viewDidLoad
{
  [super viewDidLoad];
  self.title = @"Road Trip";

  NSURL* backgroundURL = [NSURL fileURLWithPath:
    [[NSBundle mainBundle] pathForResource:
			@"CarRunning" ofType:@"aif"]];
  backgroundAudioPlayer = [[AVAudioPlayer alloc]
	initWithContentsOfURL:backgroundURL error:nil];
  backgroundAudioPlayer.numberOfLoops = -1;
  [backgroundAudioPlayer prepareToPlay];

  NSURL* burnRubberURL = [NSURL fileURLWithPath:
    [[NSBundle mainBundle] pathForResource:
			@"BurnRubber" ofType:@"aif"]];
  AudioServicesCreateSystemSoundID((__bridge
		CFURLRef)burnRubberURL, &burnRubberSoundID);
}

In Listing 10-8, the first thing you do is load the sound file from the resources in your bundle:

NSURL* backgroundURL = [NSURL fileURLWithPath:
  [[NSBundle mainBundle] pathForResource:
			@"CarRunning" ofType:@"aif"]];

fileURLWithPath is an NSURL class method that initializes and returns an NSURL object as a file URL with a specified path. The NSURL class includes the utilities necessary for downloading files or other resources from web and FTP servers and from the file system.

The sound file you use is a resource, and pathForResource:ofType: is an NSBundle method that creates the path needed by the fileURLWithPath: method to construct the NSURL. Just give pathForResource:ofType: the name and the file type, and it returns the path that gets packed into the NSURL and loaded.

"What bundle?" you say? Well, when you build your iPhone application, Xcode packages it as a bundle - one containing the following:

  • The application's executable code
  • Any resources that the app has to use (for instance, the application icon, other images, and localized content - in this case, the plist, .html files, and .png files)
  • The RoadTrip-Info.plist, also known as the information property list, which defines key values for the application, such as bundle ID, version number, and display name

Tip:
Be sure that you provide the right file type; otherwise, this technique won't work.
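
If pathForResource:ofType: can't find a file with the name and type you supply, it returns nil, and handing that nil path to fileURLWithPath: raises an exception. A defensive version of the lookup (just a sketch; the listings in this chapter omit the check for brevity) might look like this:

NSString *backgroundPath = [[NSBundle mainBundle]
    pathForResource:@"CarRunning" ofType:@"aif"];
if (backgroundPath == nil) {
    // The file name or type is wrong, or the file isn't in the bundle
    NSLog(@"CarRunning.aif not found in the main bundle");
} else {
    NSURL *backgroundURL = [NSURL fileURLWithPath:backgroundPath];
    // Safe to create the audio player with backgroundURL here
}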

Next, create an instance of the audio player:

backgroundAudioPlayer = [[AVAudioPlayer alloc]
  initWithContentsOfURL:backgroundURL error:nil];

and initialize it with the location (an NSURL) of the audio file, passing nil to ignore any errors. Then set the number of loops to -1 (which causes the audio file to play continuously until you stop it) and tell the player to get ready to play:

backgroundAudioPlayer.numberOfLoops = -1;
[backgroundAudioPlayer prepareToPlay];

prepareToPlay prepares the audio player for playback by preloading its buffers; it also acquires the audio hardware needed for playback. This preloading minimizes the lag between calling the play method and the start of sound output. Without this preloading, although the player would still play when you send the play message (later) in viewDidLoad, you'll likely notice a lag as it sets up its buffers.

Similarly, you set up the NSURL for the BurnRubber sound:

NSURL* burnRubberURL = [NSURL fileURLWithPath:
  [[NSBundle mainBundle] pathForResource:
			@"BurnRubber" ofType:@"aif"]];

You then call a C function from the Audio Toolbox framework to create a system sound object that you later use to play the sound:

AudioServicesCreateSystemSoundID((__bridge
		CFURLRef)burnRubberURL, &burnRubberSoundID);

CFURLRef is a Core Foundation type, and ARC does not automatically manage the lifetimes of Core Foundation objects. And although you could use the Core Foundation memory management rules and functions, you don't need to do that here. That's because all you are doing is casting an Objective-C object to a Core Foundation type, and you won't need to use any Core Foundation memory management in your code. You do have to let the compiler know about the memory management implications, however, so you need to use the __bridge cast.

In testDrive, you play both of the sounds created so far. To do so, add the bolded code in Listing 10-9 to testDrive in RTViewController.m.

Listing 10-9: Updating testDrive

- (IBAction)testDrive:(id)sender {

  AudioServicesPlaySystemSound(burnRubberSoundID);
  [self performSelector:@selector(playCarSound)
			withObject:nil afterDelay:.2];

  CGPoint center = CGPointMake(car.center.x,
			  self.view.frame.origin.y +
		car.frame.size.height/2 );

  void (^animation)() = ^(){

    car.center = center;
  };

  void (^completion)(BOOL) = ^(BOOL finished){
    [self rotate];
  };

  [UIView animateWithDuration:3 animations:animation
		completion:completion];
}

You also need to add the code in Listing 10-10.

Listing 10-10: Adding playCarSound

- (void)playCarSound {

  [backgroundAudioPlayer play];
}

You play the BurnRubber sound first, followed by the CarRunning sound. If you don't wait until the BurnRubber sound is complete before you play the CarRunning sound, the BurnRubber sound gets drowned out by the CarRunning sound.

To play the BurnRubber sound, you use a function call to system sound services:

AudioServicesPlaySystemSound(burnRubberSoundID);

After this sound is done, you start the CarRunning sound by using a very useful method that enables you to send the message to start the audio player after a delay. That method is performSelector:withObject:afterDelay:, and it looks like this:

[self performSelector:@selector(playCarSound)
			withObject:nil afterDelay:.2];

performSelector:withObject:afterDelay: sends a message that you specify to an object after a delay. The method you want invoked should have no return value, and should have zero or one argument.

In Listing 10-10, this method meets these rules:

- (void)playCarSound {

  [backgroundAudioPlayer play];
}

@selector(playCarSound) is a compiler directive that returns a selector for a method name. A selector is the name used to select a method to execute for an object; it becomes a unique identifier when the source code is compiled.

Selectors really don't do anything by themselves. What makes a selector different from a plain string is that the compiler makes sure that selectors are unique. Selectors are useful because at runtime a selector acts like a dynamic function pointer that, for a given name, automatically points to the implementation of a method appropriate for whichever class it's used with.
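
For example, a selector can be stored in a variable of type SEL and invoked dynamically. (This little illustration is hypothetical and isn't part of the RoadTrip code.)

SEL soundSelector = @selector(playCarSound);
if ([self respondsToSelector:soundSelector]) {
    // Invoke the method the selector names on this object
    [self performSelector:soundSelector];
}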

withObject: is the argument to pass to the method when it is invoked. In this case, you are passing nil because the method does not take an argument.

afterDelay: is the minimum time before which the message is sent. Specifying a delay of 0 does not necessarily cause the selector to be performed immediately. When you send the performSelector:withObject:afterDelay: message, you specify .2 seconds because that is the duration of the BurnRubber sound.

Sometimes you may need to cancel a pending selector. The NSObject class method cancelPreviousPerformRequestsWithTarget: cancels perform requests previously scheduled with performSelector:withObject:afterDelay: for a given target.
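
For example, if you wanted to cancel the pending playCarSound message before it fires (the RoadTrip code never needs to, so this is purely illustrative), you could write:

// The object: argument must match the withObject: argument
// used when the perform request was scheduled.
[NSObject cancelPreviousPerformRequestsWithTarget:self
    selector:@selector(playCarSound) object:nil];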

Several other variations exist on the performSelector:withObject:afterDelay: method. Those variations are part of the NSObject class, which is the root class of most Objective-C class hierarchies. It provides the basic interface to the runtime system and the ability to behave as Objective-C objects.

Finally, to play the sound in the playCarSound method, you send the audio player the play message:

[backgroundAudioPlayer play];

The play message plays a sound asynchronously. If you haven't already sent the prepareToPlay message, play sends it for you as well (although you should expect a lag before the sound is played).

Next, you need to stop playing the sound in the continueRotation animation's completion block (or it gets really annoying). To stop playing the sound, add the bolded code in Listing 10-11 to continueRotation in RTViewController.m.

Listing 10-11: Updating continueRotation to Stop the Sound

- (void)continueRotation {

  CGAffineTransform transform =
	CGAffineTransformMakeRotation(-0);

  void (^animation)() = ^(){
    car.transform = transform;
  };

  void (^completion)(BOOL) = ^(BOOL finished){
    [backgroundAudioPlayer stop];
    [backgroundAudioPlayer prepareToPlay];
  };

  [UIView animateWithDuration:3 animations:animation
				completion:completion];
}

In the code in Listing 10-11, you also set the audio player up to play again. And there you have it. Run your project, and you'll notice some very realistic sound effects when you touch the TestDrive button.
