Audio Streaming in watchOS 6

One of the great new watchOS 6 features announced last week at WWDC is the ability to stream audio directly from the Apple Watch.

Playing audio on Apple Watch has evolved significantly, especially in the past two years.

This article shows you how to make use of the new AVPlayer functionality in watchOS 6, based on what I learned adding streaming support to Outcast.

Background

Prior to watchOS 5, the only way to play background audio on Apple Watch was with WKAudioFilePlayer.

When using this class, your app wouldn't actually run in the background, but it would register the file with the system so it would play accordingly (alongside this was the WKAudioFileQueuePlayer, which allowed you to schedule a playlist of files, rather than just a single file).

In watchOS 5, Apple added a background audio entitlement, as well as making the AVAudioPlayer class available. This meant that your audio app would continue to run in the background while playing, and it would receive callbacks when events occurred (such as the user tapping their AirPods to pause, or using the Now Playing app). It also meant you had finer control over what appeared in the Now Playing app.
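One thing the entitlement doesn't do on its own is route audio to Bluetooth headphones: on watchOS you also need to configure and activate an audio session with the long-form route sharing policy. That setup is outside the scope of this article, but here's a minimal sketch of what it might look like (treat the exact category and options as assumptions to verify against your own app):

import AVFoundation

let session = AVAudioSession.sharedInstance()

do {
    // Long-form audio (podcasts, music) requires this route sharing policy on watchOS
    try session.setCategory(.playback, mode: .default, policy: .longFormAudio)
} catch {
    print("Unable to configure the audio session: \(error)")
}

// Activation is asynchronous on watchOS; the system may prompt the user
// to choose a Bluetooth route before the completion handler runs
session.activate(options: []) { success, error in
    guard success else { return }
    // Safe to begin playback here
}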

When using AVAudioPlayer, you indicate the file to be played with a URL. However, on watchOS this had to be a local file URL (not a remote URL, as you would need for streaming).

In watchOS 6, the original WKAudioFilePlayer (and associated WKAudio* classes) are deprecated, but Apple has now made the AVPlayer class available. The AVURLAsset class is also available, meaning you can now stream audio directly on Apple Watch.

Creating the Player

Since the AVPlayer classes work the same way as they do on iOS, you can stream audio just as you would there; if you already have streaming code in an iOS app, you may be able to reuse it as-is.

If not, you can create the player as follows:

import AVFoundation

guard let url = URL(string: "http://example.com/file-to-stream.mp3") else {
    return
}

// An AVURLAsset can now reference a remote URL on watchOS
let asset  = AVURLAsset(url: url, options: nil)
let item   = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: nil)
let player = AVPlayer(playerItem: item)

player.play()

At this point, the file will start to buffer, and playback will commence once enough data has been buffered.
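This wait-and-buffer behaviour is governed by AVPlayer's automaticallyWaitsToMinimizeStalls property, which defaults to true. If you'd rather start immediately with whatever has buffered (and accept possible stalls), a sketch of the alternative:

// Default is true: the player waits until it believes it can play
// through without stalling
player.automaticallyWaitsToMinimizeStalls = true

// Or begin playback immediately with whatever is already buffered
player.playImmediately(atRate: 1.0)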

Additionally, you can subsequently pause and resume if required:

player.pause() // Pause the playback
player.play() // Resume playback. This may buffer initially even when resuming

Detecting Buffering

Much of the complexity in actually loading the playback buffer is handled by the system, but the user should still know when something (i.e. buffering) is happening.

For example, some visual feedback while the file is buffering (such as an activity spinner) would be quite useful.

In order to detect whether the file is playing, buffering or paused, you can check the player.timeControlStatus property:

switch player.timeControlStatus {
case .waitingToPlayAtSpecifiedRate:
    // Buffering; playback will start as soon as enough data is available
    break
case .playing:
    // The audio is currently playing
    break
case .paused:
    // The player is paused
    break
@unknown default:
    break
}
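One subtlety: .waitingToPlayAtSpecifiedRate doesn't always mean buffering. If you need to distinguish the cases, the player's reasonForWaitingToPlay property tells you why it's waiting. A quick sketch:

if player.timeControlStatus == .waitingToPlayAtSpecifiedRate {
    if player.reasonForWaitingToPlay == .toMinimizeStalls {
        // Genuinely buffering: a good time to show a spinner
    } else {
        // Waiting for some other reason, e.g. no item has been loaded yet
    }
}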

Since the file may move from buffering to playing at any time, you need to know when this status changes. The timeControlStatus property is key-value observable, meaning you can be notified when the value changes, as follows:

let observer = player.observe(\.timeControlStatus) { object, change in
    switch object.timeControlStatus {
    case .waitingToPlayAtSpecifiedRate:
        // Buffering began
        break
    case .playing:
        // Playback began
        break
    case .paused:
        // Pause began
        break
    @unknown default:
        break
    }
}

Note: You'll need to keep a reference to observer so it isn't deallocated.
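In practice, that means storing the returned NSKeyValueObservation in a property. Here's a minimal sketch of how that might look; the setSpinnerVisible method is a hypothetical placeholder for whatever loading indicator your UI uses:

import AVFoundation

class PlayerController {
    let player: AVPlayer
    private var statusObserver: NSKeyValueObservation?

    init(player: AVPlayer) {
        self.player = player

        // The observation lives as long as this property holds it
        self.statusObserver = player.observe(\.timeControlStatus) { [weak self] object, _ in
            let isBuffering = (object.timeControlStatus == .waitingToPlayAtSpecifiedRate)

            // KVO callbacks aren't guaranteed to arrive on the main thread
            DispatchQueue.main.async {
                self?.setSpinnerVisible(isBuffering)
            }
        }
    }

    private func setSpinnerVisible(_ visible: Bool) {
        // Hypothetical placeholder: show or hide your loading UI
    }
}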

Seeking to a New Location

If you want to fast forward or rewind, you can call the seek() method on the player. Note that this will likely trigger buffering, meaning the above observer comes into play again.

In Outcast, I use TimeInterval when handling playback positions. Because seek() uses the CMTime type (which is also now available on watchOS), you'll need to convert the TimeInterval to CMTime.

import Foundation
import CoreMedia

extension CMTime {
    var timeInterval: TimeInterval {
        return TimeInterval(CMTimeGetSeconds(self))
    }
}

extension TimeInterval {
    var cmTime: CMTime {
        // A timescale of 1,000,000 gives microsecond precision
        return CMTimeMakeWithSeconds(self, preferredTimescale: 1000000)
    }
}

These computed properties make it super-easy to convert between CMTime and TimeInterval.

If you want to seek forward 30 seconds, you would first retrieve the current playback offset, then add 30 seconds.

let offset = player.currentTime().timeInterval

let newOffset = offset + 30

player.seek(to: newOffset.cmTime)
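If you need to know when the seek has actually finished (say, to update a progress display only once the new position has taken effect), there's a variant of seek() that takes a completion handler:

player.seek(to: newOffset.cmTime) { finished in
    // finished is false if this seek was interrupted by another seek
    if finished {
        // Update your progress UI here
    }
}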

Detecting Playback Events

The AVPlayer class will trigger notifications for various events, such as when a file finishes playback.

This allows you to manage your playback queue, or display an error if something goes wrong.

let nc = NotificationCenter.default

let endTimeObserver = nc.addObserver(forName: Notification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { notification in
    // The item played through to its end; advance your queue here
}

let failedObserver = nc.addObserver(forName: Notification.Name.AVPlayerItemFailedToPlayToEndTime, object: nil, queue: nil) { notification in
    // Playback failed before reaching the end; surface the error to the user
    let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
}

let jumpedObserver = nc.addObserver(forName: Notification.Name.AVPlayerItemTimeJumped, object: nil, queue: nil) { notification in
    // The playback position changed unexpectedly (e.g. after a seek)
}

let stalledObserver = nc.addObserver(forName: Notification.Name.AVPlayerItemPlaybackStalled, object: nil, queue: nil) { notification in
    // Playback stopped because the buffer ran dry; it should resume automatically
}
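Two things to note about these observers. First, passing object: nil means you'll receive notifications for every player item; if you only care about the item you created earlier, pass it as the object parameter. Second, the returned tokens need to be retained, and removed when you're done:

// Scope the observer to a specific item rather than all items
let endObserver = nc.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: item, queue: nil) { notification in
    // Only fires for `item`
}

// Later, when tearing down the player:
nc.removeObserver(endObserver)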

Stopping Buffering

I couldn't conclusively find a way to end buffering, but some research (a fancy way of saying Google and Stack Overflow) indicated that removing the current item from the player will achieve this. Your mileage may vary.

player.replaceCurrentItem(with: nil)
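If you want to be more thorough, there are a couple of related cancellation APIs you could try alongside it; I haven't verified whether these make a difference on watchOS, so consider this a sketch:

// Cancel any in-flight seeks on the current item
player.currentItem?.cancelPendingSeeks()

// Cancel any outstanding loading on the underlying asset
player.currentItem?.asset.cancelLoading()

// Then drop the item entirely
player.replaceCurrentItem(with: nil)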

Summary

When TestFlight is open to external testers for watchOS 6, I plan to open up Outcast so you can try out podcast streaming.

Hopefully this helps with your own implementation! Good luck.

Further reading: