What's new for TV in Android 13

Android 13 adds further customizations to improve the user experience and increase compatibility with TV devices. Some of the highlights for this release include performance and quality improvements and continued advancements to how users interact with Android TV.

See the following sections for more information about what's included.

Performance & quality

Anticipatory audio routes

You can now anticipate which audio formats and attributes the active audio device supports and prepare tracks accordingly. Use getDirectPlaybackSupport() to check whether direct playback is supported on the currently routed audio device for a given format and attributes:

Kotlin

val format = AudioFormat.Builder()
    .setEncoding(AudioFormat.ENCODING_E_AC3)
    .setChannelMask(AudioFormat.CHANNEL_OUT_5POINT1)
    .setSampleRate(48000)
    .build()
val attributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .build()

if (AudioManager.getDirectPlaybackSupport(format, attributes) !=
    AudioManager.DIRECT_PLAYBACK_NOT_SUPPORTED
) {
    // The format and attributes are supported for direct playback
    // on the currently active routed audio path
} else {
    // The format and attributes are NOT supported for direct playback
    // on the currently active routed audio path
}

Java

AudioFormat format = new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_E_AC3)
        .setChannelMask(AudioFormat.CHANNEL_OUT_5POINT1)
        .setSampleRate(48000)
        .build();
AudioAttributes attributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .build();

if (AudioManager.getDirectPlaybackSupport(format, attributes) !=
        AudioManager.DIRECT_PLAYBACK_NOT_SUPPORTED) {
    // The format and attributes are supported for direct playback
    // on the currently active routed audio path
} else {
    // The format and attributes are NOT supported for direct playback
    // on the currently active routed audio path
}

Alternatively, you can query which profiles are supported for direct media playback through the currently routed audio device. This excludes any profiles that are unsupported or that would, for instance, be transcoded by the Android framework:

Kotlin

private fun findBestAudioFormat(audioAttributes: AudioAttributes): AudioFormat {
    val preferredFormats = listOf(
        AudioFormat.ENCODING_E_AC3,
        AudioFormat.ENCODING_AC3,
        AudioFormat.ENCODING_PCM_16BIT,
        AudioFormat.ENCODING_DEFAULT
    )
    val audioProfiles = audioManager.getDirectProfilesForAttributes(audioAttributes)
    val bestAudioProfile = preferredFormats.firstNotNullOf { format ->
        audioProfiles.firstOrNull { it.format == format }
    }
    val sampleRate = findBestSampleRate(bestAudioProfile)
    val channelMask = findBestChannelMask(bestAudioProfile)
    return AudioFormat.Builder()
        .setEncoding(bestAudioProfile.format)
        .setSampleRate(sampleRate)
        .setChannelMask(channelMask)
        .build()
}

Java

private AudioFormat findBestAudioFormat(AudioAttributes audioAttributes) {
    List<Integer> preferredFormats = Arrays.asList(
            AudioFormat.ENCODING_E_AC3,
            AudioFormat.ENCODING_AC3,
            AudioFormat.ENCODING_PCM_16BIT,
            AudioFormat.ENCODING_DEFAULT
    );
    List<AudioProfile> audioProfiles =
            audioManager.getDirectProfilesForAttributes(audioAttributes);
    // Walk the preferred encodings in order and pick the first profile
    // that the currently routed device supports directly.
    AudioProfile bestAudioProfile = preferredFormats.stream()
            .flatMap(format -> audioProfiles.stream()
                    .filter(profile -> profile.getFormat() == format))
            .findFirst()
            .orElseThrow(NoSuchElementException::new);
    int sampleRate = findBestSampleRate(bestAudioProfile);
    int channelMask = findBestChannelMask(bestAudioProfile);
    return new AudioFormat.Builder()
            .setEncoding(bestAudioProfile.getFormat())
            .setSampleRate(sampleRate)
            .setChannelMask(channelMask)
            .build();
}

In this example, preferredFormats is a list of AudioFormat encodings, ordered from most preferred to least preferred. getDirectProfilesForAttributes() returns the list of AudioProfile objects supported by the currently routed audio device for the supplied AudioAttributes. The preferred encodings are iterated through until a matching supported AudioProfile is found and stored as bestAudioProfile. Optimal sample rates and channel masks are then determined from bestAudioProfile, and finally an appropriate AudioFormat instance is created.
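
For example, the resulting AudioFormat might be used to configure an AudioTrack for direct playback. The following is a minimal sketch that assumes the findBestAudioFormat() helper above and AudioAttributes similar to the earlier example:

Kotlin

val mediaAttributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .build()

// Pick the best directly supported format for the current route
// and build an AudioTrack that uses it.
val bestFormat = findBestAudioFormat(mediaAttributes)
val audioTrack = AudioTrack.Builder()
    .setAudioAttributes(mediaAttributes)
    .setAudioFormat(bestFormat)
    .build()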

HDMI state surfaced to MediaSession

HDMI state changes are now surfaced to the MediaSession lifecycle. If your app handles these events correctly, playback stops when the HDMI device is powered off.
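
Handling these events generally means implementing your session's transport callbacks so that they actually control your player. The following is a minimal sketch, assuming a platform MediaSession and a hypothetical player object; the same idea applies if you use MediaSessionCompat:

Kotlin

// `player` is a hypothetical media player owned by your playback service.
private val sessionCallback = object : MediaSession.Callback() {
    override fun onPause() {
        // Pause the player and update the session's playback state.
        player.pause()
    }

    override fun onStop() {
        // Stop the player and release resources as appropriate.
        player.stop()
    }
}

// In your service's onCreate():
// mediaSession.setCallback(sessionCallback)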

Input & accessibility

Keyboard layouts API

From Android 13 onwards, you can determine keyboard layouts using getKeyCodeForKeyLocation(). For example, suppose your game supports movement using the WASD keys; this may not work correctly on an AZERTY keyboard, which has the A and W keys in different locations. You can get the key codes for the keys you expect at certain positions:

Kotlin

val inputManager: InputManager? = requireActivity().getSystemService()

inputManager?.inputDeviceIds?.map { inputManager.getInputDevice(it) }
    ?.firstOrNull { it.keyboardType == InputDevice.KEYBOARD_TYPE_ALPHABETIC }
    ?.let { inputDevice ->
        keyUp = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_W)
        keyLeft = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_A)
        keyDown = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_S)
        keyRight = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_D)
    }

Java

InputManager inputManager = requireActivity().getSystemService(InputManager.class);
InputDevice inputDevice = Arrays.stream(inputManager.getInputDeviceIds())
        .mapToObj(inputManager::getInputDevice)
        .filter(Objects::nonNull)
        .filter(device -> device.getKeyboardType() == InputDevice.KEYBOARD_TYPE_ALPHABETIC)
        .findFirst()
        .orElse(null);
if (inputDevice != null) {
    keyUp = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_W);
    keyLeft = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_A);
    keyDown = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_S);
    keyRight = inputDevice.getKeyCodeForKeyLocation(KeyEvent.KEYCODE_D);
}

In this example, with an AZERTY keyboard, keyUp is set to KeyEvent.KEYCODE_Z, keyLeft is set to KeyEvent.KEYCODE_Q, while keyDown and keyRight are set to KeyEvent.KEYCODE_S and KeyEvent.KEYCODE_D respectively. You can now create key event handlers for these key codes and implement the expected behavior.
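
As a rough illustration, a game Activity might then dispatch on the mapped key codes in its onKeyDown() handler; moveUp(), moveLeft(), moveDown(), and moveRight() are hypothetical game actions:

Kotlin

override fun onKeyDown(keyCode: Int, event: KeyEvent?): Boolean {
    return when (keyCode) {
        keyUp -> { moveUp(); true }
        keyLeft -> { moveLeft(); true }
        keyDown -> { moveDown(); true }
        keyRight -> { moveRight(); true }
        else -> super.onKeyDown(keyCode, event)
    }
}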

Audio descriptions

Android 13 introduces a new system-wide accessibility preference that allows users to enable audio descriptions across all apps. Android TV apps can follow the user's preference by querying it with isAudioDescriptionRequested():

Kotlin


private lateinit var accessibilityManager: AccessibilityManager

// In onCreate():
accessibilityManager = getSystemService(AccessibilityManager::class.java)

// Where your media player is initialized
if (accessibilityManager.isAudioDescriptionRequested) {
    // User has requested to enable audio descriptions
}

Java


private AccessibilityManager accessibilityManager;

// In onCreate():
accessibilityManager = getSystemService(AccessibilityManager.class);

// Where your media player is initialized
if (accessibilityManager.isAudioDescriptionRequested()) {
    // User has requested to enable audio descriptions
}

Android TV apps can monitor changes to the user's preference by adding a listener to AccessibilityManager:

Kotlin

private val listener =
    AccessibilityManager.AudioDescriptionRequestedChangeListener { enabled ->
        // Preference changed; reflect its state in your media player
    }

override fun onStart() {
    super.onStart()

    accessibilityManager.addAudioDescriptionRequestedChangeListener(mainExecutor, listener)
}

override fun onStop() {
    super.onStop()

    accessibilityManager.removeAudioDescriptionRequestedChangeListener(listener)
}

Java

private AccessibilityManager.AudioDescriptionRequestedChangeListener listener = enabled -> {
    // Preference changed; reflect its state in your media player
};

@Override
protected void onStart() {
    super.onStart();

    accessibilityManager.addAudioDescriptionRequestedChangeListener(getMainExecutor(), listener);
}

@Override
protected void onStop() {
    super.onStop();

    accessibilityManager.removeAudioDescriptionRequestedChangeListener(listener);
}

Multitasking

Picture-in-picture improvements

Android 13 introduces enhancements to the picture-in-picture (PiP) APIs to allow for multitasking. While PiP support was introduced in Android 8.0 (API level 26), it was not widely supported on Android TV and was not supported at all on Google TV prior to Android 13. Multitasking for TV uses picture-in-picture mode to allow two separate apps to coexist on the screen: one taking up most of the screen, with a second running in picture-in-picture mode. There are different requirements for apps running in each of these modes.

The default behavior is that the PiP app overlays the full-screen app. This is much the same as standard Android picture-in-picture behavior. However, for TV, the PiP app can provide an expanded mode.

The best way to work effectively with PiP is to make your activities resizable. While TV screens have a consistent aspect ratio, your Activity may be given a different aspect ratio when picture-in-picture mode is active. Design your layouts with this in mind.

Run your app in picture-in-picture mode

To put your app into picture-in-picture mode, call enterPictureInPictureMode(). However, you should always check whether picture-in-picture mode is supported on the device. TV devices running a version of Android earlier than Android 13 do not support picture-in-picture mode.

Here is an example of how to add a picture-in-picture action to the playback transport controls in a video playback app:

Kotlin

class MyPlaybackTransportControlGlue<T : MediaPlayerAdapter>(context: Context, adapter: T)
    : PlaybackTransportControlGlue<T>(context, adapter) {

    private lateinit var pictureInPictureAction: PlaybackControlsRow.PictureInPictureAction

    override fun onCreatePrimaryActions(primaryActionsAdapter: ArrayObjectAdapter) {
        super.onCreatePrimaryActions(primaryActionsAdapter)
        if (context.packageManager.hasSystemFeature(FEATURE_PICTURE_IN_PICTURE)) {
            pictureInPictureAction = PlaybackControlsRow.PictureInPictureAction(context)
            primaryActionsAdapter.add(pictureInPictureAction)
        }
    }

    override fun onActionClicked(action: Action) {
        when (action) {
            pictureInPictureAction -> {
                val aspectRatio = playerAdapter.mediaPlayer.let { mediaPlayer ->
                    Rational(mediaPlayer.videoWidth, mediaPlayer.videoHeight)
                }
                val params = PictureInPictureParams.Builder()
                    .setAspectRatio(aspectRatio)
                    .build()
                val result = (context as? Activity)?.enterPictureInPictureMode(params) ?: false
                // result is true if picture-in-picture mode was entered successfully
            }
            else -> super.onActionClicked(action)
        }
    }
}

Java

public class MyPlaybackTransportControlGlueJava<T extends MediaPlayerAdapter>
    extends PlaybackTransportControlGlue<T> {

    private PlaybackControlsRow.PictureInPictureAction pictureInPictureAction;

    MyPlaybackTransportControlGlueJava(Activity activity, T adapter) {
        super(activity, adapter);
    }

    @Override
    protected void onCreatePrimaryActions(ArrayObjectAdapter primaryActionsAdapter) {
        super.onCreatePrimaryActions(primaryActionsAdapter);

        Context context = getContext();

        if (context.getPackageManager().hasSystemFeature(FEATURE_PICTURE_IN_PICTURE)) {
            pictureInPictureAction = new PlaybackControlsRow.PictureInPictureAction(context);
            primaryActionsAdapter.add(pictureInPictureAction);
        }
    }

    @Override
    public void onActionClicked(Action action) {
        if (action == pictureInPictureAction) {
            Rational aspectRatio = new Rational(
                    getPlayerAdapter().getMediaPlayer().getVideoWidth(),
                    getPlayerAdapter().getMediaPlayer().getVideoHeight()
            );
            PictureInPictureParams params = new PictureInPictureParams.Builder()
                    .setAspectRatio(aspectRatio)
                    .build();
            Activity activity = (Activity) getContext();
            boolean result = activity.enterPictureInPictureMode(params);
            // result is true if picture-in-picture mode was entered successfully
        } else {
            super.onActionClicked(action);
        }
    }
}

Note that the action is added only if the device has the FEATURE_PICTURE_IN_PICTURE system feature. Also, when the action is triggered, the aspect ratio of the picture-in-picture window is set to match the aspect ratio of the video being played.

Expanded PiP mode

Many of the concepts covered in the main picture-in-picture documentation are applicable to picture-in-picture on TV. By default, the PiP app overlays the full-screen app in a small window. However, there are some concepts that are more applicable to TV. For example, TVs have a larger screen and can provide an expanded PiP mode. Expanded PiP mode is available on Android 13 and higher if the device has the system feature FEATURE_EXPANDED_PICTURE_IN_PICTURE.

You should provide aspect ratios for both normal PiP mode and expanded PiP mode, since the user may switch to either one. For example, in a video conferencing app you might show the active speaker when the app is in normal picture-in-picture mode, and a participant list when it is in expanded picture-in-picture mode:

Kotlin

val aspectRatio = playerAdapter.mediaPlayer.let { mediaPlayer ->
    Rational(mediaPlayer.videoWidth, mediaPlayer.videoHeight)
}
val params = PictureInPictureParams.Builder()
    .setAspectRatio(aspectRatio)
    .apply {
        if (context.packageManager.hasSystemFeature(FEATURE_EXPANDED_PICTURE_IN_PICTURE)) {
            setExpandedAspectRatio(Rational(1, participants.size))
        }
    }
    .build()

Java

Rational aspectRatio = new Rational(
    getPlayerAdapter().getMediaPlayer().getVideoWidth(),
    getPlayerAdapter().getMediaPlayer().getVideoHeight()
);
PictureInPictureParams.Builder builder = new PictureInPictureParams.Builder()
    .setAspectRatio(aspectRatio);
if (getContext().getPackageManager().hasSystemFeature(FEATURE_EXPANDED_PICTURE_IN_PICTURE)) {
    builder.setExpandedAspectRatio(new Rational(1, participants.size()));
}
PictureInPictureParams params = builder.build();

It may be necessary to know when your Activity switches in and out of picture-in-picture mode. Change your UI in the onPictureInPictureModeChanged() callback. For example, you can hide the transport controls when a video playback Activity enters picture-in-picture mode:

Kotlin

override fun onPictureInPictureModeChanged(
        isInPictureInPictureMode: Boolean,
        newConfig: Configuration
) {
    if (isInPictureInPictureMode) {
        // Hide the full-screen UI (controls, etc.) while in picture-in-picture mode.
    } else {
        // Restore the full-screen UI.
    }
}

Java

@Override
public void onPictureInPictureModeChanged(
        boolean isInPictureInPictureMode,
        Configuration newConfig
) {
    if (isInPictureInPictureMode) {
        // Hide the full-screen UI (controls, etc.) while in picture-in-picture mode.
    } else {
        // Restore the full-screen UI.
        ...
    }
}

The best way to handle the user switching between normal and expanded PiP modes is to implement responsive layouts correctly, so that you can offer a suitable layout for each PiP mode. If needed, you can also determine the size of the window hosting your app using getCurrentWindowMetrics().
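
For instance, here is a rough sketch of reading the current window bounds from inside an Activity on API level 30 or higher; chooseLayoutFor() is a hypothetical helper that picks a layout for the given size:

Kotlin

// Available on API level 30 and higher.
val bounds = windowManager.currentWindowMetrics.bounds

// chooseLayoutFor() is a hypothetical helper that selects a layout
// appropriate for the current window size.
chooseLayoutFor(bounds.width(), bounds.height())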

Coexist with expanded PiP

Whenever an app is running in expanded PiP mode, it affects the app that is running full-screen. The full-screen app can choose to implement one of two behaviors for this:

Docked mode
The two apps are docked in separate areas of the screen. The PiP window is docked on one of the screen edges and the fullscreen app is resized to occupy all the space next to it.
Overlay mode
The PiP app overlays a section of the fullscreen app.
Docked mode

The default behavior is docked mode. The exact behavior depends on whether the Activity of the fullscreen app is resizable.

If the Activity is not resizable, then it is scaled down while maintaining its aspect ratio, using size compatibility mode. The system may apply borders to your app so that it keeps a consistent aspect ratio.

Docked mode non-resizable Activity

If the Activity is resizable, then its aspect ratio changes to make room for the PiP app. Your app gets all of the available space without any borders applied. If you have implemented responsive layouts, this should work seamlessly.

Docked mode resizable Activity

Overlay mode

Some apps have user experiences that don't work well in docked mode. For example, browsing activities such as the home screen or a content browser generally display a lot of information and may not scale well when docked. Instead, they may be better suited to having the PiP window overlay part of the screen. You can disable docked mode by calling setShouldDockBigOverlays():

Kotlin


requireActivity().setShouldDockBigOverlays(false)

Java


requireActivity().setShouldDockBigOverlays(false);

The playback Activity fills the entire screen, with the PiP app overlaying it in normal PiP mode:

Overlay mode

Keep-clear APIs

In some cases, the PiP app may overlay important UI components within the fullscreen app. To mitigate this, keep-clear APIs let apps identify critical UI components that shouldn't be overlaid. Using these APIs doesn't guarantee that those components are never overlaid, but the system attempts to honor these requests.


In your XML layout, you can specify that a view shouldn't be overlaid using preferKeepClear:

<TextView
    android:id="@+id/important_text"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:preferKeepClear="true"
    android:text="@string/app_name"/>

You can also do this programmatically using setPreferKeepClear():

Kotlin

private lateinit var binding: MyLayoutBinding

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    binding = MyLayoutBinding.inflate(layoutInflater)
    setContentView(binding.root)
    binding.importantText.isPreferKeepClear = true
}

Java

private MyLayoutBinding binding;

@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    binding = MyLayoutBinding.inflate(getLayoutInflater());
    setContentView(binding.getRoot());
    binding.importantText.setPreferKeepClear(true);
}

There may be times when you don't need to keep an entire View clear, but only a section of it. You can use setPreferKeepClearRects() to specify regions of a View that shouldn't be overlaid. UIs that don't use Views natively, such as Flutter, Jetpack Compose, and WebView, may have subsections that need keep-clear functionality. This API can be used in those cases.
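
For example, here is a rough sketch of keeping only part of a view clear; webView is a hypothetical view hosting non-View content, and the coordinates are purely illustrative and expressed in the view's own coordinate space:

Kotlin

// The rect is relative to the view's own coordinates; the values
// here are illustrative only.
val importantRegion = Rect(0, 0, 300, 100)
webView.setPreferKeepClearRects(listOf(importantRegion))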

Validation Tools

  • Media Controller Test allows you to test the intricacies of media playback on Android and helps verify your media session implementation.
  • MediaSession Validator provides an easy and automated way to verify your media session integration on Android TV.