Build accessibility services

An accessibility service is an application that provides user interface enhancements to assist users with disabilities, or who may temporarily be unable to fully interact with a device. For example, users who are driving, taking care of a young child or attending a very loud party might need additional or alternative interface feedback.

Android provides standard accessibility services, including TalkBack, and developers can create and distribute their own services. This document explains the basics of building an accessibility service.

Note: Your app should use platform-level accessibility services only for the purpose of helping users with disabilities interact with your app.

The ability to build and deploy accessibility services was introduced with Android 1.6 (API Level 4) and received significant improvements with Android 4.0 (API Level 14). The Android Support Library was also updated with the release of Android 4.0 to provide support for these enhanced accessibility features back to Android 1.6. Developers aiming for widely compatible accessibility services are encouraged to use the Support Library and develop for the more advanced accessibility features introduced in Android 4.0.

Manifest declarations and permissions

Applications that provide accessibility services must include specific declarations in their application manifests to be treated as an accessibility service by the Android system. This section explains the required and optional settings for accessibility services.

Accessibility service declaration

In order to be treated as an accessibility service, your app must include a service element (rather than an activity element) within the application element of its manifest. In addition, within the service element, you must include an accessibility service intent filter. For compatibility with Android 4.1 and higher, the manifest must also protect the service with the BIND_ACCESSIBILITY_SERVICE permission so that only the system can bind to it. Here's an example:

  <application>
    <service android:name=".MyAccessibilityService"
        android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE"
        android:label="@string/accessibility_service_label">
      <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
      </intent-filter>
    </service>
  </application>

These declarations are required for all accessibility services deployed on Android 1.6 (API Level 4) or higher.

Accessibility service configuration

Accessibility services must also provide a configuration which specifies the types of accessibility events that the service handles and additional information about the service. The configuration of an accessibility service is contained in the AccessibilityServiceInfo class. Your service can build and set a configuration using an instance of this class and setServiceInfo() at runtime. However, not all configuration options are available using this method.
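
For example, a service could apply such a configuration from onServiceConnected(). The following Kotlin sketch is illustrative only; the event types, package name, feedback type, and timeout mirror the values used in the XML example later in this section:

Kotlin

import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.AccessibilityServiceInfo
import android.view.accessibility.AccessibilityEvent

class MyAccessibilityService : AccessibilityService() {

    override fun onServiceConnected() {
        // Build a configuration at runtime and apply it. Assigning to the
        // serviceInfo property calls setServiceInfo().
        serviceInfo = AccessibilityServiceInfo().apply {
            eventTypes = AccessibilityEvent.TYPE_VIEW_CLICKED or
                    AccessibilityEvent.TYPE_VIEW_FOCUSED
            packageNames = arrayOf("com.example.android.apis")
            feedbackType = AccessibilityServiceInfo.FEEDBACK_SPOKEN
            notificationTimeout = 100L
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) { /* ... */ }

    override fun onInterrupt() { /* ... */ }
}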

Beginning with Android 4.0, you can include a <meta-data> element in your manifest with a reference to a configuration file, which allows you to set the full range of options for your accessibility service, as shown in the following example:

<service android:name=".MyAccessibilityService">
  ...
  <meta-data
    android:name="android.accessibilityservice"
    android:resource="@xml/accessibility_service_config" />
</service>

This meta-data element refers to an XML file that you create in your application's resource directory (<project_dir>/res/xml/accessibility_service_config.xml). The following code shows example contents for the service configuration file:

<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:description="@string/accessibility_service_description"
    android:packageNames="com.example.android.apis"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFlags="flagDefault"
    android:accessibilityFeedbackType="feedbackSpoken"
    android:notificationTimeout="100"
    android:canRetrieveWindowContent="true"
    android:settingsActivity="com.example.android.accessibility.ServiceSettingsActivity"
/>

For more information about the XML attributes that can be used in the accessibility service configuration file, and about which configuration settings can be set dynamically at runtime, see the AccessibilityServiceInfo reference documentation.

Accessibility service methods

An accessibility service must extend the AccessibilityService class and override the following methods from that class. These methods are presented in the order in which the Android system calls them: when the service is started (onServiceConnected()), while it is running (onAccessibilityEvent(), onInterrupt()), and when it is shut down (onUnbind()).

  • onServiceConnected() - (optional) The system calls this method when it successfully connects to your accessibility service. Use this method to do any one-time setup steps for your service, including connecting to user feedback system services, such as the audio manager or device vibrator. If you want to set the configuration of your service at runtime or make one-time adjustments, this is a convenient location from which to call setServiceInfo().
  • onAccessibilityEvent() - (required) The system calls this method when it detects an AccessibilityEvent that matches the event filtering parameters specified by your accessibility service, such as when the user clicks a button or focuses on a user interface control in an application for which your service is providing feedback. The system passes the associated AccessibilityEvent, which the service can then interpret and use to provide feedback to the user. This method may be called many times over the lifecycle of your service.
  • onInterrupt() - (required) This method is called when the system wants to interrupt the feedback your service is providing, usually in response to a user action such as moving focus to a different control. This method may be called many times over the lifecycle of your service.
  • onUnbind() - (optional) This method is called when the system is about to shut down the accessibility service. Use this method to do any one-time shutdown procedures, including deallocating user feedback system services, such as the audio manager or device vibrator.

These callback methods provide the basic structure for your accessibility service. It is up to you to decide on how to process data provided by the Android system in the form of AccessibilityEvent objects and provide feedback to the user. For more information about getting information from an accessibility event, see the Implementing Accessibility training.
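
To make that structure concrete, here is a minimal Kotlin sketch of a service that overrides all four callbacks. The vibrator lookup and the logging are illustrative placeholders only, not part of any required implementation:

Kotlin

import android.accessibilityservice.AccessibilityService
import android.content.Intent
import android.os.Vibrator
import android.util.Log
import android.view.accessibility.AccessibilityEvent

class MyAccessibilityService : AccessibilityService() {

    private var vibrator: Vibrator? = null

    override fun onServiceConnected() {
        // One-time setup, such as connecting to user feedback system services.
        vibrator = getSystemService(VIBRATOR_SERVICE) as? Vibrator
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // Interpret the event and provide feedback to the user.
        Log.d("MY_APP_TAG", "Received event of type ${event.eventType}")
    }

    override fun onInterrupt() {
        // Stop any feedback that is currently in progress.
        vibrator?.cancel()
    }

    override fun onUnbind(intent: Intent?): Boolean {
        // One-time shutdown procedures, such as releasing feedback services.
        vibrator = null
        return super.onUnbind(intent)
    }
}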

Register for accessibility events

One of the most important functions of the accessibility service configuration parameters is to let you specify the types of accessibility events your service can handle. Being able to specify this information enables accessibility services to cooperate with each other, and gives you the flexibility to handle only specific event types from specific applications. The event filtering can include the following criteria:

  • Package Names - Specify the package names of applications whose accessibility events you want your service to handle. If this parameter is omitted, your accessibility service is considered available to service accessibility events for any application. This parameter can be set in the accessibility service configuration file with the android:packageNames attribute as a comma-separated list, or set using the AccessibilityServiceInfo.packageNames member.
  • Event Types - Specify the types of accessibility events you want your service to handle. This parameter can be set in the accessibility service configuration file with the android:accessibilityEventTypes attribute as a list separated by the | character (for example, accessibilityEventTypes="typeViewClicked|typeViewFocused"), or set using the AccessibilityServiceInfo.eventTypes member.

When setting up your accessibility service, carefully consider which events your service is able to handle, and register only for those events. Since users can activate more than one accessibility service at a time, your service must not consume events that it cannot handle. Remember that other services may handle those events to improve the user's experience.

Note: The Android framework dispatches accessibility events to more than one accessibility service if the services provide different feedback types. However, if two or more services provide the same feedback type, then only the first registered service receives the event.

Accessibility volume

Devices running Android 8.0 (API level 26) and higher include the STREAM_ACCESSIBILITY volume category, which allows you to control the volume of your accessibility service's audio output independently of other sounds on the device.

Accessibility services can use this stream type by setting the FLAG_ENABLE_ACCESSIBILITY_VOLUME option. You can then change the device's accessibility audio volume by calling the adjustStreamVolume() method on the device's instance of AudioManager.

The following code snippet demonstrates how an accessibility service can use the STREAM_ACCESSIBILITY volume category:

Kotlin

import android.accessibilityservice.AccessibilityService
import android.media.AudioManager
import android.view.accessibility.AccessibilityEvent

class MyAccessibilityService : AccessibilityService() {

    // Initialize lazily so the service context is attached before
    // getSystemService() is called.
    private val mAudioManager by lazy { getSystemService(AUDIO_SERVICE) as AudioManager }

    override fun onAccessibilityEvent(accessibilityEvent: AccessibilityEvent) {
        if (accessibilityEvent.source?.text?.toString() == "Increase volume") {
            mAudioManager.adjustStreamVolume(
                    AudioManager.STREAM_ACCESSIBILITY, AudioManager.ADJUST_RAISE, 0)
        }
    }

    override fun onInterrupt() {
        // No continuous feedback to interrupt in this example.
    }
}

Java

import android.accessibilityservice.AccessibilityService;
import android.media.AudioManager;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class MyAccessibilityService extends AccessibilityService {
    private AudioManager mAudioManager;

    @Override
    protected void onServiceConnected() {
        // Look up the AudioManager once the service is connected and a
        // valid context is available.
        mAudioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent accessibilityEvent) {
        AccessibilityNodeInfo interactedNodeInfo = accessibilityEvent.getSource();
        if (interactedNodeInfo != null && interactedNodeInfo.getText() != null
                && "Increase volume".contentEquals(interactedNodeInfo.getText())) {
            mAudioManager.adjustStreamVolume(AudioManager.STREAM_ACCESSIBILITY,
                    AudioManager.ADJUST_RAISE, 0);
        }
    }

    @Override
    public void onInterrupt() {
        // No continuous feedback to interrupt in this example.
    }
}

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 6:35.

Accessibility shortcut

On devices running Android 8.0 (API level 26) and higher, users can enable and disable their preferred accessibility service from any screen by long-pressing both volume keys at the same time. Although this shortcut enables and disables TalkBack by default, a user can configure the shortcut to enable and disable any service installed on their device, including your own.

In order for users to access a particular accessibility service from the accessibility shortcut, the service needs to request the feature at runtime, when it starts.

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 13:25.

Accessibility button

On devices that use a software-rendered navigation area and are running Android 8.0 (API level 26) or higher, the right-hand side of the navigation bar includes an accessibility button. When users press this button, they can invoke one of several enabled accessibility features and services, depending on the content currently shown on the screen.

To allow users to invoke a given accessibility service using the accessibility button, the service needs to add the FLAG_REQUEST_ACCESSIBILITY_BUTTON flag in an AccessibilityServiceInfo object's android:accessibilityFlags attribute. The service can then register callbacks using registerAccessibilityButtonCallback().

Note: This feature is available only on devices that provide a software-rendered navigation area. Services must always use isAccessibilityButtonAvailable() and respond to changes based on the availability of the accessibility button by implementing onAvailabilityChanged(). That way, users can always access the service's functionality, even if the accessibility button isn't supported or becomes unavailable.

The following code snippet demonstrates how you can configure an accessibility service to respond to the user's pressing the accessibility button:

Kotlin

private var mAccessibilityButtonController: AccessibilityButtonController? = null
private var mAccessibilityButtonCallback:
        AccessibilityButtonController.AccessibilityButtonCallback? = null
private var mIsAccessibilityButtonAvailable: Boolean = false

override fun onServiceConnected() {
    mAccessibilityButtonController = accessibilityButtonController
    mIsAccessibilityButtonAvailable =
            mAccessibilityButtonController?.isAccessibilityButtonAvailable ?: false

    if (!mIsAccessibilityButtonAvailable) return

    serviceInfo = serviceInfo.apply {
        flags = flags or AccessibilityServiceInfo.FLAG_REQUEST_ACCESSIBILITY_BUTTON
    }

    mAccessibilityButtonCallback =
        object : AccessibilityButtonController.AccessibilityButtonCallback() {
            override fun onClicked(controller: AccessibilityButtonController) {
                Log.d("MY_APP_TAG", "Accessibility button pressed!")

                // Add custom logic for a service to react to the
                // accessibility button being pressed.
            }

            override fun onAvailabilityChanged(
                    controller: AccessibilityButtonController,
                    available: Boolean
            ) {
                if (controller == mAccessibilityButtonController) {
                    mIsAccessibilityButtonAvailable = available
                }
            }
        }

    mAccessibilityButtonCallback?.also {
        mAccessibilityButtonController?.registerAccessibilityButtonCallback(it, null)
    }
}

Java

private AccessibilityButtonController mAccessibilityButtonController;
private AccessibilityButtonController
        .AccessibilityButtonCallback mAccessibilityButtonCallback;
private boolean mIsAccessibilityButtonAvailable;

@Override
protected void onServiceConnected() {
    mAccessibilityButtonController = getAccessibilityButtonController();
    mIsAccessibilityButtonAvailable =
            mAccessibilityButtonController.isAccessibilityButtonAvailable();

    if (!mIsAccessibilityButtonAvailable) {
        return;
    }

    AccessibilityServiceInfo serviceInfo = getServiceInfo();
    serviceInfo.flags
            |= AccessibilityServiceInfo.FLAG_REQUEST_ACCESSIBILITY_BUTTON;
    setServiceInfo(serviceInfo);

    mAccessibilityButtonCallback =
        new AccessibilityButtonController.AccessibilityButtonCallback() {
            @Override
            public void onClicked(AccessibilityButtonController controller) {
                Log.d("MY_APP_TAG", "Accessibility button pressed!");

                // Add custom logic for a service to react to the
                // accessibility button being pressed.
            }

            @Override
            public void onAvailabilityChanged(
              AccessibilityButtonController controller, boolean available) {
                if (controller.equals(mAccessibilityButtonController)) {
                    mIsAccessibilityButtonAvailable = available;
                }
            }
        };

    if (mAccessibilityButtonCallback != null) {
        mAccessibilityButtonController.registerAccessibilityButtonCallback(
                mAccessibilityButtonCallback, null);
    }
}

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 16:28.

Fingerprint gestures

Accessibility services on devices running Android 8.0 (API level 26) or higher can respond to an alternative input mechanism: directional swipes (up, down, left, and right) along a device's fingerprint sensor. To configure a service to receive callbacks about these interactions, complete the following sequence of steps:

  1. Declare the USE_FINGERPRINT permission and the CAPABILITY_CAN_REQUEST_FINGERPRINT_GESTURES capability.
  2. Set the FLAG_REQUEST_FINGERPRINT_GESTURES flag within the android:accessibilityFlags attribute.
  3. Register for callbacks using registerFingerprintGestureCallback().

Note: You should allow users to disable an accessibility service's support for fingerprint gestures. Although multiple accessibility services can listen for fingerprint gestures simultaneously, doing so causes the services to conflict with each other.

Keep in mind that not all devices include fingerprint sensors. To identify whether a device supports the sensor, use the isHardwareDetected() method. Even on a device that includes a fingerprint sensor, your service cannot use the sensor when it's in use for authentication purposes. To identify when the sensor is available, call the isGestureDetectionAvailable() method and implement the onGestureDetectionAvailabilityChanged() callback.

The following code snippet shows an example of using fingerprint gestures to navigate around a virtual game board:

AndroidManifest.xml

<manifest ... >
    <uses-permission android:name="android.permission.USE_FINGERPRINT" />
    ...
    <application>
        <service android:name="com.example.MyFingerprintGestureService" ... >
            <meta-data
                android:name="android.accessibilityservice"
                android:resource="@xml/myfingerprintgestureservice" />
        </service>
    </application>
</manifest>

myfingerprintgestureservice.xml

<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    ...
    android:accessibilityFlags=" ... |flagRequestFingerprintGestures"
    android:canRequestFingerprintGestures="true"
    ... />

MyFingerprintGestureService.java

Kotlin

import android.accessibilityservice.FingerprintGestureController.*

class MyFingerprintGestureService : AccessibilityService() {

    private var mGestureController: FingerprintGestureController? = null
    private var mFingerprintGestureCallback:
            FingerprintGestureController.FingerprintGestureCallback? = null
    private var mIsGestureDetectionAvailable: Boolean = false

    override fun onCreate() {
        mGestureController = fingerprintGestureController
        mIsGestureDetectionAvailable = mGestureController?.isGestureDetectionAvailable ?: false
    }

    override fun onServiceConnected() {
        if (mFingerprintGestureCallback != null || !mIsGestureDetectionAvailable) return

        mFingerprintGestureCallback =
                object : FingerprintGestureController.FingerprintGestureCallback() {
                    override fun onGestureDetected(gesture: Int) {
                        when (gesture) {
                            FINGERPRINT_GESTURE_SWIPE_DOWN -> moveGameCursorDown()
                            FINGERPRINT_GESTURE_SWIPE_LEFT -> moveGameCursorLeft()
                            FINGERPRINT_GESTURE_SWIPE_RIGHT -> moveGameCursorRight()
                            FINGERPRINT_GESTURE_SWIPE_UP -> moveGameCursorUp()
                            else -> Log.e(MY_APP_TAG, "Error: Unknown gesture type detected!")
                        }
                    }

                    override fun onGestureDetectionAvailabilityChanged(available: Boolean) {
                        mIsGestureDetectionAvailable = available
                    }
                }

        mFingerprintGestureCallback?.also {
            mGestureController?.registerFingerprintGestureCallback(it, null)
        }
    }
}

Java

import static android.accessibilityservice.FingerprintGestureController.*;

public class MyFingerprintGestureService extends AccessibilityService {
    private FingerprintGestureController mGestureController;
    private FingerprintGestureController
            .FingerprintGestureCallback mFingerprintGestureCallback;
    private boolean mIsGestureDetectionAvailable;

    @Override
    public void onCreate() {
        mGestureController = getFingerprintGestureController();
        mIsGestureDetectionAvailable =
                mGestureController.isGestureDetectionAvailable();
    }

    @Override
    protected void onServiceConnected() {
        if (mFingerprintGestureCallback != null
                || !mIsGestureDetectionAvailable) {
            return;
        }

        mFingerprintGestureCallback =
               new FingerprintGestureController.FingerprintGestureCallback() {
            @Override
            public void onGestureDetected(int gesture) {
                switch (gesture) {
                    case FINGERPRINT_GESTURE_SWIPE_DOWN:
                        moveGameCursorDown();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_LEFT:
                        moveGameCursorLeft();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_RIGHT:
                        moveGameCursorRight();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_UP:
                        moveGameCursorUp();
                        break;
                    default:
                        Log.e(MY_APP_TAG,
                                  "Error: Unknown gesture type detected!");
                        break;
                }
            }

            @Override
            public void onGestureDetectionAvailabilityChanged(boolean available) {
                mIsGestureDetectionAvailable = available;
            }
        };

        if (mFingerprintGestureCallback != null) {
            mGestureController.registerFingerprintGestureCallback(
                    mFingerprintGestureCallback, null);
        }
    }
}

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 9:03.

Multilingual text to speech

As of Android 8.0 (API level 26), Android's text-to-speech (TTS) service can identify and speak phrases in multiple languages within a single block of text. To enable this automatic language-switching capability in an accessibility service, wrap all strings in LocaleSpan objects, as shown in the following code snippet:

Kotlin

val localeWrappedTextView = findViewById<TextView>(R.id.my_french_greeting_text).apply {
    text = wrapTextInLocaleSpan("Bonjour!", Locale.FRANCE)
}

private fun wrapTextInLocaleSpan(originalText: CharSequence, loc: Locale): SpannableStringBuilder {
    return SpannableStringBuilder(originalText).apply {
        // setSpan() treats the end index as exclusive, so use the full
        // length to cover the entire string.
        setSpan(LocaleSpan(loc), 0, originalText.length, 0)
    }
}

Java

TextView localeWrappedTextView = findViewById(R.id.my_french_greeting_text);
localeWrappedTextView.setText(wrapTextInLocaleSpan("Bonjour!", Locale.FRANCE));

private SpannableStringBuilder wrapTextInLocaleSpan(
        CharSequence originalText, Locale loc) {
    SpannableStringBuilder myLocaleBuilder =
            new SpannableStringBuilder(originalText);
    // setSpan() treats the end index as exclusive, so use the full length
    // to cover the entire string.
    myLocaleBuilder.setSpan(new LocaleSpan(loc), 0,
            originalText.length(), 0);
    return myLocaleBuilder;
}

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 10:59.

Take action for users

Starting with Android 4.0 (API Level 14), accessibility services can act on behalf of users, including changing the input focus and selecting (activating) user interface elements. In Android 4.1 (API Level 16), the range of actions was expanded to include scrolling lists and interacting with text fields. Accessibility services can also take global actions, such as navigating to the Home screen, pressing the Back button, and opening the notifications screen and recent applications list. Android 4.1 also introduces a new type of focus, Accessibility Focus, which makes all visible elements selectable by an accessibility service.

These new capabilities make it possible for developers of accessibility services to create alternative navigation modes such as gesture navigation, and give users with disabilities improved control of their Android devices.

Listen for gestures

Accessibility services can listen for specific gestures and respond by taking action on behalf of a user. This feature, added in Android 4.1 (API Level 16), requires that your accessibility service request activation of the Explore by Touch feature. Your service can request this activation by setting the flags member of the service's AccessibilityServiceInfo instance to FLAG_REQUEST_TOUCH_EXPLORATION_MODE, as shown in the following example.

Kotlin

class MyAccessibilityService : AccessibilityService() {

    override fun onServiceConnected() {
        // Add the touch exploration flag to the existing configuration and
        // apply it by reassigning serviceInfo, which calls setServiceInfo().
        serviceInfo = serviceInfo.apply {
            flags = flags or AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE
        }
    }
    ...
}

Java

public class MyAccessibilityService extends AccessibilityService {
    @Override
    protected void onServiceConnected() {
        // Add the touch exploration flag to the existing configuration and
        // apply it with setServiceInfo().
        AccessibilityServiceInfo info = getServiceInfo();
        info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
        setServiceInfo(info);
    }
    ...
}

Once your service has requested activation of Explore by Touch, the user must allow the feature to be turned on, if it is not already active. When this feature is active, your service receives notification of accessibility gestures through your service's onGesture() callback method and can respond by taking actions for the user.
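
For instance, a service might map an accessibility gesture to a global action. The following Kotlin sketch is illustrative only; mapping a left swipe to the Back action and an up swipe to the Home action is an arbitrary example, not a recommended scheme:

Kotlin

import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

class MyAccessibilityService : AccessibilityService() {

    override fun onGesture(gestureId: Int): Boolean {
        return when (gestureId) {
            GESTURE_SWIPE_LEFT ->
                // Example mapping: a left swipe triggers the global Back action.
                performGlobalAction(GLOBAL_ACTION_BACK)
            GESTURE_SWIPE_UP ->
                // Example mapping: an up swipe navigates to the Home screen.
                performGlobalAction(GLOBAL_ACTION_HOME)
            else -> {
                Log.d("MY_APP_TAG", "Unhandled gesture: $gestureId")
                false
            }
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) { /* ... */ }

    override fun onInterrupt() { /* ... */ }
}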

Continued gestures

Devices running Android 8.0 (API level 26) and higher include support for continued gestures, or programmatic gestures containing more than one Path object.

When specifying sequences of strokes, you must specify that they belong to the same programmatic gesture by using the final argument, willContinue, in the GestureDescription.StrokeDescription constructor, as shown in the following code snippet:

Kotlin

// Simulates an L-shaped drag path: 200 pixels right, then 200 pixels down.
private fun doRightThenDownDrag() {
    val dragRightPath = Path().apply {
        moveTo(200f, 200f)
        lineTo(400f, 200f)
    }
    val dragRightDuration = 500L // 0.5 second

    // The starting point of the second path must match
    // the ending point of the first path.
    val dragDownPath = Path().apply {
        moveTo(400f, 200f)
        lineTo(400f, 400f)
    }
    val dragDownDuration = 500L

    // willContinue = true marks this stroke as the beginning of a
    // continued gesture.
    val dragRightStroke = GestureDescription.StrokeDescription(
            dragRightPath,
            0L,
            dragRightDuration,
            true
    )

    // continueStroke() returns a new StrokeDescription for the second
    // segment; willContinue = false ends the gesture.
    val dragDownStroke = dragRightStroke.continueStroke(
            dragDownPath,
            dragRightDuration,
            dragDownDuration,
            false
    )
}

Java

// Simulates an L-shaped drag path: 200 pixels right, then 200 pixels down.
private void doRightThenDownDrag() {
    Path dragRightPath = new Path();
    dragRightPath.moveTo(200, 200);
    dragRightPath.lineTo(400, 200);
    long dragRightDuration = 500L; // 0.5 second

    // The starting point of the second path must match
    // the ending point of the first path.
    Path dragDownPath = new Path();
    dragDownPath.moveTo(400, 200);
    dragDownPath.lineTo(400, 400);
    long dragDownDuration = 500L;

    // willContinue = true marks this stroke as the beginning of a
    // continued gesture.
    GestureDescription.StrokeDescription dragRightStroke =
            new GestureDescription.StrokeDescription(dragRightPath, 0L,
                    dragRightDuration, true);

    // continueStroke() returns a new StrokeDescription for the second
    // segment; willContinue = false ends the gesture.
    GestureDescription.StrokeDescription dragDownStroke =
            dragRightStroke.continueStroke(dragDownPath, dragRightDuration,
                    dragDownDuration, false);
}

For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 15:47.

Use accessibility actions

Accessibility services can take action on behalf of users to make interacting with applications simpler and more productive. The ability of accessibility services to perform actions was added in Android 4.0 (API Level 14) and significantly expanded with Android 4.1 (API Level 16).

In order to take actions on behalf of users, your accessibility service must register to receive events from one or more applications and request permission to view the content of applications by setting android:canRetrieveWindowContent to true in the service configuration file. When your service receives an event, it can retrieve the AccessibilityNodeInfo object from the event using getSource(). With the AccessibilityNodeInfo object, your service can then explore the view hierarchy to determine what action to take and then act for the user using performAction().

Kotlin

class MyAccessibilityService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // get the source node of the event
        event.source?.apply {

            // Use the event and node information to determine
            // what action to take

            // take action on behalf of the user
            performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD)

            // recycle the nodeInfo object
            recycle()
        }
    }
    ...
}

Java

public class MyAccessibilityService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // get the source node of the event
        AccessibilityNodeInfo nodeInfo = event.getSource();

        // Use the event and node information to determine
        // what action to take

        // take action on behalf of the user
        nodeInfo.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);

        // recycle the nodeInfo object
        nodeInfo.recycle();
    }
    ...
}

The performAction() method allows your service to take action within an application. If your service needs to perform a global action such as navigating to the Home screen, pressing the Back button, opening the notifications screen or recent applications list, then use the performGlobalAction() method.

Use focus types

Android 4.1 (API Level 16) introduces a new type of user interface focus called Accessibility Focus. Accessibility services can use this type of focus to select any visible user interface element and act on it. This focus type is different from the better-known Input Focus, which determines which on-screen user interface element receives input when a user types characters, presses Enter on a keyboard, or pushes the center button of a D-pad control.

Accessibility Focus is completely separate and independent from Input Focus. In fact, it is possible for one element in a user interface to have Input Focus while another element has Accessibility Focus. The purpose of Accessibility Focus is to provide accessibility services with a method of interacting with any visible element on a screen, regardless of whether or not the element is input-focusable from a system perspective. You can see accessibility focus in action by testing accessibility gestures. For more information about testing this feature, see Testing gesture navigation.

Note: Accessibility services that use Accessibility Focus are responsible for synchronizing the current Input Focus when an element is capable of this type of focus. Services that do not synchronize Input Focus with Accessibility Focus run the risk of causing problems in applications that expect input focus to be in a specific location when certain actions are taken.

An accessibility service can determine which user interface element has Input Focus or Accessibility Focus using the AccessibilityNodeInfo.findFocus() method. You can also search for elements that can receive Input Focus using the focusSearch() method. Finally, your accessibility service can set Accessibility Focus using the performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS) method.
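
As a rough illustration, the following Kotlin sketch uses these calls to align the two focus types. The helper name syncAccessibilityFocusToInputFocus is hypothetical, and the synchronization strategy shown is only one possible approach:

Kotlin

import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical helper: moves Accessibility Focus to the node that currently
// holds Input Focus, starting the search from a given root node.
fun syncAccessibilityFocusToInputFocus(rootNode: AccessibilityNodeInfo) {
    val inputFocused = rootNode.findFocus(AccessibilityNodeInfo.FOCUS_INPUT)
    val accessibilityFocused = rootNode.findFocus(AccessibilityNodeInfo.FOCUS_ACCESSIBILITY)

    if (inputFocused != null && inputFocused != accessibilityFocused) {
        // ACTION_ACCESSIBILITY_FOCUS places Accessibility Focus on the node.
        inputFocused.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS)
    }
}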

Gather information

Accessibility services also have standard methods of gathering and representing key units of user-provided information, such as event details, text, and numbers.

Get event details

The Android system provides information to accessibility services about the user interface interaction through AccessibilityEvent objects. Prior to Android 4.0, the information available in an accessibility event, while providing a significant amount of detail about a user interface control selected by the user, offered limited contextual information. In many cases, this missing context information might be critical to understanding the meaning of the selected control.

An example of an interface where context is critical is a calendar or day planner. If the user selects a 4:00 PM time slot in a Monday to Friday day list and the accessibility service announces “4 PM”, but does not announce the weekday name, the day of the month, or the month name, the resulting feedback is confusing. In this case, the context of a user interface control is critical to a user who wants to schedule a meeting.

Android 4.0 significantly extends the amount of information that an accessibility service can obtain about a user interface interaction by composing accessibility events based on the view hierarchy. A view hierarchy is the set of user interface components that contain the component where the event originated (its parents) and the user interface elements that may be contained by that component (its children). In this way, the Android system can provide much richer detail about accessibility events, allowing accessibility services to provide more useful feedback to users.

An accessibility service gets information about a user interface event through an AccessibilityEvent passed by the system to the service's onAccessibilityEvent() callback method. This object provides details about the event, including the type of object being acted upon, its descriptive text, and other details. Starting in Android 4.0 (and supported in previous releases through the AccessibilityEventCompat object in the Support Library), you can obtain additional information about the event using these calls:

  • AccessibilityEvent.getRecordCount() and getRecord(int) - These methods allow you to retrieve the set of AccessibilityRecord objects which contributed to the AccessibilityEvent passed to you by the system. This level of detail provides more context for the event that triggered your accessibility service.
  • AccessibilityEvent.getSource() - This method returns an AccessibilityNodeInfo object, which allows you to query the view layout hierarchy (parents and children) of the component that originated the accessibility event. This feature allows an accessibility service to investigate the full context of an event, including the content and state of any enclosing views or child views. A traversal sketch follows this list.

    Important: The ability to investigate the view hierarchy from an AccessibilityEvent potentially exposes private user information to your accessibility service. For this reason, your service must request this level of access through the accessibility service configuration XML file, by including the canRetrieveWindowContent attribute and setting it to true. If you do not include this setting in your service configuration XML file, calls to getSource() fail.

    Note: In Android 4.1 (API Level 16) and higher, the getSource() method, as well as AccessibilityNodeInfo.getChild() and getParent(), return only view objects that are considered important for accessibility (views that draw content or respond to user actions). If your service requires all views, it can request them by setting the flags member of the service's AccessibilityServiceInfo instance to FLAG_INCLUDE_NOT_IMPORTANT_VIEWS.
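
The following Kotlin sketch shows one way a service might walk the hierarchy exposed by getSource(). The helper name logNodeTree, the tag, and the logging are illustrative only:

Kotlin

import android.util.Log
import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical helper: logs the class name and text of a node and all of its
// descendants, indented by depth.
fun logNodeTree(node: AccessibilityNodeInfo?, depth: Int = 0) {
    if (node == null) return
    Log.d("MY_APP_TAG", " ".repeat(depth * 2) + "${node.className}: ${node.text}")
    for (i in 0 until node.childCount) {
        logNodeTree(node.getChild(i), depth + 1)
    }
}

A service could call logNodeTree(event.source) from onAccessibilityEvent() to inspect the context surrounding an event.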

Process text

Devices running Android 8.0 (API level 26) and higher include several text-processing features that make it easier for accessibility services to identify and operate on specific units of text that appear on screen.

Hint text

Android 8.0 (API level 26) includes several methods for interacting with a text-based object's hint text, such as AccessibilityNodeInfo.getHintText(), which returns the hint text, and isShowingHintText(), which indicates whether the node's current text content is hint text.

Locations of on-screen text characters

On devices running Android 8.0 (API level 26) and higher, accessibility services can determine the screen coordinates for each visible character's bounding box within a TextView widget. Services find these coordinates by calling refreshWithExtraData(), passing in EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY as the first argument and a Bundle object as the second argument. As the method executes, the system populates the Bundle argument with a parcelable array of Rect objects. Each Rect object represents the bounding box of a particular character.
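
The following Kotlin sketch shows one way to make this request. The helper name logCharacterLocations and the logging are hypothetical, and the Bundle arguments assume the service wants the locations of every character in the node's text:

Kotlin

import android.os.Bundle
import android.util.Log
import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical helper: requests and logs the bounding box of each character
// displayed by a TextView node.
fun logCharacterLocations(node: AccessibilityNodeInfo) {
    val args = Bundle().apply {
        putInt(AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_ARG_START_INDEX, 0)
        putInt(AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_ARG_LENGTH,
                node.text?.length ?: 0)
    }
    val refreshed = node.refreshWithExtraData(
            AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY, args)
    if (refreshed) {
        // The system places the character bounding boxes in the node's extras.
        val locations = node.extras.getParcelableArray(
                AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY)
        locations?.forEachIndexed { index, bounds ->
            Log.d("MY_APP_TAG", "Character $index bounds: $bounds")
        }
    }
}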

Standardized one-sided range values

Some AccessibilityNodeInfo objects use an instance of AccessibilityNodeInfo.RangeInfo to indicate that a UI element can take on a range of values. When creating a range using RangeInfo.obtain(), or when retrieving the extreme values of the range using getMin() and getMax(), keep in mind that devices running Android 8.0 (API level 26) and higher represent one-sided ranges in a standardized manner: ranges with no minimum use Float.NEGATIVE_INFINITY to represent the minimum, and ranges with no maximum use Float.POSITIVE_INFINITY to represent the maximum.
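
For example, a service describing a slider-like element might handle one-sided ranges as in the following Kotlin sketch. The helper name describeRange and the output format are hypothetical:

Kotlin

import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical helper: describes a node's range, accounting for one-sided
// ranges as represented on Android 8.0 (API level 26) and higher.
fun describeRange(node: AccessibilityNodeInfo): String? {
    val rangeInfo = node.rangeInfo ?: return null

    val lowerBound =
            if (rangeInfo.min == Float.NEGATIVE_INFINITY) "no minimum" else rangeInfo.min.toString()
    val upperBound =
            if (rangeInfo.max == Float.POSITIVE_INFINITY) "no maximum" else rangeInfo.max.toString()

    return "Current value ${rangeInfo.current}, range: $lowerBound to $upperBound"
}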

Sample code

The API Demo project contains two samples that you can use as a starting point for building accessibility services; they are located in <sdk>/samples/<platform>/ApiDemos/src/com/example/android/apis/accessibility.