Metrics are the main type of information extracted from your benchmarks. They are passed to the measureRepeated function as a List, which lets you specify multiple measured metrics at once. At least one type of metric is required for the benchmark to run.
The following code snippet captures frame timing and custom trace section metrics.
benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(
        FrameTimingMetric(),
        TraceSectionMetric("RV CreateView"),
        TraceSectionMetric("RV OnBindView"),
    ),
    // ...
)
benchmarkRule.measureRepeated(
    /* packageName */ TARGET_PACKAGE,
    /* metrics */ Arrays.asList(
        new StartupTimingMetric(),
        new TraceSectionMetric("RV CreateView"),
        new TraceSectionMetric("RV OnBindView")
    ),
    /* iterations */ 5
    // ...
);
Benchmark results are output to Android Studio, as shown in the following image. If multiple metrics are defined, all of them are combined in the output.
StartupTimingMetric captures app startup timing metrics with the following values:
timeToInitialDisplayMs: The amount of time from when the system receives a launch intent to when it renders the first frame of the destination activity.
timeToFullDisplayMs: The amount of time from when the system receives a launch intent to when the application reports fully drawn using the reportFullyDrawn() method. The measurement stops at the completion of rendering the first frame after, or containing, the reportFullyDrawn() call. This measurement might not be available on Android 10 (API level 29) and lower.
For more information about what contributes to application startup time, see App startup time.
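As an illustration, a minimal startup benchmark using StartupTimingMetric might look like the following sketch; the class name and the TARGET_PACKAGE constant are placeholders for your own test class and target app:

```kotlin
// Sketch of a cold-startup macrobenchmark. StartupBenchmark and
// TARGET_PACKAGE are placeholder names.
@RunWith(AndroidJUnit4::class)
class StartupBenchmark {
    @get:Rule
    val benchmarkRule = MacrobenchmarkRule()

    @Test
    fun coldStartup() = benchmarkRule.measureRepeated(
        packageName = TARGET_PACKAGE,
        metrics = listOf(StartupTimingMetric()),
        iterations = 5,
        startupMode = StartupMode.COLD
    ) {
        // Start from the home screen, then launch the default
        // activity and wait for its first frame.
        pressHome()
        startActivityAndWait()
    }
}
```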
Improve startup timing accuracy
The two key metrics for measuring app startup times are time to initial display (TTID) and time to full display (TTFD). TTID is the time it takes to display the first frame of the application UI. TTFD also includes the time to display any content that is loaded asynchronously after the initial frame is displayed.
TTFD is reported once the reportFullyDrawn() method of the activity is called. If reportFullyDrawn() is never called, TTID is reported instead.
You might need to delay calling reportFullyDrawn() until after asynchronous
loading is complete. For example, if the UI contains a dynamic list such as a
lazy list, it might be populated by a background task that completes after the
list is first drawn and, therefore, after the UI is marked as fully drawn. In
such cases, the list population isn't included in the benchmarking.
To include the list population as part of your benchmark timing, get the
FullyDrawnReporter from your activity and add a reporter to it in your app
code. Release the reporter once the background task finishes populating the
list. The FullyDrawnReporter doesn't call reportFullyDrawn() until all added
reporters are released, so the startup timing data includes the time it takes
to populate the list. This doesn't change the app's behavior for the user.
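A minimal sketch of this pattern, assuming an androidx.activity ComponentActivity; loadListData() is a placeholder for your own background loading logic:

```kotlin
// Sketch: delays reportFullyDrawn() until the list data is loaded.
// loadListData() is a placeholder suspend function.
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Hold back the fully-drawn report until we release this reporter.
        fullyDrawnReporter.addReporter()
        lifecycleScope.launch {
            loadListData() // populate the lazy list in the background
            // All reporters released: the activity now reports fully drawn.
            fullyDrawnReporter.removeReporter()
        }
    }
}
```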
If your app uses Jetpack Compose, you can use the following APIs to indicate fully drawn state:
ReportDrawn: Indicates that your composable is immediately ready for interaction.
ReportDrawnWhen: Takes a predicate, such as list.count > 0, to indicate when your composable is ready for interaction.
ReportDrawnAfter: Takes a suspending function that, when it completes, indicates that your composable is ready for interaction.
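For example, a sketch using ReportDrawnWhen; the ItemList composable and its items parameter are illustrative names:

```kotlin
// Sketch: marks the UI fully drawn only once the list has content.
// ItemList and items are placeholder names.
@Composable
fun ItemList(items: List<String>) {
    ReportDrawnWhen { items.isNotEmpty() }
    LazyColumn {
        items(items) { item ->
            Text(text = item)
        }
    }
}
```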
FrameTimingMetric captures timing information from frames produced by a benchmark, such as a scrolling interaction or an animation, and outputs the following values:
frameOverrunMs: The amount of time a given frame misses its deadline by. Positive numbers indicate a dropped frame and visible jank or stutter. Negative numbers indicate how much faster a frame is than the deadline. Note: This is available only on Android 12 (API level 31) and higher.
frameDurationCpuMs: The amount of time the frame takes to be produced on the CPU, on both the UI thread and the RenderThread.
These measurements are collected in a distribution: 50th, 90th, 95th, and 99th percentile.
For more information about how to identify and improve slow frames, see Slow rendering.
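As a sketch, frame timing during a fling over a list might be captured as follows; the "recycler" resource ID and TARGET_PACKAGE are placeholder names for your own app:

```kotlin
// Sketch: measures frame timing while flinging a list.
// "recycler" is a placeholder resource ID in the target app.
benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(FrameTimingMetric()),
    iterations = 5,
) {
    startActivityAndWait()
    val list = device.findObject(By.res(TARGET_PACKAGE, "recycler"))
    // Leave a margin so the gesture doesn't trigger system navigation.
    list.setGestureMargin(device.displayWidth / 5)
    list.fling(Direction.DOWN)
    device.waitForIdle()
}
```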
TraceSectionMetric captures the amount of time taken by a trace section matching the provided sectionName and outputs the minimum, median, and maximum time in milliseconds. The trace section is defined either by the function call trace(sectionName) or by the code between Trace.beginSection(sectionName) and Trace.endSection(), or their async variants. It always selects the first instance of a trace section captured during a measurement.
For more information about tracing, see Overview of system tracing and Define custom events.
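For illustration, a custom trace section that TraceSectionMetric("RV CreateView") can match could be defined in app code with the androidx.tracing trace() helper; MyViewHolder is a placeholder for your own view holder:

```kotlin
// Sketch: wraps view-holder creation in a named trace section so that
// TraceSectionMetric("RV CreateView") can measure it.
// MyViewHolder is a placeholder class.
import android.view.ViewGroup
import androidx.tracing.trace

fun createViewHolder(parent: ViewGroup): MyViewHolder {
    return trace("RV CreateView") {
        MyViewHolder.create(parent)
    }
}
```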
PowerMetric captures the change in power or energy over the duration of your
test for the provided power categories. Each selected category is broken down
into its measurable subcomponents, and unselected categories are added to the
"unselected" metric. These metrics measure system-wide consumption, not
per-app consumption, and are limited to Pixel 6 and Pixel 6 Pro devices.
power<category>Uw: The rate at which energy is consumed over the duration of your test in this category, in microwatts.
energy<category>Uws: The total amount of energy consumed over the duration of your test in this category, in microwatt-seconds.
Categories include the following:
With some categories, like
CPU, it might be difficult to separate work done by
other processes from work done by your own app. Minimize the interference
by removing or restricting unnecessary apps and accounts.
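As a sketch, capturing energy for the CPU category might be configured as follows; this assumes PowerMetric's Energy measurement type with a per-category display level, and TARGET_PACKAGE is a placeholder:

```kotlin
// Sketch: captures energy for the CPU category, broken down into its
// subcomponents; all other categories land in the "unselected" metric.
// Requires a supported device (Pixel 6 / Pixel 6 Pro).
benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(
        PowerMetric(
            PowerMetric.Energy(
                categories = mapOf(
                    PowerCategory.CPU to PowerCategoryDisplayLevel.BREAKDOWN
                )
            )
        )
    ),
    iterations = 5,
) {
    // Placeholder interaction to exercise the app while measuring.
    startActivityAndWait()
}
```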