InferenceInput.Params.Builder


public static final class InferenceInput.Params.Builder
extends Object

java.lang.Object
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Params.Builder


A builder for Params.
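A typical use chains the optional setters before calling build(). The sketch below is illustrative: the KeyValueStore is assumed to come from the service context (typically IsolatedService#getRemoteData), and "my_model_key" is a placeholder key.

```java
import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

final class ParamsExample {
    // modelStore: the KeyValueStore holding the TFLite model, typically
    // obtained from the service (e.g. IsolatedService#getRemoteData).
    // "my_model_key" is a placeholder table key for illustration.
    static InferenceInput.Params buildParams(KeyValueStore modelStore) {
        return new InferenceInput.Params.Builder(modelStore, "my_model_key")
                // Optional: DELEGATE_CPU is already the default.
                .setDelegateType(InferenceInput.Params.DELEGATE_CPU)
                // Optional: MODEL_TYPE_TENSORFLOW_LITE is the default and
                // currently the only supported type.
                .setModelType(InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE)
                // Optional hint for intraop CPU parallelism; must be >= 1.
                .setRecommendedNumThreads(4)
                .build();
    }
}
```

Only the constructor arguments are required; each setter returns the builder, so calls can be chained in any order before build().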

Summary

Public constructors

Builder(KeyValueStore keyValueStore, String modelKey)

Creates a new Builder.

Public methods

InferenceInput.Params build()

Builds the instance.

InferenceInput.Params.Builder setDelegateType(int value)

The delegate to run model inference.

InferenceInput.Params.Builder setKeyValueStore(KeyValueStore value)

A KeyValueStore where the pre-trained model is stored.

InferenceInput.Params.Builder setModelKey(String value)

The key of the table whose corresponding value stores a pre-trained model.

InferenceInput.Params.Builder setModelType(int value)

The type of the pre-trained model.

InferenceInput.Params.Builder setRecommendedNumThreads(int value)

The number of threads used for intraop parallelism on CPU; must be a positive number.

Inherited methods

Public constructors

Builder

public Builder (KeyValueStore keyValueStore, 
                String modelKey)

Creates a new Builder.

Parameters
keyValueStore KeyValueStore: A KeyValueStore where the pre-trained model is stored. Only TFLite models are currently supported. This value cannot be null.

modelKey String: The key of the table whose corresponding value stores a pre-trained model. Only TFLite models are currently supported. This value cannot be null.

Public methods

build

public InferenceInput.Params build ()

Builds the instance.

Returns
InferenceInput.Params This value cannot be null.

setDelegateType

public InferenceInput.Params.Builder setDelegateType (int value)

The delegate to run model inference. If not set, the default value is InferenceInput.Params.DELEGATE_CPU.

Parameters
value int: Value is InferenceInput.Params.DELEGATE_CPU

Returns
InferenceInput.Params.Builder This value cannot be null.

setKeyValueStore

public InferenceInput.Params.Builder setKeyValueStore (KeyValueStore value)

A KeyValueStore where the pre-trained model is stored. Only TFLite models are currently supported.

Parameters
value KeyValueStore: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelKey

public InferenceInput.Params.Builder setModelKey (String value)

The key of the table whose corresponding value stores a pre-trained model. Only TFLite models are currently supported.

Parameters
value String: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelType

public InferenceInput.Params.Builder setModelType (int value)

The type of the pre-trained model. If not set, the default value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE, which is currently the only supported type.

Parameters
value int: Value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE

Returns
InferenceInput.Params.Builder This value cannot be null.

setRecommendedNumThreads

public InferenceInput.Params.Builder setRecommendedNumThreads (int value)

The number of threads used for intraop parallelism on CPU; must be a positive number. Adopters can set this field based on the model architecture. The actual number of threads used depends on system resources and other constraints.

Parameters
value int: Value is 1 or greater

Returns
InferenceInput.Params.Builder This value cannot be null.