InferenceInput.Params.Builder


public static final class InferenceInput.Params.Builder
extends Object

java.lang.Object
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Params.Builder


A builder for Params.

Summary

Public constructors

Builder(KeyValueStore keyValueStore, String modelKey)

Creates a new Builder.

Public methods

InferenceInput.Params build()

Builds the instance.

InferenceInput.Params.Builder setDelegateType(int value)

The delegate to run model inference.

InferenceInput.Params.Builder setKeyValueStore(KeyValueStore value)

A KeyValueStore where the pre-trained model is stored.

InferenceInput.Params.Builder setModelKey(String value)

The key of the table where the corresponding value stores a pre-trained model.

InferenceInput.Params.Builder setModelType(int value)

The type of the pre-trained model.

InferenceInput.Params.Builder setRecommendedNumThreads(int value)

The number of threads used for intraop parallelism on CPU; must be a positive number.

Inherited methods

Object clone()

Creates and returns a copy of this object.

boolean equals(Object obj)

Indicates whether some other object is "equal to" this one.

void finalize()

Called by the garbage collector on an object when garbage collection determines that there are no more references to the object.

final Class<?> getClass()

Returns the runtime class of this Object.

int hashCode()

Returns a hash code value for the object.

final void notify()

Wakes up a single thread that is waiting on this object's monitor.

final void notifyAll()

Wakes up all threads that are waiting on this object's monitor.

String toString()

Returns a string representation of the object.

final void wait(long timeoutMillis, int nanos)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

final void wait(long timeoutMillis)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

final void wait()

Causes the current thread to wait until it is awakened, typically by being notified or interrupted.

Public constructors

Builder

Added in API level 35
public Builder (KeyValueStore keyValueStore, 
                String modelKey)

Creates a new Builder.

Parameters
keyValueStore KeyValueStore: a KeyValueStore where the pre-trained model is stored. This value cannot be null.

modelKey String: the key of the table where the corresponding value stores a pre-trained model. This value cannot be null.

Public methods

build

Added in API level 35
public InferenceInput.Params build ()

Builds the instance. This builder should not be used after calling this method.

Returns
InferenceInput.Params This value cannot be null.

setDelegateType

Added in API level 35
public InferenceInput.Params.Builder setDelegateType (int value)

The delegate to run model inference. If not set, the default value is InferenceInput.Params.DELEGATE_CPU.

Parameters
value int: Value is InferenceInput.Params.DELEGATE_CPU

Returns
InferenceInput.Params.Builder This value cannot be null.

setKeyValueStore

Added in API level 35
public InferenceInput.Params.Builder setKeyValueStore (KeyValueStore value)

A KeyValueStore where the pre-trained model is stored.

Parameters
value KeyValueStore: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelKey

Added in API level 35
public InferenceInput.Params.Builder setModelKey (String value)

The key of the table where the corresponding value stores a pre-trained model.

Parameters
value String: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelType

Added in API level 35
public InferenceInput.Params.Builder setModelType (int value)

The type of the pre-trained model. If not set, the default value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE.

Parameters
value int: Value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE or InferenceInput.Params.MODEL_TYPE_EXECUTORCH

Returns
InferenceInput.Params.Builder This value cannot be null.

setRecommendedNumThreads

Added in API level 35
public InferenceInput.Params.Builder setRecommendedNumThreads (int value)

The number of threads used for intraop parallelism on CPU; must be a positive number. Adopters can set this field based on the model architecture. The actual thread number depends on system resources and other constraints.

Parameters
value int: Value is 1 or greater

Returns
InferenceInput.Params.Builder This value cannot be null.
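A typical use of this builder chains the setters and finishes with build(). The sketch below is illustrative only: the KeyValueStore instance (here named remoteData) and the table key "model" are assumptions, and in practice would come from the on-device personalization service that invokes your code.

```java
import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

// Illustrative sketch: "remoteData" is a hypothetical KeyValueStore obtained
// from the service context, and "model" is a hypothetical table key under
// which the pre-trained model bytes are stored.
InferenceInput.Params params =
        new InferenceInput.Params.Builder(remoteData, "model")
                // Optional: these calls show the defaults and a tuning knob.
                .setDelegateType(InferenceInput.Params.DELEGATE_CPU)
                .setModelType(InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE)
                .setRecommendedNumThreads(4)
                .build();
```

Calling setDelegateType and setModelType as above is redundant with the documented defaults; they are included here only to show where non-default values would be supplied.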