InferenceInput.Params.Builder
public static final class InferenceInput.Params.Builder extends Object

java.lang.Object
  ↳ android.adservices.ondevicepersonalization.InferenceInput.Params.Builder
A builder for Params.
Summary
| Public constructors | |
|---|---|
| Builder(KeyValueStore keyValueStore, String modelKey) | Creates a new Builder. |
| Public methods | |
|---|---|
| InferenceInput.Params | build(): Builds the instance. |
| InferenceInput.Params.Builder | setDelegateType(int value): The delegate to run model inference. |
| InferenceInput.Params.Builder | setKeyValueStore(KeyValueStore value): A KeyValueStore where the pre-trained model is stored. |
| InferenceInput.Params.Builder | setModelKey(String value): The key of the table where the corresponding value stores a pre-trained model. |
| InferenceInput.Params.Builder | setModelType(int value): The type of the pre-trained model. |
| InferenceInput.Params.Builder | setRecommendedNumThreads(int value): The number of threads used for intra-op parallelism on CPU; must be a positive number. |
Public constructors
Builder
public Builder (KeyValueStore keyValueStore, String modelKey)
Creates a new Builder.
| Parameters | |
|---|---|
| keyValueStore | KeyValueStore: a KeyValueStore where the pre-trained model is stored. This value cannot be null. |
| modelKey | String: the key of the table where the corresponding value stores a pre-trained model. This value cannot be null. |
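For illustration only, a minimal sketch of constructing Params with just the required constructor arguments. The helper class, the MODEL_KEY value, and the assumption that the KeyValueStore is obtained elsewhere (for example, from the calling service's data store) are illustrative, not part of this API reference.

```java
import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

final class ParamsFactory {
    // Hypothetical key: it must match the table entry that actually holds
    // the pre-trained model in the KeyValueStore passed in below.
    private static final String MODEL_KEY = "my_model";

    // Both constructor arguments are required and must be non-null.
    static InferenceInput.Params buildDefaultParams(KeyValueStore modelStore) {
        return new InferenceInput.Params.Builder(modelStore, MODEL_KEY)
                .build();
    }
}
```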
Public methods
build
public InferenceInput.Params build ()
Builds the instance. The builder should not be reused after build() has been called.
| Returns | |
|---|---|
| InferenceInput.Params | This value cannot be null. |
setDelegateType
public InferenceInput.Params.Builder setDelegateType (int value)
The delegate to run model inference. If not set, the default value is InferenceInput.Params.DELEGATE_CPU.

| Parameters | |
|---|---|
| value | int: Value is InferenceInput.Params.DELEGATE_CPU. |

| Returns | |
|---|---|
| InferenceInput.Params.Builder | This value cannot be null. |
setKeyValueStore
public InferenceInput.Params.Builder setKeyValueStore (KeyValueStore value)
A KeyValueStore where the pre-trained model is stored.

| Parameters | |
|---|---|
| value | KeyValueStore: This value cannot be null. |

| Returns | |
|---|---|
| InferenceInput.Params.Builder | This value cannot be null. |
setModelKey
public InferenceInput.Params.Builder setModelKey (String value)
The key of the table where the corresponding value stores a pre-trained model.
| Parameters | |
|---|---|
| value | String: This value cannot be null. |

| Returns | |
|---|---|
| InferenceInput.Params.Builder | This value cannot be null. |
setModelType
public InferenceInput.Params.Builder setModelType (int value)
The type of the pre-trained model. If not set, the default value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE.

| Parameters | |
|---|---|
| value | int: Value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE or InferenceInput.Params.MODEL_TYPE_EXECUTORCH. |

| Returns | |
|---|---|
| InferenceInput.Params.Builder | This value cannot be null. |
setRecommendedNumThreads
public InferenceInput.Params.Builder setRecommendedNumThreads (int value)
The number of threads used for intra-op parallelism on CPU; must be a positive number. Adopters can set this field based on the model architecture. The actual number of threads used depends on system resources and other constraints.

| Parameters | |
|---|---|
| value | int: Value is 1 or greater. |

| Returns | |
|---|---|
| InferenceInput.Params.Builder | This value cannot be null. |
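Putting the setters together, a hedged sketch of configuring the optional fields before calling build(). The thread count of 4 and the choice of MODEL_TYPE_EXECUTORCH are illustrative values, not recommendations; only the constructor arguments are required, and the remaining fields fall back to their documented defaults when not set.

```java
import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

final class ExecuTorchParamsExample {
    static InferenceInput.Params create(KeyValueStore modelStore, String modelKey) {
        return new InferenceInput.Params.Builder(modelStore, modelKey)
                // Optional: defaults to DELEGATE_CPU when not set.
                .setDelegateType(InferenceInput.Params.DELEGATE_CPU)
                // Optional: defaults to MODEL_TYPE_TENSORFLOW_LITE when not set.
                .setModelType(InferenceInput.Params.MODEL_TYPE_EXECUTORCH)
                // Optional: intra-op CPU threads; must be 1 or greater.
                .setRecommendedNumThreads(4)
                // The builder should not be reused after build().
                .build();
    }
}
```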