InferenceInput.Builder


public static final class InferenceInput.Builder
extends Object

java.lang.Object
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Builder


A builder for InferenceInput

Summary

Public methods

InferenceInput build()

Builds the instance.

InferenceInput.Builder setBatchSize(int value)

The number of input examples.

InferenceInput.Builder setExpectedOutputStructure(InferenceOutput value)

The empty InferenceOutput representing the expected output structure.

InferenceInput.Builder setInputData(Object... value)

Note: use InferenceInput.Builder.setInputData(byte[]) instead.

InferenceInput.Builder setParams(InferenceInput.Params value)

The configuration that controls runtime interpreter behavior.

Inherited methods

Object clone()

Creates and returns a copy of this object.

boolean equals(Object obj)

Indicates whether some other object is "equal to" this one.

void finalize()

Called by the garbage collector on an object when garbage collection determines that there are no more references to the object.

final Class<?> getClass()

Returns the runtime class of this Object.

int hashCode()

Returns a hash code value for the object.

final void notify()

Wakes up a single thread that is waiting on this object's monitor.

final void notifyAll()

Wakes up all threads that are waiting on this object's monitor.

String toString()

Returns a string representation of the object.

final void wait(long timeoutMillis, int nanos)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

final void wait(long timeoutMillis)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

final void wait()

Causes the current thread to wait until it is awakened, typically by being notified or interrupted.

Public constructors

Builder

Added in API level 35
public Builder (InferenceInput.Params params, 
                Object[] inputData, 
                InferenceOutput expectedOutputStructure)

Note: use InferenceInput.Builder.Builder(Params, byte[]) instead.

Creates a new Builder for LiteRT model inference input. For LiteRT, the inputData field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9 The inputs should be in the same order as the inputs of the model.

For example, if a model takes multiple inputs:

String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
For LiteRT, the inference code verifies that the expected output structure matches the model's output signature.

If a model produces string tensors:

String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();

 

Parameters
params InferenceInput.Params: configuration that controls runtime interpreter behavior. This value cannot be null.

inputData Object[]: an array of input data. This value cannot be null.

expectedOutputStructure InferenceOutput: an empty InferenceOutput representing the expected output structure. This value cannot be null.
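The data structures the constructor expects can be assembled in plain Java before they are handed to the builder. The sketch below shows the positional input array and the empty output structure from the example above; the builder call itself is left as a comment because InferenceInput.Params (`params` here) must come from the OnDevicePersonalization runtime and is assumed, not constructed.

```java
import java.util.HashMap;

public class InferenceInputExample {
    // Positional input array; order must match the model's input signature.
    static Object[] buildInputData() {
        String[] input0 = {"foo", "bar"}; // string tensor, shape [2]
        int[] input1 = {3, 2, 1};         // int tensor, shape [3]
        return new Object[] {input0, input1};
    }

    // Empty output structure: slot 0 holds a string tensor of shape [3, 2].
    static HashMap<Integer, Object> buildOutputStructure() {
        HashMap<Integer, Object> outputs = new HashMap<>();
        outputs.put(0, new String[3][2]);
        return outputs;
    }

    public static void main(String[] args) {
        Object[] inputData = buildInputData();
        HashMap<Integer, Object> outputs = buildOutputStructure();
        // On Android, these would feed the builder, roughly:
        // InferenceInput input = new InferenceInput.Builder(params, inputData,
        //         new InferenceOutput.Builder().setDataOutputs(outputs).build())
        //     .build();
        System.out.println(inputData.length + " inputs, "
                + outputs.size() + " output tensor(s)");
    }
}
```

Keeping the assembly in helper methods makes it easy to check that the array order and output shapes match the model before any inference call is made.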

Public methods

build

Added in API level 35
public InferenceInput build ()

Builds the instance. The builder should not be reused after build() is called.

Returns
InferenceInput This value cannot be null.

setBatchSize

Added in API level 35
public InferenceInput.Builder setBatchSize (int value)

The number of input examples. Callers can set this field to run batched inference. The batch size is 1 by default and should match the size of the input data.

Parameters
value int

Returns
InferenceInput.Builder This value cannot be null.
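The requirement that the batch size match the input data size can be made concrete with a plain-Java sketch: the value passed to setBatchSize should equal the leading (batch) dimension of each input tensor. The helper name below is hypothetical; the builder call is shown only as a comment.

```java
public class BatchInferenceExample {
    // Hypothetical helper: a batch of feature vectors, shape [batchSize, 3].
    static float[][] makeBatchedInput(int batchSize) {
        return new float[batchSize][3];
    }

    public static void main(String[] args) {
        int batchSize = 4;
        float[][] input0 = makeBatchedInput(batchSize);
        Object[] inputData = {input0};
        // On Android, the batch size passed to the builder should equal
        // the leading dimension of the input tensor, e.g.:
        // builder.setBatchSize(batchSize) // must equal input0.length
        System.out.println(input0.length); // prints 4
    }
}
```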

setExpectedOutputStructure

Added in API level 35
public InferenceInput.Builder setExpectedOutputStructure (InferenceOutput value)

The empty InferenceOutput representing the expected output structure. It is only required for LiteRT models. For LiteRT, the inference code verifies that this expected output structure matches the model's output signature.

If a model produces string tensors:

String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();

Parameters
value InferenceOutput: This value cannot be null.

Returns
InferenceInput.Builder This value cannot be null.

setInputData

Added in API level 35
public InferenceInput.Builder setInputData (Object... value)

Note: use InferenceInput.Builder.setInputData(byte[]) instead.

An array of input data. The inputs should be in the same order as the inputs of the model.

For example, if a model takes multiple inputs:

String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
For LiteRT, this field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9

Parameters
value Object: This value cannot be null.

Returns
InferenceInput.Builder This value cannot be null.

setParams

Added in API level 35
public InferenceInput.Builder setParams (InferenceInput.Params value)

The configuration that controls runtime interpreter behavior.

Parameters
value InferenceInput.Params: This value cannot be null.

Returns
InferenceInput.Builder This value cannot be null.