public class Model
The wrapper class for a TFLite model and a TFLite interpreter.
Note: A Model can only hold one TFLite model at a time, and always holds a TFLite interpreter instance to run it.
Nested Classes
| class | Model.Builder | This class is deprecated. Please use Model.createModel(Context, String, Options). |
| enum | Model.Device | The runtime device type used for executing classification. |
| class | Model.Options | Options for running the model. |
Public Methods
| void | close() | |
| static Model | createModel(Context context, String modelPath, Model.Options options) | Loads a model from assets and initializes the TFLite interpreter with the given options. |
| static Model | createModel(Context context, String modelPath) | Loads a model from assets and initializes the TFLite interpreter. |
| MappedByteBuffer | getData() | Returns the memory-mapped model data. |
| Tensor | getInputTensor(int inputIndex) | Gets the Tensor associated with the provided input index. |
| Tensor | getOutputTensor(int outputIndex) | Gets the Tensor associated with the provided output index. |
| int[] | getOutputTensorShape(int outputIndex) | Returns the output shape. |
| String | getPath() | Returns the path of the model file stored in Assets. |
| void | run(Object[] inputs, Map<Integer, Object> outputs) | Runs model inference on multiple inputs, and returns multiple outputs. |
From class java.lang.Object
| boolean | equals(Object arg0) |
| final Class<?> | getClass() |
| int | hashCode() |
| final void | notify() |
| final void | notifyAll() |
| String | toString() |
| final void | wait(long arg0, int arg1) |
| final void | wait(long arg0) |
| final void | wait() |
static Model createModel(Context context, String modelPath, Model.Options options)
Loads a model from assets and initializes the TFLite interpreter with the given options.
Parameters
| context | The App Context. |
| modelPath | The path of the model file. |
| options | The options for running the model. |
Throws
| IOException | if any exception occurs when opening the model file. |
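A minimal sketch of loading a model with non-default options; the asset name "model.tflite" and the thread count are assumptions for illustration:

```java
import android.content.Context;
import java.io.IOException;
import org.tensorflow.lite.support.model.Model;

final class CreateModelWithOptions {
  // Loads "model.tflite" (a placeholder asset name) on the CPU with 4 threads.
  // Model.Device.GPU or Model.Device.NNAPI can be set the same way.
  static Model load(Context context) throws IOException {
    Model.Options options =
        new Model.Options.Builder()
            .setDevice(Model.Device.CPU)
            .setNumThreads(4)
            .build();
    return Model.createModel(context, "model.tflite", options);
  }
}
```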
static Model createModel(Context context, String modelPath)
Loads a model from assets and initializes the TFLite interpreter.
The default options are: (1) CPU device; (2) one thread.
Parameters
| context | The App Context. |
| modelPath | The path of the model file. |
Throws
| IOException | if any exception occurs when opening the model file. |
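A minimal lifecycle sketch with the default options; the asset name "mobilenet_v1.tflite" is a placeholder:

```java
import android.content.Context;
import java.io.IOException;
import org.tensorflow.lite.support.model.Model;

final class CreateModelDefault {
  static void demo(Context context) throws IOException {
    // Defaults: CPU device, one thread.
    Model model = Model.createModel(context, "mobilenet_v1.tflite");
    try {
      // ... run inference; see run(Object[], Map) below ...
    } finally {
      model.close(); // release the interpreter when done
    }
  }
}
```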
MappedByteBuffer getData()
Returns the memory-mapped model data.
Tensor getInputTensor(int inputIndex)
Gets the Tensor associated with the provided input index.
Parameters
| inputIndex | |
Throws
| IllegalStateException | if the interpreter is closed. |
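The returned Tensor can be inspected to size input buffers before inference; a sketch, assuming the input of interest is at index 0:

```java
import java.util.Arrays;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Tensor;
import org.tensorflow.lite.support.model.Model;

final class InputTensorInfo {
  // Prints the shape and data type of the input at index 0 (index is an assumption).
  static void inspect(Model model) {
    Tensor input = model.getInputTensor(0);
    int[] shape = input.shape();      // e.g. {1, 224, 224, 3}
    DataType type = input.dataType(); // e.g. FLOAT32
    System.out.println("input 0: shape=" + Arrays.toString(shape) + " type=" + type);
  }
}
```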
Tensor getOutputTensor(int outputIndex)
Gets the Tensor associated with the provided output index.
Parameters
| outputIndex | |
Throws
| IllegalStateException | if the interpreter is closed. |
int[] getOutputTensorShape(int outputIndex)
Returns the output shape. Useful if the output shape is only determined when the graph is created.
Parameters
| outputIndex | |
Throws
| IllegalStateException | if the interpreter is closed. |
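A common use is allocating an output buffer to match a shape only known at runtime; a sketch, assuming a rank-2 float32 output at index 0:

```java
import org.tensorflow.lite.support.model.Model;

final class OutputAllocation {
  // Sizes a float output buffer from the runtime shape; assumes a rank-2
  // float32 output at index 0 (e.g. {1, 1001} for a classifier).
  static float[][] allocateOutput(Model model) {
    int[] shape = model.getOutputTensorShape(0);
    return new float[shape[0]][shape[1]];
  }
}
```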
String getPath()
Returns the path of the model file stored in Assets.
void run(Object[] inputs, Map<Integer, Object> outputs)
Runs model inference on multiple inputs, and returns multiple outputs.
Parameters
| inputs | An array of input data. The inputs should be in the same order as the inputs of the model. Each input can be an array or multidimensional array, or a ByteBuffer of primitive types including int, float, long, and byte. ByteBuffer is the preferred way to pass large input data, whereas string types require using the (multi-dimensional) array input path. When a ByteBuffer is used, its content should remain unchanged until model inference is done. |
| outputs | A map from output indices to multidimensional arrays of output data or ByteBuffers of primitive types including int, float, long, and byte. It only needs to keep entries for the outputs to be used. |
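A hedged sketch of a single run, assuming one float32 image input of shape {1, 224, 224, 3} and one float32 output of shape {1, 1001}; both shapes are placeholders:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.HashMap;
import java.util.Map;
import org.tensorflow.lite.support.model.Model;

final class RunExample {
  // Runs one inference; the input/output shapes below are placeholders.
  static float[][] classify(Model model) {
    // float32 image of shape {1, 224, 224, 3}: 4 bytes per element.
    ByteBuffer input =
        ByteBuffer.allocateDirect(4 * 224 * 224 * 3).order(ByteOrder.nativeOrder());
    // ... fill `input` with preprocessed pixels, then rewind before running ...
    input.rewind();

    float[][] output = new float[1][1001]; // placeholder {1, 1001} output
    Map<Integer, Object> outputs = new HashMap<>();
    outputs.put(0, output); // only the outputs that are needed require entries

    // The ByteBuffer must remain unchanged until run(...) returns.
    model.run(new Object[] {input}, outputs);
    return output;
  }
}
```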