diff --git a/docs/services/ml/_index.md b/docs/services/ml/_index.md
index 8dc3bdf771..b811ec9ac0 100644
--- a/docs/services/ml/_index.md
+++ b/docs/services/ml/_index.md
@@ -147,6 +147,179 @@ In the absence of metadata, your `.tflite_cpu` model must satisfy the following
 These requirements are satisfied by a few publicly available model architectures including EfficientDet, MobileNet, and SSD MobileNet V1.
 You can use one of these architectures or build your own.
 
+## API
+
+The MLModel service supports the following methods:
+
+{{< readfile "/static/include/services/apis/ml.md" >}}
+
+{{% alert title="Tip" color="tip" %}}
+
+The following code examples assume that you have a robot configured with an `MLModel` service, and that you add the required code to connect to your robot and import any required packages at the top of your code file.
+Go to your robot's **Code Sample** tab on the [Viam app](https://app.viam.com) for boilerplate code to connect to your robot.
+
+{{% /alert %}}
+
+### Infer
+
+Take an already ordered input tensor as an array, make an inference on the model, and return an output tensor map.
+
+{{< tabs >}}
+{{% tab name="Go" %}}
+
+**Parameters:**
+
+- `ctx` [(Context)](https://pkg.go.dev/context): A Context carries a deadline, a cancellation signal, and other values across API boundaries.
+- `tensors` [(ml.Tensors)](https://pkg.go.dev/go.viam.com/rdk@v0.11.1/ml#Tensors): The input map of tensors, as specified in the metadata.
+
+**Returns:**
+
+- [(ml.Tensors)](https://pkg.go.dev/go.viam.com/rdk@v0.11.1/ml#Tensors): The output map of tensors, as specified in the metadata, after being run through an inference engine.
+- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred.
+
+For more information, see the [Go SDK Docs](https://pkg.go.dev/go.viam.com/rdk/services/mlmodel#Service).
+
+```go {class="line-numbers linkable-line-numbers"}
+myMLModel, err := mlmodel.FromRobot(robot, "my_mlmodel_service")
+
+inputTensors := ml.Tensors{"0": tensor.New(tensor.WithShape(1, 2, 3), tensor.WithBacking([]int{1, 2, 3, 4, 5, 6}))}
+
+outputTensors, err := myMLModel.Infer(context.Background(), inputTensors)
+```
+
+{{% /tab %}}
+{{% tab name="Python" %}}
+
+**Parameters:**
+
+- `input_tensors` [(Dict[str, NDArray])](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.html): A dictionary of input flat tensors, as specified in the metadata.
+- `timeout` [(Optional\[float\])](https://docs.python.org/library/typing.html#typing.Optional): An option to set how long to wait (in seconds) before calling a time-out and closing the underlying RPC call.
+
+**Returns:**
+
+- [(`Dict[str, NDArray]`)](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.html): A dictionary of output flat tensors as specified in the metadata, after being run through an inference engine.
+
+For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/viam/services/mlmodel/client/index.html#viam.services.mlmodel.client.MLModelClient.infer).
+
+```python {class="line-numbers linkable-line-numbers"}
+import numpy as np
+
+my_mlmodel = MLModelClient.from_robot(robot=robot, name="my_mlmodel_service")
+
+nd_array = np.array([1, 2, 3], dtype=np.float64)
+input_tensors = {"0": nd_array}
+
+output_tensors = await my_mlmodel.infer(input_tensors)
+```
+
+{{% /tab %}}
+{{< /tabs >}}
+
+### Metadata
+
+Get the metadata: name, data type, expected tensor/array shape, inputs, and outputs associated with the ML model.
+
+{{< tabs >}}
+{{% tab name="Go" %}}
+
+**Parameters:**
+
+- `ctx` [(Context)](https://pkg.go.dev/context): A Context carries a deadline, a cancellation signal, and other values across API boundaries.
+
+**Returns:**
+
+- [(MLMetadata)](https://pkg.go.dev/go.viam.com/rdk@v0.11.1/services/mlmodel#MLMetadata): Struct containing the metadata of the model file, such as the name of the model, what kind of model it is, and the expected tensor/array shape and types of the inputs and outputs of the model.
+- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred.
+
+For more information, see the [Go SDK Docs](https://pkg.go.dev/go.viam.com/rdk/services/mlmodel#Service).
+
+```go {class="line-numbers linkable-line-numbers"}
+myMLModel, err := mlmodel.FromRobot(robot, "my_mlmodel_service")
+
+metadata, err := myMLModel.Metadata(context.Background())
+```
+
+{{% /tab %}}
+{{% tab name="Python" %}}
+
+**Parameters:**
+
+- `timeout` [(Optional\[float\])](https://docs.python.org/library/typing.html#typing.Optional): An option to set how long to wait (in seconds) before calling a time-out and closing the underlying RPC call.
+
+**Returns:**
+
+- [(`Metadata`)](https://python.viam.dev/autoapi/viam/gen/service/mlmodel/v1/mlmodel_pb2/index.html#viam.gen.service.mlmodel.v1.mlmodel_pb2.Metadata): Name, type, expected tensor/array shape, inputs, and outputs associated with the ML model.
+
+For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/viam/services/mlmodel/client/index.html#viam.services.mlmodel.client.MLModelClient.metadata).
+
+```python {class="line-numbers linkable-line-numbers"}
+my_mlmodel = MLModelClient.from_robot(robot=robot, name="my_mlmodel_service")
+
+metadata = await my_mlmodel.metadata()
+```
+
+{{% /tab %}}
+{{< /tabs >}}
+
+### DoCommand
+
+Execute model-specific commands that are not otherwise defined by the service API.
+For built-in service models, any model-specific commands available are covered in each model's documentation.
+If you are implementing your own ML model service and add features that have no built-in API method, you can access them with `DoCommand`.
+
+{{< tabs >}}
+{{% tab name="Go" %}}
+
+**Parameters:**
+
+- `ctx` [(Context)](https://pkg.go.dev/context): A Context carries a deadline, a cancellation signal, and other values across API boundaries.
+- `cmd` [(map\[string\]interface{})](https://go.dev/blog/maps): The command to execute.
+
+**Returns:**
+
+- [(map\[string\]interface{})](https://go.dev/blog/maps): Result of the executed command.
+- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred.
+
+```go {class="line-numbers linkable-line-numbers"}
+myMLModel, err := mlmodel.FromRobot(robot, "my_mlmodel_service")
+
+resp, err := myMLModel.DoCommand(context.Background(), map[string]interface{}{"command": "dosomething", "someparameter": 52})
+```
+
+For more information, see the [Go SDK Docs](https://pkg.go.dev/go.viam.com/rdk/resource#Resource).
+
+{{% /tab %}}
+{{% tab name="Python" %}}
+
+**Parameters:**
+
+- `command` [(Mapping[str, ValueTypes])](https://docs.python.org/3/library/stdtypes.html#typesmapping): The command to execute.
+- `timeout` [(Optional\[float\])](https://docs.python.org/library/typing.html#typing.Optional): An option to set how long to wait (in seconds) before calling a time-out and closing the underlying RPC call.
+
+**Returns:**
+
+- [(Mapping[str, ValueTypes])](https://docs.python.org/3/library/stdtypes.html#typesmapping): Result of the executed command.
+
+**Raises:**
+
+- `NotImplementedError`: Raised if the Resource does not support arbitrary commands.
+
+```python {class="line-numbers linkable-line-numbers"}
+my_mlmodel = MLModelClient.from_robot(robot=robot, name="my_mlmodel_service")
+
+my_command = {
+    "command": "dosomething",
+    "someparameter": 52
+}
+
+await my_mlmodel.do_command(my_command)
+```
+
+For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/viam/services/mlmodel/client/index.html#viam.services.mlmodel.client.MLModelClient.do_command).
+
+{{% /tab %}}
+{{< /tabs >}}
+
 ## Use the ML model service with the Viam Python SDK
 
 To use the ML model service from the [Viam Python SDK](https://python.viam.dev/), install the Python SDK using the `mlmodel` extra:
diff --git a/static/include/services/apis/ml.md b/static/include/services/apis/ml.md
index a52f7324d9..6ea0e95dcd 100644
--- a/static/include/services/apis/ml.md
+++ b/static/include/services/apis/ml.md
@@ -2,3 +2,4 @@ Method Name | Description
 ----------- | -----------
 `Infer` | Take an already ordered input tensor as an array, make an inference on the model, and return an output tensor map.
 `Metadata`| Get the metadata (such as name, type, expected tensor/array shape, inputs, and outputs) associated with the ML model.
+`DoCommand` | Send arbitrary commands to the resource.
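The new docs describe `Metadata` and `Infer` separately, but they are typically used together: read the model's reported input shape from `Metadata`, build matching tensors, and pass them to `Infer`. The following is a minimal Python sketch of that pattern, not part of the documented examples; it assumes the service name `my_mlmodel_service` from the examples above, an already-connected `robot` client, and a model whose first input has a fully specified (non-dynamic) shape, with field access following the `Metadata`/`TensorInfo` messages linked in the Python returns section.

```python
import numpy as np

from viam.services.mlmodel import MLModelClient

# Assumes `robot` is an already-connected RobotClient (see the Code Sample tab).
my_mlmodel = MLModelClient.from_robot(robot=robot, name="my_mlmodel_service")

# Read the expected input name, shape, and data type from the model's metadata.
metadata = await my_mlmodel.metadata()
first_input = metadata.input_info[0]
print(first_input.name, list(first_input.shape), first_input.data_type)

# Build a placeholder tensor that matches the reported shape and run inference.
input_tensors = {first_input.name: np.zeros(list(first_input.shape), dtype=np.float32)}
output_tensors = await my_mlmodel.infer(input_tensors)

for name, tensor in output_tensors.items():
    print(name, tensor.shape)
```

If the reported shape contains dynamic dimensions (for example `-1`), replace them with concrete sizes before building the array, and match the dtype to the `data_type` the metadata reports rather than assuming `float32`.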