
DOCS-1865: Address customer feedback #2587

Merged: 11 commits, Feb 27, 2024
Update _index.md
Co-authored-by: JessamyT <[email protected]>
npentrel and JessamyT authored Feb 27, 2024

Verified: this commit was created on GitHub.com and signed with GitHub’s verified signature.
commit 212c34fc31a0c3516c23babdd28b0e5b86b24814
2 changes (1 addition, 1 deletion) in docs/ml/_index.md
Collaborator


Out-of-diff nits:

  • Consider sentence casing the heading of step 3
  • Consider adding the word "model" to "machine learning (ML) service" in the first sentence (i.e. change it to "machine learning (ML) model service") for consistency with the rest of the doc

Collaborator Author


Isn't it already sentence case?

Collaborator Author


Oh you mean number 2

Collaborator


ooh yeah oops

@@ -49,7 +49,7 @@ Viam natively supports [TensorFlow Lite](https://www.tensorflow.org/lite) ML mod
<tr>
<td>
<b>4. Deploy your ML model</b>
-<p>To make use of ML models with your machine, you must first deploy the model using the built-in <a href="/ml/deploy/">ML model service</a>. The ML Model service will run the model and allow the vision service to use it.</p>
+<p>To make use of ML models with your machine, you must first deploy the model using the built-in <a href="/ml/deploy/">ML model service</a>. The ML model service will run the model and allow the vision service to use it.</p>
</td>
</tr>
<tr>