Use semantic linefeeds for doc
Co-authored-by: Hagen Wierstorf <[email protected]>
audeerington and hagenw authored Dec 18, 2023
1 parent 3c96ffa commit 158179d
Showing 3 changed files with 24 additions and 12 deletions.
12 changes: 8 additions & 4 deletions audonnx/core/api.py
@@ -53,14 +53,18 @@ def load(
 onnxruntime chooses the number of threads
 session_options: :class:`onnxruntime.SessionOptions`
 to use for inference.
-If ``None`` the default options are used and the number of
-threads for running inference on cpu is determined
-by ``num_workers``. Otherwise, the provided options are used
+If ``None`` the default options are used
+and the number of threads
+for running inference on cpu
+is determined by ``num_workers``.
+Otherwise,
+the provided options are used
 and the ``session_options`` properties
 :attr:`~onnxruntime.SessionOptions.inter_op_num_threads`
 and :attr:`~onnxruntime.SessionOptions.intra_op_num_threads`
 determine the number of threads
-for inference on cpu and ``num_workers`` is ignored
+for inference on cpu
+and ``num_workers`` is ignored
 auto_install: install missing packages needed to create the object

 .. _`provider(s)`: https://onnxruntime.ai/docs/execution-providers/
12 changes: 8 additions & 4 deletions audonnx/core/model.py
@@ -56,14 +56,18 @@ class Model(audobject.Object):
 onnxruntime chooses the number of threads
 session_options: :class:`onnxruntime.SessionOptions`
 to use for inference.
-If ``None`` the default options are used and the number of
-threads for running inference on cpu is determined
-by ``num_workers``. Otherwise, the provided options are used
+If ``None`` the default options are used
+and the number of threads
+for running inference on cpu
+is determined by ``num_workers``.
+Otherwise,
+the provided options are used
 and the ``session_options`` properties
 :attr:`~onnxruntime.SessionOptions.inter_op_num_threads`
 and :attr:`~onnxruntime.SessionOptions.intra_op_num_threads`
 determine the number of threads
-for inference on cpu and ``num_workers`` is ignored
+for inference on cpu
+and ``num_workers`` is ignored

 Examples:
 >>> import audiofile
12 changes: 8 additions & 4 deletions audonnx/core/testing.py
@@ -45,14 +45,18 @@ def create_model(
 onnxruntime chooses the number of threads
 session_options: :class:`onnxruntime.SessionOptions`
 to use for inference.
-If ``None`` the default options are used and the number of
-threads for running inference on cpu is determined
-by ``num_workers``. Otherwise, the provided options are used
+If ``None`` the default options are used
+and the number of threads
+for running inference on cpu
+is determined by ``num_workers``.
+Otherwise,
+the provided options are used
 and the ``session_options`` properties
 :attr:`~onnxruntime.SessionOptions.inter_op_num_threads`
 and :attr:`~onnxruntime.SessionOptions.intra_op_num_threads`
 determine the number of threads
-for inference on cpu and ``num_workers`` is ignored
+for inference on cpu
+and ``num_workers`` is ignored

 Returns:
 model object
