
Example showing how to run TorchServe's backend code only (#2605)
* Example showing how to run TorchServe backend code only

* review comments

* include a screenshot of debugger

* Works with pytest too

agunapal committed Sep 21, 2023
1 parent d0ae857 commit ab69b69
Showing 5 changed files with 138 additions and 2 deletions.
4 changes: 4 additions & 0 deletions docs/getting_started.md
@@ -170,6 +170,10 @@ All the logs you've seen as output to stdout related to model registration, mana

High level performance data like Throughput or Percentile Precision can be generated with [Benchmark](https://github.com/pytorch/serve/tree/master/benchmarks/README.md) and visualized in a report.

## Debugging Handler Code

If you want to debug your handler code, you can run just TorchServe's backend, which lets you use any Python debugger. You can refer to the example described [here](../examples/image_classifier/resnet_18/README.md#debug-torchserve-backend).
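The idea can be sketched without any TorchServe machinery at all: call the handler's lifecycle methods directly instead of going through the frontend, so an ordinary debugger can step through them. `DummyHandler` and the byte payload below are illustrative stand-ins, not part of TorchServe:

```python
# Sketch: "backend only" debugging means calling the handler's lifecycle
# methods directly, so any ordinary Python debugger can step through them.
# DummyHandler is a stand-in for a real TorchServe handler class.
class DummyHandler:
    def initialize(self, ctx):
        # A real handler would load its model here using paths from ctx.
        self.initialized = True

    def handle(self, data, ctx):
        # A real handler would run preprocess -> inference -> postprocess;
        # here we just return one value per request in the batch.
        return [len(item["data"]) for item in data]

handler = DummyHandler()
ctx = {"model_dir": "."}  # a real run would pass a (mock) context object
handler.initialize(ctx)
result = handler.handle([{"data": b"raw request bytes"}], ctx)
```

Because nothing here forks a worker process, you can set breakpoints anywhere in `initialize` or `handle` from your IDE or `pdb`.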

### Contributing

If you plan to develop with TorchServe and change some source code, follow the [contributing guide](https://github.com/pytorch/serve/blob/master/CONTRIBUTING.md).
64 changes: 62 additions & 2 deletions examples/image_classifier/resnet_18/README.md
@@ -35,8 +35,8 @@ curl http://127.0.0.1:8080/predictions/resnet-18 -T ./examples/image_classifier/
example_input = torch.rand(1, 3, 224, 224)
traced_script_module = torch.jit.trace(model, example_input)
traced_script_module.save("resnet-18.pt")
```

* Use the following commands to register the ResNet-18 TorchScript model on TorchServe and run image prediction

```bash
@@ -46,3 +46,63 @@ curl http://127.0.0.1:8080/predictions/resnet-18 -T ./examples/image_classifier/
torchserve --start --model-store model_store --models resnet-18=resnet-18.mar
curl http://127.0.0.1:8080/predictions/resnet-18 -T ./serve/examples/image_classifier/kitten.jpg
```

### Debug TorchServe Backend

If you want to test your handler code, you can use the example in `debugging_backend/test_handler.py`:

```bash
python debugging_backend/test_handler.py --batch_size 2
```

results in:

```
Torch TensorRT not enabled
DEBUG:ts.torch_handler.base_handler:Model file /home/ubuntu/serve/examples/image_classifier/resnet_18/resnet-18.pt loaded successfully
INFO:__main__:Result is [{'tabby': 0.4096629023551941, 'tiger_cat': 0.34670525789260864, 'Egyptian_cat': 0.13002872467041016, 'lynx': 0.02391958236694336, 'bucket': 0.011532173492014408}, {'tabby': 0.4096629023551941, 'tiger_cat': 0.34670525789260864, 'Egyptian_cat': 0.13002872467041016, 'lynx': 0.02391958236694336, 'bucket': 0.011532173492014408}]
```
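The test builds its input in the same list-of-dicts shape that TorchServe's frontend passes to a handler: one dict per request in the batch, with the raw request bytes under the `data` key. A minimal sketch of that batching step (the payload bytes are illustrative):

```python
# Sketch: TorchServe hands a handler one {"data": <raw bytes>} dict per
# request in the batch; the payload bytes below are illustrative.
def make_batch(payload: bytes, batch_size: int) -> list:
    return [{"data": payload} for _ in range(batch_size)]

# Two identical requests, as produced by --batch_size 2
batch = make_batch(b"\x89PNG fake image bytes", 2)
```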
If this doesn't work, you can use a debugger to find the problem in your backend handler code. Once you are confident it works, you can use the same handler to deploy the model with TorchServe.

Below is a screenshot of the debugger running with this handler:

![image info](./debugging_backend/debugger_screenshot.png)

You can also use this with pytest:
```bash
pytest debugging_backend/test_handler.py
```

results in:

```
================================================================================== test session starts ===================================================================================
platform linux -- Python 3.8.18, pytest-7.3.1, pluggy-1.0.0
rootdir: /home/ubuntu/serve
plugins: mock-3.10.0, anyio-3.6.1, cov-4.1.0, hypothesis-6.54.3
collected 1 item

debugging_backend/test_handler.py . [100%]

==================================================================================== warnings summary ====================================================================================
../../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/ts/torch_handler/base_handler.py:13
/home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/ts/torch_handler/base_handler.py:13: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import packaging

../../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
/home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)

../../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
../../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
/home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('ruamel')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================= 1 passed, 4 warnings in 2.29s ==============================================================================
```
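The assertion in `test_handler.py` reduces the handler's score dictionary to its arg-max label. That check in isolation (the scores below are made up, not real model output):

```python
# Sketch: pick the top-1 label from a handler result such as
# {'tabby': 0.41, 'tiger_cat': 0.35, ...}. Scores here are illustrative.
scores = {"tabby": 0.41, "tiger_cat": 0.35, "Egyptian_cat": 0.13}
top_label = max(scores, key=scores.get)  # key with the highest score
```

The same pattern works for any classifier handler whose postprocess step returns a label-to-probability mapping per request.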
examples/image_classifier/resnet_18/debugging_backend/debugger_screenshot.png (binary file not shown)
70 changes: 70 additions & 0 deletions examples/image_classifier/resnet_18/debugging_backend/test_handler.py
@@ -0,0 +1,70 @@
import argparse
import io
import logging
from pathlib import Path

import torch

from ts.torch_handler.image_classifier import ImageClassifier
from ts.torch_handler.unit_tests.test_utils.mock_context import MockContext

logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.DEBUG)

CURR_FILE_PATH = Path(__file__).parent.absolute()
REPO_ROOT_DIR = CURR_FILE_PATH.parents[3]
EXAMPLE_ROOT_DIR = REPO_ROOT_DIR.joinpath("examples", "image_classifier", "resnet_18")
TEST_DATA = REPO_ROOT_DIR.joinpath("examples", "image_classifier", "kitten.jpg")
MODEL_PT_FILE = "resnet-18.pt"


def prepare_data(batch_size):
    """
    Prepare a batch of requests based on the desired batch size
    """
    # Read the test image once; a context manager closes the file promptly
    with io.open(TEST_DATA, "rb", buffering=0) as f:
        read_data = f.read()
    data = []
    for i in range(batch_size):
        tmp = {}
        tmp["data"] = read_data
        data.append(tmp)
    return data


def test_resnet18(batch_size=1):
    # Define your handler
    handler = ImageClassifier()

    # Context definition
    ctx = MockContext(
        model_pt_file=MODEL_PT_FILE,
        model_dir=EXAMPLE_ROOT_DIR.as_posix(),
        model_file=None,
    )

    torch.manual_seed(42 * 42)
    handler.initialize(ctx)
    handler.context = ctx

    data = prepare_data(batch_size)

    # Here we are using the BaseHandler's handle method. You can define your own
    result = handler.handle(data, ctx)
    logger.info(f"Result is {result}")

    # Can be used with pytest
    value = max(result[0], key=result[0].get)
    assert value == "tabby"


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--batch_size",
        default=1,
        type=int,
        help="Batch size for testing inference",
    )
    args = parser.parse_args()
    test_resnet18(args.batch_size)
2 changes: 2 additions & 0 deletions ts/torch_handler/unit_tests/test_utils/mock_context.py
@@ -2,11 +2,13 @@
Mocks for adding model context without loading all of Torchserve
"""

import os
import uuid

import torch

from ts.metrics.metrics_store import MetricsStore
from ts.utils.util import get_yaml_config


class MockContext:
