Update examples (#42)
* Update batch inference example

* remove truefoundry from core requirements

* Minor improvements to docker deploy example

* Add deployment-requirements.txt

* Improve deploy-ml-model example

* Fix readme
chiragjn authored Sep 9, 2024
1 parent 6c122a6 commit 44b42ea
Showing 17 changed files with 192 additions and 96 deletions.
3 changes: 3 additions & 0 deletions batch-inference/README.md
@@ -54,6 +54,9 @@ env={

4. Deploy!

> Please refer to the following docs
> - [Getting workspace FQN](https://docs.truefoundry.com/docs/key-concepts#getting-workspace-fqn)
```shell
python deploy.py --workspace_fqn <Workspace FQN>
```
1 change: 1 addition & 0 deletions batch-inference/deployment-requirements.txt
@@ -0,0 +1 @@
truefoundry[ml]>=0.2.0,<1.0.0
1 change: 0 additions & 1 deletion batch-inference/requirements.txt
@@ -5,4 +5,3 @@ pandas==2.2.2
torch==2.2.1; sys_platform != 'linux'
torch==2.2.1+cu121; sys_platform == 'linux'
transformers==4.40.2
truefoundry[ml]>=0.2.0,<1.0.0
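The two requirements diffs above implement the commit bullet "remove truefoundry from core requirements": the SDK needed only to run `deploy.py` moves into the new `deployment-requirements.txt`, while `requirements.txt` keeps only runtime dependencies. A minimal sketch of the invariant this split establishes — the requirement specs are copied from the diff, but the `package_name` helper is hypothetical and not part of the repo:

```python
def package_name(spec: str) -> str:
    """Extract the bare distribution name from a simple requirement spec."""
    for sep in ("[", ">=", "==", "<", ";", " "):
        spec = spec.split(sep)[0]
    return spec.strip()

# Runtime deps stay in batch-inference/requirements.txt ...
runtime = [
    "pandas==2.2.2",
    "torch==2.2.1; sys_platform != 'linux'",
    "transformers==4.40.2",
]
# ... while the deploy-time SDK moves to deployment-requirements.txt.
deployment = ["truefoundry[ml]>=0.2.0,<1.0.0"]

# The split holds: truefoundry no longer appears among runtime packages.
assert "truefoundry" not in {package_name(s) for s in runtime}
assert "truefoundry" in {package_name(s) for s in deployment}
```

Keeping deploy-time tooling out of the service image this way tends to shrink builds and avoids shipping credentials-bearing SDKs into the runtime container.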
18 changes: 6 additions & 12 deletions batch-inference/truefoundry.yaml
@@ -10,10 +10,8 @@ image:
type: tfy-python-buildpack
python_version: '3.11'
requirements_path: requirements.txt
command: python batch_infer.py --input_bucket_name {{input_bucket_name}} --input_path
{{input_path}} --output_bucket_name {{output_bucket_name}} --output_path {{output_path}}
--batch_size {{batch_size}}
docker_registry: null
command: >-
python batch_infer.py --input_bucket_name {{input_bucket_name}} --input_path {{input_path}} --output_bucket_name {{output_bucket_name}} --output_path {{output_path}} --batch_size {{batch_size}}
env:
AWS_ACCESS_KEY_ID: tfy-secret://your-secret-group-name/AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY: tfy-secret://your-secret-group-name/AWS_SECRET_ACCESS_KEY
@@ -36,13 +34,9 @@ params:
param_type: string
default: '4'
resources:
cpu_limit: 0.5
cpu_request: 0.5
devices: null
ephemeral_storage_limit: 50000
ephemeral_storage_request: 10000
gpu_count: 0
memory_limit: 4000
cpu_limit: 2
memory_request: 1000
node: null
shared_memory_size: null
memory_limit: 4000
ephemeral_storage_request: 10000
ephemeral_storage_limit: 50000
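The rewritten `resources` block drops the null-valued fields and keeps explicit requests and limits. A hypothetical sanity check — not part of the repo — for the invariant that every `*_request` stays within its matching `*_limit`; the values mirror those visible in the diff, treated here as opaque numbers in the platform's units:

```python
# Resource values as they appear in the updated truefoundry.yaml.
resources = {
    "cpu_limit": 2,
    "memory_request": 1000,
    "memory_limit": 4000,
    "ephemeral_storage_request": 10000,
    "ephemeral_storage_limit": 50000,
}

def over_limit(res: dict) -> list:
    """Return the *_request keys whose value exceeds the matching *_limit."""
    bad = []
    for key, value in res.items():
        if key.endswith("_request"):
            limit = res.get(key.replace("_request", "_limit"))
            if limit is not None and value > limit:
                bad.append(key)
    return bad

# The updated block passes: every request is within its limit.
assert over_limit(resources) == []
```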
125 changes: 71 additions & 54 deletions deploy-ml-model/Deploy_your_first_service.ipynb
@@ -37,7 +37,7 @@
"source": [
"## Project structure\n",
"\n",
"To complete this guide, you are going to create the following **files**:\n",
"To complete this guide, you are going to review the following **files**:\n",
"\n",
"- `app.py` : contains our inference and FastAPI code\n",
"- `iris_classifier.joblib` : the model file\n",
@@ -54,25 +54,7 @@
"└── requirements.txt\n",
"```\n",
"\n",
"As you can see, all the following files are created in the same folder/directory\n",
"\n",
"**Let's create the directory which will contain all this files:-**\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "tNRHZkr90mLf",
"outputId": "8ada3cf3-38c4-4d51-8e2b-d2aebd3e0233"
},
"outputs": [],
"source": [
"%mkdir develop\n",
"%cd develop"
"As you can see, all the following files are created in the same folder/directory"
]
},
{
@@ -146,7 +128,9 @@
"\n",
"Once you run the cell below, you will get a prompt to enter your Workspace FQN. Follow the docs to\n",
"\n",
"**Create a Workspace**: https://docs.truefoundry.com/docs/key-concepts#creating-a-workspace "
"**Create a Workspace**: https://docs.truefoundry.com/docs/key-concepts#creating-a-workspace\n",
"\n",
"**Get Existing Workspace FQN**: https://docs.truefoundry.com/docs/key-concepts#getting-workspace-fqn"
]
},
{
@@ -157,7 +141,8 @@
},
"outputs": [],
"source": [
"WORKSPACE_FQN = input(\"Enter your Workspace FQN: \")"
"WORKSPACE_FQN = input(\"Enter your Workspace FQN: \")\n",
"print(f\"Workspace FQN set to {WORKSPACE_FQN!r}\")"
]
},
{
@@ -171,10 +156,9 @@
"\n",
"## Model details\n",
"\n",
"For this guide, we have already trained a model. \n",
"The given model has been trained on _iris dataset_. Then it is stored as a joblib file in [google drive](https://drive.google.com/file/d/1-9nwjs6F7cp_AhAlBAWZHMXG8yb2q_LR/view).\n",
"For this guide, we have already trained a model. The model has been trained on the _iris dataset_ and uploaded to [google drive](https://drive.google.com/file/d/1-9nwjs6F7cp_AhAlBAWZHMXG8yb2q_LR/view).\n",
"\n",
"> **Attributes** : \n",
"> **Attributes** :\n",
"> sepal length in cm, sepal width in cm, petal length in cm, petal width in cm\n",
">\n",
"> **Predicted Attribute** : \n",
@@ -243,8 +227,7 @@
"from fastapi import FastAPI\n",
"\n",
"model = joblib.load(\"iris_classifier.joblib\")\n",
"\n",
"app = FastAPI()\n",
"app = FastAPI(docs_url=\"/\", root_path=os.getenv(\"TFY_SERVICE_ROOT_PATH\"))\n",
"\n",
"@app.post(\"/predict\")\n",
"def predict(\n",
@@ -276,11 +259,12 @@
"outputs": [],
"source": [
"%%writefile requirements.txt\n",
"fastapi==0.81.0\n",
"uvicorn==0.18.3\n",
"fastapi==0.114.0\n",
"uvicorn==0.30.6\n",
"scikit-learn==1.5.0\n",
"joblib==1.3.2\n",
"pandas==2.1.0\n"
"pandas==2.2.2\n",
"numpy==1.26.4"
]
},
{
@@ -329,9 +313,16 @@
"logging.basicConfig(level=logging.INFO)\n",
"\n",
"parser = argparse.ArgumentParser()\n",
"parser.add_argument(\"--name\", required=True, type=str, help=\"Name of the application.\")\n",
"parser.add_argument(\n",
" \"--name\", \n",
" required=False, \n",
" default=\"iris-classifier-svc\",\n",
" type=str, \n",
" help=\"Name of the application.\"\n",
")\n",
"parser.add_argument(\n",
" \"--workspace_fqn\",\n",
" \"--workspace-fqn\",\n",
" required=True,\n",
" type=str,\n",
" help=\"FQN of the workspace where application will be deployed.\",\n",
@@ -342,39 +333,67 @@
" type=str,\n",
" help=\"Host where the application will be available for access. Ex:- my-app.my-org.com\",\n",
")\n",
"args = parser.parse_args()\n",
"\n",
"image = Build(\n",
" build_source=LocalSource(local_build=False),\n",
" build_spec=PythonBuild(\n",
" python_version=\"3.11\",\n",
" command=\"uvicorn app:app --port 8000 --host 0.0.0.0\",\n",
" requirements_path=\"requirements.txt\",\n",
" )\n",
"parser.add_argument(\n",
" \"--path\",\n",
" required=False,\n",
" default=None,\n",
" type=str,\n",
" help=\"Path in addition to the host where the application will be available for access, e.g. my-org.com/my-path\",\n",
")\n",
"args = parser.parse_args()\n",
"\n",
"service = Service(\n",
" name=args.name,\n",
" image=image,\n",
" ports=[Port(port=8000, host=args.host)],\n",
" # Define how to build your code into a Docker image\n",
" image=Build(\n",
" # `LocalSource` helps specify the details of your local source code.\n",
" build_source=LocalSource(local_build=False),\n",
" # `PythonBuild` helps specify the details of your Python Code.\n",
" # These details will be used to templatize a Dockerfile to build your Docker Image\n",
" build_spec=PythonBuild(\n",
" python_version=\"3.11\",\n",
" command=\"uvicorn app:app --port 8000 --host 0.0.0.0\",\n",
" requirements_path=\"requirements.txt\",\n",
" )\n",
" ),\n",
" # Set the ports your server will listen on\n",
" ports=[\n",
" # Providing a host and path value depends on the base domain urls configured in the cluster settings.\n",
" # You can learn how to find the base domain urls available to you at https://docs.truefoundry.com/docs/define-ports-and-domains#identifying-available-domains\n",
" Port(port=8000, host=args.host, path=args.path)\n",
" ],\n",
" # Define the resource constraints.\n",
" #\n",
" # Requests are the minimum amount of resources that a container needs to run.\n",
" # Limits are the maximum amount of resources that a container can use.\n",
" resources=Resources(\n",
" cpu_request=0.1,\n",
" cpu_limit=0.1,\n",
" memory_request=500,\n",
" memory_limit=500,\n",
" ),\n",
" # Define environment variables that your Service will have access to\n",
" env={\"UVICORN_WEB_CONCURRENCY\": \"1\", \"ENVIRONMENT\": \"dev\"},\n",
")\n",
"service.deploy(workspace_fqn=args.workspace_fqn, wait=False)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "I61_mM7Y447C"
},
"metadata": {},
"source": [
"Click on this [link](https://docs.truefoundry.com/recipes/deploy-fastapi-service-via-python) to understand the **`deploy.py`**:\n"
"Please refer to this [link](https://docs.truefoundry.com/docs/deploy-service-using-python-sdk) to understand more about **`deploy.py`**"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We will need an endpoint to access the deployed service. This host should follow the base domain url configured in the cluster.\n",
"\n",
"Please refer to the following docs to get the base domain url for your host:\n",
"\n",
"https://docs.truefoundry.com/docs/define-ports-and-domains#identifying-available-domains"
]
},
{
"metadata": {},
"outputs": [],
"source": [
"SERVICE_NAME = input(\"Enter the Service name\")\n",
"SERVICE_HOST = input(\n",
" \"Enter the Service Host (Can be found from cluster details in TrueFoundry UI)\"\n",
")"
"SERVICE_HOST = input(\"Enter the Service Host: \")\n",
"print(f\"Service Host set to {SERVICE_HOST!r}\")"
]
},
{
},
"outputs": [],
"source": [
"!python deploy.py --name $SERVICE_NAME --workspace_fqn $WORKSPACE_FQN --host $SERVICE_HOST"
"!python deploy.py --workspace_fqn $WORKSPACE_FQN --host $SERVICE_HOST"
]
},
{
@@ -429,9 +446,9 @@
"provenance": []
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "jupyter-base",
"language": "python",
"name": "python3"
"name": "jupyter-base"
},
"language_info": {
"codemirror_mode": {
Expand All @@ -443,7 +460,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.2"
"version": "3.11.9"
}
},
"nbformat": 4,
Expand Down
3 changes: 1 addition & 2 deletions deploy-ml-model/app.py
@@ -5,8 +5,7 @@
from fastapi import FastAPI

model = joblib.load("iris_classifier.joblib")

app = FastAPI(docs_url="/", root_path=os.getenv("TFY_SERVICE_ROOT_PATH", "/"))
app = FastAPI(docs_url="/", root_path=os.getenv("TFY_SERVICE_ROOT_PATH"))


@app.post("/predict")
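The one-line `app.py` change removes the `"/"` fallback for `TFY_SERVICE_ROOT_PATH`. The behavioral difference is plain `os.getenv` semantics; in the sketch below, only the env var name comes from the repo:

```python
import os

# Make sure the variable is unset so both lookups take their fallback path.
os.environ.pop("TFY_SERVICE_ROOT_PATH", None)

old_style = os.getenv("TFY_SERVICE_ROOT_PATH", "/")  # explicit fallback: "/"
new_style = os.getenv("TFY_SERVICE_ROOT_PATH")       # no fallback: None

assert old_style == "/"
assert new_style is None
```

With the default removed, an unset variable no longer forces a root path of `/`; the app simply passes through whatever value (or absence of one) the deployment environment provides.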
44 changes: 32 additions & 12 deletions deploy-ml-model/deploy.py
@@ -6,9 +6,10 @@
logging.basicConfig(level=logging.INFO)

parser = argparse.ArgumentParser()
parser.add_argument("--name", required=True, type=str, help="Name of the application.")
parser.add_argument("--name", required=False, default="iris-classifier-svc", type=str, help="Name of the application.")
parser.add_argument(
"--workspace_fqn",
"--workspace-fqn",
required=True,
type=str,
help="FQN of the workspace where application will be deployed.",
@@ -19,27 +20,46 @@
type=str,
help="Host where the application will be available for access. Ex:- my-app.my-org.com",
)
args = parser.parse_args()

image = Build(
build_source=LocalSource(local_build=False),
build_spec=PythonBuild(
python_version="3.11",
command="uvicorn app:app --port 8000 --host 0.0.0.0",
requirements_path="requirements.txt",
),
parser.add_argument(
"--path",
required=False,
default=None,
type=str,
    help="Path in addition to the host where the application will be available for access, e.g. my-org.com/my-path",
)
args = parser.parse_args()

service = Service(
name=args.name,
image=image,
ports=[Port(port=8000, host=args.host)],
# Define how to build your code into a Docker image
image=Build(
# `LocalSource` helps specify the details of your local source code.
build_source=LocalSource(local_build=False),
# `PythonBuild` helps specify the details of your Python Code.
        # These details will be used to templatize a Dockerfile to build your Docker Image
build_spec=PythonBuild(
python_version="3.11",
command="uvicorn app:app --port 8000 --host 0.0.0.0",
requirements_path="requirements.txt",
),
),
# Set the ports your server will listen on
ports=[
# Providing a host and path value depends on the base domain urls configured in the cluster settings.
        # You can learn how to find the base domain urls available to you at https://docs.truefoundry.com/docs/define-ports-and-domains#identifying-available-domains
Port(port=8000, host=args.host, path=args.path)
],
# Define the resource constraints.
#
# Requests are the minimum amount of resources that a container needs to run.
# Limits are the maximum amount of resources that a container can use.
resources=Resources(
cpu_request=0.1,
cpu_limit=0.1,
memory_request=500,
memory_limit=500,
),
# Define environment variables that your Service will have access to
env={"UVICORN_WEB_CONCURRENCY": "1", "ENVIRONMENT": "dev"},
)
service.deploy(workspace_fqn=args.workspace_fqn, wait=False)
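The reworked CLI can be exercised in isolation. This sketch mirrors the argument definitions from the diff — `--name` gains a default, `--workspace_fqn` also accepts the dashed spelling, and `--path` is new; the sample FQN and host values are made up:

```python
import argparse

parser = argparse.ArgumentParser()
# `--name` is now optional with a default instead of required.
parser.add_argument("--name", required=False, default="iris-classifier-svc", type=str)
# Listing both spellings lets either one be used on the command line;
# argparse stores the value under the first option's name, `workspace_fqn`.
parser.add_argument("--workspace_fqn", "--workspace-fqn", required=True, type=str)
parser.add_argument("--host", required=True, type=str)
# `--path` is new and optional.
parser.add_argument("--path", required=False, default=None, type=str)

# The dashed spelling resolves to the same `workspace_fqn` attribute.
args = parser.parse_args(
    ["--workspace-fqn", "cluster:my-workspace", "--host", "my-app.my-org.com"]
)
assert args.name == "iris-classifier-svc"
assert args.workspace_fqn == "cluster:my-workspace"
assert args.path is None
```

Accepting both spellings keeps older `--workspace_fqn` invocations (like the notebook's deploy cell) working while allowing the more conventional dashed form.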
8 changes: 4 additions & 4 deletions deploy-ml-model/requirements.txt
@@ -1,6 +1,6 @@
fastapi==0.81.0
fastapi==0.114.0
uvicorn==0.30.6
scikit-learn==1.5.0
joblib==1.3.2
pandas==2.2.2
numpy==1.26.4
pandas==2.1.0
scikit-learn==1.5.0
uvicorn==0.18.3
8 changes: 4 additions & 4 deletions docker-deploy/Dockerfile
@@ -1,6 +1,6 @@
FROM python:3.9
FROM python:3.11-slim
WORKDIR /app
COPY ./requirements.txt /tmp/
RUN pip install -U pip && pip install -r /tmp/requirements.txt
COPY . ./app
WORKDIR app
RUN pip install -U pip setuptools wheel && pip install -r /tmp/requirements.txt
COPY . /app
ENTRYPOINT python app.py