Commit

Update CONTRIBUTING.md to use "KServe" not "kserve"
ca-scribner authored Mar 4, 2024
1 parent 4b7504d commit e85a747
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions CONTRIBUTING.md
@@ -1,18 +1,18 @@
# kserve's server rocks
# KServe's server rocks

## Summary of upstream's dockerfiles

The kserve server images are a collection of different inference server runtimes, such as sklearn or paddle. This repo includes rocks for the upstream server images located [here](https://github.com/kserve/kserve/tree/master/python). These server images all have the following common traits:
The KServe server images are a collection of different inference server runtimes, such as sklearn or paddle. This repo includes rocks for the upstream server images located [here](https://github.com/kserve/kserve/tree/master/python). These server images all have the following common traits:

* they are implemented as python packages and use [poetry](https://python-poetry.org/) to manage their dependencies
* each server installs its own server-specific package (ex: [sklearn](https://github.com/kserve/kserve/tree/master/python/sklearnserver))
* they all install a common [kserve](https://github.com/kserve/kserve/tree/master/python/kserve) package
* they all install a common [KServe](https://github.com/kserve/kserve/tree/master/python/kserve) package

The Dockerfile for each of these images takes advantage of how each server is defined as a poetry package, using `poetry install` in the Dockerfile directly.
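As a rough sketch of what that looks like (illustrative only, not the upstream Dockerfile verbatim; the `sklearnserver` directory is just one example), the build for a server image amounts to installing the server's poetry package straight from the source tree:

```
# Illustrative approximation of the upstream approach: install the
# server's poetry package (and its kserve dependency) system-wide.
pip install poetry
cd python/sklearnserver                 # one server package in kserve/kserve
poetry config virtualenvs.create false  # install into the system environment
poetry install                          # pulls in kserve plus server-specific deps
```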

## Implementation details of the ROCKs in this repo

The ROCKs for the kserve servers require some atypical workarounds, mostly due to the upstream project using poetry to install its dependencies. These are documented here in detail, and briefly noted in the rockcraft.yaml files in this repository.
The ROCKs for the KServe servers require some atypical workarounds, mostly due to the upstream project using poetry to install its dependencies. These are documented here in detail, and briefly noted in the rockcraft.yaml files in this repository.

### Installing Python/pip via overlay-packages

@@ -25,7 +25,7 @@ As a workaround, we use python/pip from the `overlay-packages`, which somehow ma
By listing `python3.10` and `python3-pip` in `overlay-packages`, we have rockcraft promote python/pip to the final rock, but rockcraft **does not automatically migrate any python packages we have installed**. As a workaround, we manually copy the contents of `/usr/local/lib/python3.10/dist-packages` to `$CRAFT_PART_INSTALL/usr/local/lib/python3.10/dist-packages` (which will be rendered to `/usr/local/lib/python3.10/dist-packages` in the final rock).
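A minimal sketch of that copy step, as it could appear in a part's `override-build` script (the paths come from the paragraph above; the exact scripts in the rockcraft.yaml files here may differ):

```
# Copy the python packages installed in the build environment into the
# part's install directory so they are carried into the final rock.
mkdir -p $CRAFT_PART_INSTALL/usr/local/lib/python3.10
cp -r /usr/local/lib/python3.10/dist-packages \
      $CRAFT_PART_INSTALL/usr/local/lib/python3.10/dist-packages
```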


### Installing kserve/server-specific package via a dummy poetry package
### Installing `kserve`/server-specific package via a dummy poetry package

When you install a local package using `poetry install`, poetry installs the root package (eg: the package you have the code for locally) as editable (equivalent to doing `pip install -e /my/local/package`), while the package's dependencies are installed as non-editable (the default `pip` behaviour). Normally installed packages have their code copied into `/usr/local/lib/python3.10/dist-packages`, but editable packages are not copied to this directory and instead just point at the local folder you installed them from. Because we are in the rock's build environment when we do `poetry install`, the package is installed pointing to its location in the build environment (eg: `/root/parts/mypart/build/mycode`) and not to the final rock environment. The result is that the package is not actually included in the final rock.
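One way to see this during a build (an illustrative check only, assuming the sklearn server's package imports as `sklearnserver`; it is not something the rockcraft.yaml files actually run):

```
# Inside the rock's build environment, after `poetry install`:
python3 -c "import sklearnserver; print(sklearnserver.__file__)"
# The printed path points into the build tree (eg: /root/parts/mypart/build/mycode),
# not /usr/local/lib/python3.10/dist-packages, because the root package was
# installed as editable; only its dependencies land in dist-packages.
```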

@@ -79,7 +79,7 @@ The upstream install procedure results in `python` being executable, but our roc

For every inference server provided, upstream maintains an example usage in their [Model Serving Runtimes docs](https://kserve.github.io/website/master/modelserving/v1beta1/serving_runtime/). Each example includes a model for the given server; for example, the [`Scikit-learn`](https://kserve.github.io/website/master/modelserving/v1beta1/sklearn/v2/#deploy-the-model-with-rest-endpoint-through-inferenceservice/) runtime has a model provided at `gs://kfserving-examples/models/sklearn/1.0/model`.
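For instance, the sklearn example model above can be pulled locally before pointing a server rock at it (a sketch; assumes `gsutil` is available):

```
# Fetch the upstream example model for the sklearn server
gsutil cp -r gs://kfserving-examples/models/sklearn/1.0/model ./model
```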

While the upstream examples show how to use these models in kserve itself, we can use the same models to test the inference server rocks directly. For example, we can do:
While the upstream examples show how to use these models in KServe itself, we can use the same models to test the inference server rocks directly. For example, we can do:

Launch the server with:
```
