
Commit

fix kserve storage optional package (#2537)
* fix kserve storage optional package

Signed-off-by: jagadeesh <[email protected]>

* upgrade kserve at Docker dev

Signed-off-by: jagadeesh <[email protected]>

---------

Signed-off-by: jagadeesh <[email protected]>
Co-authored-by: Ankith Gunapal <[email protected]>
Jagadeesh J and agunapal authored Aug 24, 2023
1 parent b37296d commit cd7c47e
Showing 3 changed files with 5 additions and 5 deletions.
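For context: in kserve 0.11 the storage helpers live in the kserve.storage submodule and their cloud-SDK dependencies became an optional extra, which is why both the wrapper import and the pinned requirement change below. A minimal sketch of the new import path, assuming kserve[storage]>=0.11.0 is installed (the /mnt/models path is a hypothetical example):

    import pathlib

    # kserve >= 0.11: Storage lives in kserve.storage and imports cleanly only
    # when the optional "storage" extra is installed.
    from kserve.storage import Storage

    # Hypothetical local path; Storage.download also accepts remote URIs such as s3:// or gs://.
    model_dir = pathlib.Path(Storage.download("/mnt/models"))
    print(model_dir)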
5 changes: 2 additions & 3 deletions kubernetes/kserve/Dockerfile.dev
@@ -17,7 +17,6 @@ FROM ${BASE_IMAGE} AS compile-image
ARG BASE_IMAGE
ARG BRANCH_NAME=master
ARG MACHINE_TYPE=cpu
-ARG BRANCH_NAME_KF=master
ENV PYTHONUNBUFFERED TRUE

RUN --mount=type=cache,id=apt-dev,target=/var/cache/apt \
@@ -46,7 +45,7 @@ RUN --mount=type=cache,id=apt-dev,target=/var/cache/apt \
&& rm -rf /var/lib/apt/lists/* \
&& cd /tmp \

-RUN update-alternatives --remove python /usr/bin/python \
+RUN update-alternatives --remove python /usr/bin/python \
&& update-alternatives --install /usr/bin/python python /usr/bin/python3.8 1

#ADD "https://www.random.org/cgi-bin/randbyte?nbytes=10&format=h" skipcache
@@ -62,7 +61,7 @@ RUN if [ "$MACHINE_TYPE" = "gpu" ]; then export USE_CUDA=1; fi \
&& git checkout ${BRANCH_NAME} \
&& if [ -z "$CUDA_VERSION" ]; then python ts_scripts/install_dependencies.py --environment=dev; else python ts_scripts/install_dependencies.py --environment=dev --cuda $CUDA_VERSION; fi \
&& python ts_scripts/install_from_src.py \
-&& python -m pip install captum transformers kserve \
+&& python -m pip install captum transformers kserve[storage]>=0.11.0 \
&& python -m pip install . \
&& useradd -m model-server \
&& mkdir -p /home/model-server/tmp \
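The dev image now pulls the extra in via pip (kserve[storage]>=0.11.0). A quick sketch for confirming the extra is actually present in an image or virtualenv, assuming Python 3.8+; importing kserve.storage is expected to fail with ModuleNotFoundError when the optional storage dependencies are missing:

    from importlib.metadata import version

    import kserve.storage  # should fail if the optional "storage" dependencies are absent

    print("kserve", version("kserve"))  # expected to report 0.11.0 or newer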
3 changes: 2 additions & 1 deletion kubernetes/kserve/kserve_wrapper/TorchserveModel.py
@@ -6,6 +6,7 @@
import kserve
from kserve.errors import ModelMissingError
from kserve.model import Model as Model
+from kserve.storage import Storage

logging.basicConfig(level=kserve.constants.KSERVE_LOGLEVEL)

@@ -53,7 +54,7 @@ def load(self) -> bool:
"""This method validates model availabilty in the model directory
and sets ready flag to true.
"""
-model_path = pathlib.Path(kserve.Storage.download(self.model_dir))
+model_path = pathlib.Path(Storage.download(self.model_dir))
paths = list(pathlib.Path(model_path).glob("*.mar"))
existing_paths = [path for path in paths if path.exists()]
if len(existing_paths) == 0:
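For reference, load() now resolves the model directory through kserve.storage.Storage and only treats the model as ready when at least one .mar archive is found. A standalone sketch of that check, using a hypothetical local model-store path in place of the wrapper's self.model_dir:

    import pathlib

    from kserve.storage import Storage

    def find_mar_files(model_dir_uri):
        """Resolve the model directory and return the .mar archives inside it."""
        model_path = pathlib.Path(Storage.download(model_dir_uri))
        return [p for p in model_path.glob("*.mar") if p.exists()]

    mars = find_mar_files("/mnt/models/model-store")  # hypothetical path
    if not mars:
        raise RuntimeError("no .mar file found; the model cannot be marked ready")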
2 changes: 1 addition & 1 deletion kubernetes/kserve/requirements.txt
@@ -1,3 +1,3 @@
-kserve>=0.9.0
+kserve[storage]>=0.11.0
transformers
captum
