-
You can try the following workflow: openllm build <model_name>. It should then show you how to containerize the LLM. I understand the appeal of using Alpine, but for an LLM workload it doesn't make sense, because you will end up having to rebuild a lot of kernels from source, which is time-consuming and not usually recommended.
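For reference, the end-to-end workflow looks roughly like this (a minimal sketch: the model name and bento tag are placeholders, and the exact containerize command comes from the instructions that openllm build prints):

pip install openllm
# Build a Bento for the model you want to serve; the command prints
# follow-up instructions, including how to containerize the result.
openllm build <model_name>
# Typically the next step is BentoML's containerize command, using the
# tag shown in the build output:
bentoml containerize <bento_tag>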
-
Hello! First time using OpenLLM. I am attempting to create a Docker image with OpenLLM in my local dev environment. I receive the following error message when attempting to install OpenLLM:
#0 92.76 ERROR: Cannot install transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.29.0, transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.29.1, transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.29.2, transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.30.0, transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.30.1 and transformers[accelerate,onnx,onnxruntime,tokenizers,torch]==4.30.2 because these package versions have conflicting dependencies.
#0 92.76
#0 92.76 The conflict is caused by:
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.30.2 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.30.1 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.30.0 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.29.2 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.29.1 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76     transformers[accelerate,onnx,onnxruntime,tokenizers,torch] 4.29.0 depends on onnxruntime>=1.4.0; extra == "onnx"
#0 92.76
#0 92.76 To fix this you could try to:
#0 92.76 1. loosen the range of package versions you've specified
#0 92.76 2. remove package versions to allow pip attempt to solve the dependency conflict
#0 92.76
#0 92.76 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
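One thing worth checking (an assumption about the likely cause, since the resolver output alone does not say so): on a musl-based image like Alpine, pip often cannot find prebuilt wheels for packages such as onnxruntime, which forces the long backtrack seen above. A quick way to test that inside the same base image:

# Ask pip to install only from binary wheels; if no wheel exists for
# this platform (common for onnxruntime on Alpine/musl), this fails
# immediately instead of backtracking through every transformers release.
pip install --only-binary=:all: "onnxruntime>=1.4.0"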
My Dockerfile is:
FROM python:3.11.3-alpine3.18
LABEL maintainer="[email protected]"
RUN apk add --update --no-cache bash bash-completion \
        libffi-dev tzdata git postgresql-client && \
    apk add --update --no-cache --virtual .tmp-build-deps \
        build-base postgresql-dev
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
RUN echo "alias ll='ls -alF'" >> $HOME/.bashrc && \
    echo "alias la='ls -A'" >> $HOME/.bashrc && \
    echo "alias l='ls -CF'" >> $HOME/.bashrc && \
    echo "alias q='exit'" >> $HOME/.bashrc && \
    echo "alias c='clear'" >> $HOME/.bashrc
COPY ./requirements.txt /tmp/requirements.txt
COPY ./requirements.dev.txt /tmp/requirements.dev.txt
COPY ./app /app/
ARG DEV=false
ENV VIRTUAL_ENV=/py
ENV PATH="${VIRTUAL_ENV}/bin:$PATH"
RUN python -m venv ${VIRTUAL_ENV} && \
    ${VIRTUAL_ENV}/bin/pip install --upgrade pip && \
    ${VIRTUAL_ENV}/bin/pip install -r /tmp/requirements.txt && \
    if [ "$DEV" = "true" ]; \
        then ${VIRTUAL_ENV}/bin/pip install -r /tmp/requirements.dev.txt ; \
    fi && \
    rm -rf /tmp && \
    apk del .tmp-build-deps && \
    adduser \
        -D \
        -H \
        flask-user
USER flask-user
CMD [ "/bin/bash" ]
Any ideas will be appreciated. Thanks.
Mathew
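Following the advice in the reply above, a minimal sketch of the same image on a glibc-based (Debian slim) base, where prebuilt manylinux wheels for heavy dependencies can be installed instead of compiling from source; the base tag, the useradd command, and the omission of the Postgres/dev tooling are assumptions for illustration, not something from this thread:

# Sketch: Debian-slim variant; prebuilt wheels (torch, onnxruntime, etc.)
# install directly, so no temporary build toolchain is needed for pip.
FROM python:3.11-slim
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY ./requirements.txt /tmp/requirements.txt
COPY ./app /app/
ENV VIRTUAL_ENV=/py
ENV PATH="${VIRTUAL_ENV}/bin:$PATH"
RUN python -m venv ${VIRTUAL_ENV} && \
    ${VIRTUAL_ENV}/bin/pip install --upgrade pip && \
    ${VIRTUAL_ENV}/bin/pip install -r /tmp/requirements.txt && \
    rm -rf /tmp && \
    useradd --no-create-home flask-user
USER flask-user
CMD [ "/bin/bash" ]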