So I had some limited access to some Tesla A10s - they are of course sm_86 and not supported by the PyTorch build shipped in the stock image...
But if you want to make them work, just add compatibility by installing a PyTorch build with sm_86 (CUDA 11.8) support and building your own Docker image locally.
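If you want to confirm what your card reports first, here is a minimal check (assumes PyTorch with CUDA is already installed wherever you run it; an A10 should print (8, 6)):
python3 -c "import torch; print(torch.cuda.get_device_capability(0))"  # prints (8, 6) on an A10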
Dockerfile
FROM devforth/gpt-j-6b-gpu
# Swap in a PyTorch build compiled for CUDA 11.8, which includes sm_86 (A10) kernels
RUN pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118 -U
CMD uvicorn web:app --port 8080 --host 0.0.0.0
Then run (note that Docker image names must be lowercase):
docker build -t custom-image-name .
and then run:
docker run -p8080:8080 --gpus all --rm -it custom-image-name
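To sanity-check the rebuilt image, you can override the container's CMD with a one-off check (a minimal sketch - custom-image-name is the tag from the build step above; on an A10 it should print True and (8, 6)):
docker run --gpus all --rm -it custom-image-name python3 -c "import torch; print(torch.cuda.is_available(), torch.cuda.get_device_capability(0))"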
Credit to this project and https://pytorch.org/get-started/locally/