What did you find confusing? Please describe.
This is the Dockerfile link:
https://github.com/aws/sagemaker-pytorch-inference-toolkit/blob/master/docker/1.5.0/py3/Dockerfile.cpu
This link https://docs.aws.amazon.com/sagemaker/latest/dg/ei-endpoints.html#ei-endpoints-pytorch
states:
You can download the Elastic Inference enabled binary for PyTorch from the public Amazon S3 bucket at console.aws.amazon.com/s3/buckets/amazonei-pytorch. For information about building a container that uses the Elastic Inference enabled version of PyTorch, see Building your image.
I am confused. If I use the Dockerfile above, do I still need to download and install the binary from https://console.aws.amazon.com/s3/buckets/amazonei-pytorch to build the Docker container image?
If I want to use a custom Docker image for SageMaker Elastic Inference, do I need to convert my PyTorch code to TorchScript? This part is not covered in the documentation.
Can I use it with Python >= 3.7 and PyTorch >= 1.12?
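For context on the TorchScript question above, converting an eager-mode PyTorch model to TorchScript before packaging it in a container typically looks like the sketch below. The model, input shape, and file name here are illustrative assumptions, not taken from the toolkit or the AWS documentation:

```python
# Hypothetical sketch: tracing a PyTorch model into TorchScript so the
# serialized artifact can be loaded inside an inference container without
# the original Python model class. TinyNet is a placeholder model.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet().eval()
example_input = torch.zeros(1, 4)

# torch.jit.trace records the operations executed on the example input
# and returns a ScriptModule that runs independently of this Python code.
traced = torch.jit.trace(model, example_input)
traced.save("model.pt")  # serialized artifact to place in the image
```

Whether a traced/scripted model is strictly required for Elastic Inference (as opposed to an eager-mode model) is exactly the point the documentation should clarify.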