diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index fbfbdf2dbe3d..79ef3cbd4080 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -85,6 +85,10 @@ using either VS Code or a local container:
    pip install torch_xla[tpu] \
      -f https://storage.googleapis.com/libtpu-wheels/index.html \
      -f https://storage.googleapis.com/libtpu-releases/index.html
+   # Optional: if you're using custom kernels, install pallas dependencies
+   pip install torch_xla[pallas] \
+     -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+     -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
    ```

 * If you are running on a TPU VM, ensure `torch` and `torch_xla` were built and
diff --git a/README.md b/README.md
index 279b3152b804..08710641f492 100644
--- a/README.md
+++ b/README.md
@@ -25,15 +25,29 @@ started:

 To install PyTorch/XLA stable build in a new TPU VM:

-```
-pip install torch~=2.5.0 torch_xla[tpu]~=2.5.0 -f https://storage.googleapis.com/libtpu-releases/index.html -f https://storage.googleapis.com/libtpu-wheels/index.html
+```sh
+pip install torch~=2.5.0 'torch_xla[tpu]~=2.5.0' \
+  -f https://storage.googleapis.com/libtpu-releases/index.html \
+  -f https://storage.googleapis.com/libtpu-wheels/index.html
+
+# Optional: if you're using custom kernels, install pallas dependencies
+pip install 'torch_xla[pallas]' \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
 ```

 To install PyTorch/XLA nightly build in a new TPU VM:

-```
-pip3 install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
-pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html -f https://storage.googleapis.com/libtpu-wheels/index.html
+```sh
+pip install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
+pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev-cp310-cp310-linux_x86_64.whl' \
+  -f https://storage.googleapis.com/libtpu-releases/index.html \
+  -f https://storage.googleapis.com/libtpu-wheels/index.html
+
+# Optional: if you're using custom kernels, install pallas dependencies
+pip install 'torch_xla[pallas]' \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
 ```

 ### GPU Plugin
diff --git a/scripts/build_developer.sh b/scripts/build_developer.sh
index 680c4a3e8f74..c7d946cd51bc 100755
--- a/scripts/build_developer.sh
+++ b/scripts/build_developer.sh
@@ -7,6 +7,12 @@ set -x # Display commands being run.
 cd "$(dirname "$(readlink -f "$0")")"
 cd ../../

+# First remove any left over old wheels
+# and old installation
+pip uninstall torch -y
+python3 setup.py clean
+
+# Install pytorch
 python3 setup.py bdist_wheel
 python3 setup.py install
 cd ..
@@ -17,8 +23,10 @@ if [ -d "vision" ]; then
   python3 setup.py develop
 fi

+# Install torch_xla
 cd ..
 cd pytorch/xla
+pip uninstall torch_xla -y
 python3 setup.py develop

 # libtpu is needed to talk to the TPUs. If TPUs are not present,
@@ -27,6 +35,10 @@ pip install torch_xla[tpu] \
   -f https://storage.googleapis.com/libtpu-wheels/index.html \
   -f https://storage.googleapis.com/libtpu-releases/index.html

+# Install Pallas dependencies
+pip install torch_xla[pallas] \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
+
 # Test that the library is installed correctly.
 python3 -c 'import torch_xla as xla; print(xla.device())'
-
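The `~=2.5.0` specifier used in the README commands above is a "compatible release" constraint: it pins the 2.5 series while allowing patch upgrades. A minimal sketch of its semantics, using the `packaging` library and hypothetical version numbers:

```python
from packaging.specifiers import SpecifierSet

# "~=2.5.0" is equivalent to ">=2.5.0, ==2.5.*"
spec = SpecifierSet("~=2.5.0")

print("2.5.1" in spec)  # True: patch releases stay within the constraint
print("2.6.0" in spec)  # False: the next minor series is excluded
```

This is why pinning both `torch` and `torch_xla` with the same `~=` constraint keeps the two packages on matching release series.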