Improve build_developer.sh (#8579)
tengyifei authored Jan 23, 2025
1 parent 557d9f3 commit d382fee
Showing 3 changed files with 36 additions and 6 deletions.
4 changes: 4 additions & 0 deletions CONTRIBUTING.md
@@ -85,6 +85,10 @@ using either VS Code or a local container:
 pip install torch_xla[tpu] \
   -f https://storage.googleapis.com/libtpu-wheels/index.html \
   -f https://storage.googleapis.com/libtpu-releases/index.html
+# Optional: if you're using custom kernels, install pallas dependencies
+pip install torch_xla[pallas] \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
 ```

 * If you are running on a TPU VM, ensure `torch` and `torch_xla` were built and
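Note: the optional `torch_xla[pallas]` extra resolves `jax`/`jaxlib` from the nightly find-links above. A minimal sanity check after installing it (a sketch, not part of this diff):

```sh
python3 -c "import jax; print(jax.__version__)"
```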
24 changes: 19 additions & 5 deletions README.md
@@ -25,15 +25,29 @@ started:

 To install PyTorch/XLA stable build in a new TPU VM:

-```
-pip install torch~=2.5.0 torch_xla[tpu]~=2.5.0 -f https://storage.googleapis.com/libtpu-releases/index.html -f https://storage.googleapis.com/libtpu-wheels/index.html
+```sh
+pip install torch~=2.5.0 'torch_xla[tpu]~=2.5.0' \
+  -f https://storage.googleapis.com/libtpu-releases/index.html \
+  -f https://storage.googleapis.com/libtpu-wheels/index.html
+
+# Optional: if you're using custom kernels, install pallas dependencies
+pip install 'torch_xla[pallas]' \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
 ```

 To install PyTorch/XLA nightly build in a new TPU VM:

-```
-pip3 install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
-pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html -f https://storage.googleapis.com/libtpu-wheels/index.html
+```sh
+pip install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
+pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev-cp310-cp310-linux_x86_64.whl' \
+  -f https://storage.googleapis.com/libtpu-releases/index.html \
+  -f https://storage.googleapis.com/libtpu-wheels/index.html
+
+# Optional: if you're using custom kernels, install pallas dependencies
+pip install 'torch_xla[pallas]' \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
 ```

 ### GPU Plugin
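After either install path, the same smoke test that `scripts/build_developer.sh` runs at the end (see below) confirms the wheel works:

```sh
python3 -c 'import torch_xla as xla; print(xla.device())'
```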
14 changes: 13 additions & 1 deletion scripts/build_developer.sh
@@ -7,6 +7,12 @@ set -x # Display commands being run.
cd "$(dirname "$(readlink -f "$0")")"
cd ../../

# First remove any left over old wheels
# and old installation
pip uninstall torch -y
python3 setup.py clean

# Install pytorch
python3 setup.py bdist_wheel
python3 setup.py install
cd ..
@@ -17,8 +23,10 @@ if [ -d "vision" ]; then
   python3 setup.py develop
 fi

+# Install torch_xla
 cd ..
 cd pytorch/xla
+pip uninstall torch_xla -y
 python3 setup.py develop

 # libtpu is needed to talk to the TPUs. If TPUs are not present,
@@ -27,6 +35,10 @@ pip install torch_xla[tpu] \
   -f https://storage.googleapis.com/libtpu-wheels/index.html \
   -f https://storage.googleapis.com/libtpu-releases/index.html

+# Install Pallas dependencies
+pip install torch_xla[pallas] \
+  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
+  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html

 # Test that the library is installed correctly.
 python3 -c 'import torch_xla as xla; print(xla.device())'
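Judging from the script's `cd` commands, it expects to live at `pytorch/xla/scripts/build_developer.sh`, with the `pytorch` checkout two levels up and an optional `vision` checkout next to `pytorch`. A hypothetical invocation (the `~/ws` workspace path is illustrative, not from this diff):

```sh
# Assumed layout: ~/ws/pytorch, ~/ws/pytorch/xla, and optionally ~/ws/vision
cd ~/ws/pytorch/xla
scripts/build_developer.sh  # rebuilds torch, installs torch_xla in develop mode, then runs the device check
```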
