Dependencies: Add pytorch-triton-rocm
Required for AMD installs.

Signed-off-by: kingbri <[email protected]>
bdashore3 committed Mar 28, 2024
1 parent 271f5ba commit d4280e1
Showing 1 changed file with 4 additions and 0 deletions.
pyproject.toml (4 additions, 0 deletions):

@@ -78,6 +78,10 @@ cu118 = [
     "flash_attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.2/flash_attn-2.5.2+cu118torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl ; platform_system == 'Linux' and platform_machine == 'x86_64' and python_version == '3.10'",
 ]
 amd = [
+    # Torch triton for ROCm
+    "pytorch_triton_rocm @ https://download.pytorch.org/whl/pytorch_triton_rocm-2.2.0-cp311-cp311-linux_x86_64.whl ; python_version == '3.11'",
+    "pytorch_triton_rocm @ https://download.pytorch.org/whl/pytorch_triton_rocm-2.2.0-cp310-cp310-linux_x86_64.whl ; python_version == '3.10'",
+
     # Torch
     "torch @ https://download.pytorch.org/whl/rocm5.6/torch-2.2.1%2Brocm5.6-cp311-cp311-linux_x86_64.whl ; python_version == '3.11'",
     "torch @ https://download.pytorch.org/whl/rocm5.6/torch-2.2.1%2Brocm5.6-cp310-cp310-linux_x86_64.whl ; python_version == '3.10'",
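Each added dependency is a PEP 508 direct-URL requirement with an environment marker (the part after the `;`), so pip selects only the wheel matching the running interpreter. A minimal sketch of how such a requirement is parsed and its marker evaluated, using the third-party `packaging` library (the parsing backend pip itself vendors); the explicit environment dict here is a hypothetical override for illustration, not something the install process requires:

```python
# Parse one of the pinned ROCm wheels from the diff above as a
# PEP 508 requirement: "name @ url ; marker".
from packaging.requirements import Requirement

req = Requirement(
    "pytorch_triton_rocm @ "
    "https://download.pytorch.org/whl/"
    "pytorch_triton_rocm-2.2.0-cp310-cp310-linux_x86_64.whl"
    " ; python_version == '3.10'"
)

print(req.name)  # pytorch_triton_rocm
print(req.url)   # the download.pytorch.org wheel URL

# Evaluate the marker against an explicit environment instead of the
# current interpreter, so the result is deterministic: the cp310 wheel
# is selected only on Python 3.10.
print(req.marker.evaluate({"python_version": "3.10"}))  # True
print(req.marker.evaluate({"python_version": "3.11"}))  # False
```

At install time, pip evaluates these markers against the actual interpreter, which is why both the cp310 and cp311 entries can coexist in the same extra without conflicting.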