update to rocm 6.3 wheels #1192
base: main
Conversation
@tenpercent I think you have much more context on a PR like this than me. If you say you want it merged, I can merge it! But I can't really review it.
The Windows build is failing, and the torch version should be the same between the build and upload flows.
looks good, let's wait for CI and merge
@tenpercent @johnnynunez Not sure if you've seen the errors on CI? They show up like this on 6.1 and 6.2.
@johnnynunez Also, it looks like the rocm6.3 build hasn't actually built the flash attention extension.
@johnnynunez I think the root cause is that there is no torch wheel uploaded for rocm6.3 on download.pytorch.org. It will appear there in some time. In the meantime, I think we can build at least for rocm6.2.
I asked the PyTorch team, and I can confirm: rocm6.3 support is coming with pytorch 2.7.
It's coming with pytorch 2.7
I think right now we need the torch wheel built for rocm6.3 as a dependency to build the xformers wheel for rocm6.3, even though the cpp extension for ROCm doesn't use any API specific to rocm6.3 (see lines 73 to 75 in 536363e).
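The dependency chain described above can be sketched as follows. This is a hedged illustration only: it assumes PyTorch's per-backend wheel index layout (`https://download.pytorch.org/whl/<backend>`), and the helper `torch_index_url` is hypothetical, not part of either project.

```python
def torch_index_url(rocm_version: str) -> str:
    """Build the PyTorch wheel index URL for a given ROCm version.

    PyTorch publishes per-backend wheel indexes under
    https://download.pytorch.org/whl/<backend>; for ROCm the backend
    tag is "rocm" plus the major.minor version (e.g. "rocm6.2").
    """
    return f"https://download.pytorch.org/whl/rocm{rocm_version}"

# Until a torch wheel is published at the rocm6.3 index, an xformers
# wheel build targeting rocm6.3 cannot resolve its torch dependency.
print(torch_index_url("6.2"))  # https://download.pytorch.org/whl/rocm6.2
print(torch_index_url("6.3"))  # https://download.pytorch.org/whl/rocm6.3
```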
Maybe maintain this philosophy. Why did I do this? In my case, ROCm gets better with every version, supporting new features and producing fewer build errors. For example, building flash attention for my W7900 dual slot, I went from 364 compilation errors with rocm6.1, to 20 with rocm6.2, to 7 with rocm6.3. The relevant line: `PY_ROCM = [(PY_VERSIONS[-1], ROCM_VERSIONS[-2])]  # Always last version is upcoming version`
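The philosophy behind the quoted line can be sketched like this. The names `PY_VERSIONS`, `ROCM_VERSIONS`, and `PY_ROCM` come from the snippet in the comment; the concrete version values here are illustrative assumptions, not the repository's actual lists.

```python
PY_VERSIONS = ["3.9", "3.10", "3.11", "3.12"]  # illustrative values
ROCM_VERSIONS = ["6.1", "6.2", "6.3"]          # last entry = upcoming release

# The quoted line pairs the newest Python with the latest *stable* ROCm
# (the second-to-last entry), keeping the upcoming release out of the
# default wheel matrix until its torch wheel is published.
PY_ROCM = [(PY_VERSIONS[-1], ROCM_VERSIONS[-2])]
print(PY_ROCM)  # [('3.12', '6.2')]
```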
ROCm 6.3 also brings better performance: https://community.amd.com/t5/ai/unlocking-new-horizons-in-ai-and-hpc-with-the-release-of-amd/ba-p/726434