Version: latest stable

Currently, the version constraint for flash-attn is set in TransformerEngine/setup.py (line 269 in b8eea8a), so most likely v2.4.2 is going to be installed. However, this version seems to have some issues when imported, complaining about missing symbols. I uninstalled it and manually installed the latest, v2.5.5, and the import succeeded.

The question is: why do we set this upper bound for flash-attn?
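For context, I mean a setuptools requirement along these lines; the exact bounds below are my own illustration, not copied from TransformerEngine's setup.py:

```python
# Illustrative only: how an upper-bounded flash-attn requirement in a
# setuptools-based setup.py pins the version that pip resolves.
# The bounds below are assumptions, not the real TransformerEngine spec.
import setuptools

setuptools.setup(
    name="example_package",  # placeholder name for this sketch
    install_requires=[
        # pip installs the newest release satisfying both bounds,
        # e.g. 2.4.2 if that is the highest version allowed here.
        "flash-attn>=2.0.6,<=2.4.2",
    ],
)
```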
Flash Attention is being rapidly developed and its API is somewhat unstable. We've found it safer to bump the version constraint only after validating that Flash Attention works as expected. We are open to improving our workflows, though.
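As a sketch of what that validation guards against (not our actual code; the version bounds here are placeholders), an import-time check might look like:

```python
# Sketch only: reject flash-attn versions outside a validated range at import
# time. The bounds and message are assumptions for illustration.
from importlib.metadata import version, PackageNotFoundError
from packaging.version import Version

_MIN_SUPPORTED = Version("2.0.6")  # assumed lower bound
_MAX_SUPPORTED = Version("2.4.2")  # assumed upper bound

try:
    _installed = Version(version("flash-attn"))
except PackageNotFoundError:
    _installed = None  # flash-attn not installed; skip the check

if _installed is not None and not (_MIN_SUPPORTED <= _installed <= _MAX_SUPPORTED):
    raise RuntimeError(
        f"flash-attn {_installed} is outside the validated range "
        f"[{_MIN_SUPPORTED}, {_MAX_SUPPORTED}]; newer releases may change the API."
    )
```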
Is it possible to look into raising the maximum to flash-attn v2.6.3? That version compiles much faster for me than the current maximum, v2.5.8.
Edit: Never mind, I realized I was looking at the stable branch; main is already updated to 2.6.3.