State space models could help with very long sequences found in some scientific datasets.
Nvidia recently implemented Mamba2 in Megatron-LM. Could we use that?
Hacking in a naive Mamba2 example was fairly easy, and it has been run on Sunspot, but it was not memory- or compute-efficient: the efficient implementation depends on a causal conv1d kernel implemented in CUDA.
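For context, here is a minimal sketch of what the naive fallback amounts to, assuming PyTorch. The function name and shapes are illustrative, not the actual Megatron-LM or `causal-conv1d` code; the point is that the pure-PyTorch version materializes a padded tensor and runs a generic depthwise convolution, which is what the fused CUDA kernel avoids:

```python
import torch
import torch.nn.functional as F

def causal_conv1d_naive(x, weight, bias=None):
    """Naive causal depthwise conv1d (illustrative fallback).

    x: (batch, channels, seqlen); weight: (channels, kernel_width).
    Left-pads the sequence so each output position only sees current
    and past inputs. Correct, but neither memory- nor compute-efficient
    at long sequence lengths, unlike the fused CUDA kernel.
    """
    channels, width = weight.shape
    x = F.pad(x, (width - 1, 0))  # pad on the left only -> causal
    return F.conv1d(x, weight.unsqueeze(1), bias, groups=channels)

# Example usage with illustrative sizes
x = torch.randn(2, 64, 1024)  # (batch, channels, seqlen)
w = torch.randn(64, 4)        # depthwise kernel of width 4
y = causal_conv1d_naive(x, w)
print(y.shape)  # torch.Size([2, 64, 1024])
```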