🐛 [Bug] Softmax in nn.MultiheadAttention layer not fused with torch.compile backend #2127