Adds ASTLayer support for BetterTransformer #548
base: main
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
Hi @younesbelkada !
Please let me know what you think about these changes and let me know what else I need to do on this topic. Thank you very much!
Hi @ravenouse !
Hi @younesbelkada ! Happy New Year! I have just rebased this branch onto the current optimum main branch and re-ran the tests. Please let me know what else I can do! Thank you so much!
I did rebase some time ago. Could you spare some time to review this? |
What does this PR do?
Adds ASTLayer support for BetterTransformer
Fixes Community contribution - BetterTransformer integration for more models! #20372
Questions:
"MIT/ast-finetuned-audioset-10-10-0.4593
as the test model to runpytest
but some tests failed. I think one test model, like"hf-internal-testing/tiny-random-MBartModel"
, is needed to test theASTLayerBetterTransformer
.BetterTransformerBaseLayer
class. I notice that we setself. use_gelu
as false for the default setting but lots of the supported transformer models actually use the gelu activation function, likebert
. Could you provide more information about it?Thank you so much for your effort!!
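To make the use_gelu question concrete, here is a minimal toy sketch (class and attribute names are hypothetical, not optimum's actual implementation) of what I mean by deriving the flag from the model config instead of defaulting it to False:

```python
class ToyBetterTransformerBaseLayer:
    """Toy stand-in for a BetterTransformer base layer (names hypothetical)."""

    def __init__(self, config):
        # Instead of hard-coding self.use_gelu = False, derive it from the
        # model config's activation function. BERT-style models (and AST)
        # report "gelu" as their hidden activation.
        act_fn = getattr(config, "hidden_act", "relu")
        self.use_gelu = act_fn in ("gelu", "gelu_new")


class ToyBertLikeConfig:
    # Mimics the relevant field of a transformers config for a GeLU model.
    hidden_act = "gelu"


layer = ToyBetterTransformerBaseLayer(ToyBertLikeConfig())
print(layer.use_gelu)  # True for a GeLU-based model
```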
Please let me know what else I need to do!