
Add New Optimizer #32225

Open
SoheilFM opened this issue Jul 25, 2024 · 6 comments
Labels: Feature request (Request for a new feature)

Comments


SoheilFM commented Jul 25, 2024

Feature request

I want to add a new optimizer that I have been working on.
Is there any documentation on how to add a new optimizer across the relevant files of the transformers library?

Motivation

This optimizer would be a great addition to transformers and could reduce the time needed to train a new model.

Your contribution

I can clone this repo, change the files, and submit a PR. But I can't find documentation on how to integrate my algorithm into the transformers codebase, because there are many classes and a lot of files that would need changes.

SoheilFM added the Feature request label on Jul 25, 2024
@deven367

Hi @SoheilFM, you can follow the link here → https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md

@SoheilFM (Author)

> Hi @SoheilFM, you can follow the link here → https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md

I read that, but I couldn't find any information about how optimizers are implemented.

qubvel (Member) commented Jul 26, 2024

Hi @SoheilFM, thanks for taking the initiative to add a new optimizer!

It would be great to have a description, benchmarks, and links for your optimizer, so we can better understand the motivation for adding it. Is there a published paper with benchmarks? Is there a working implementation available as a pip package?

Later on, if the feature is approved, you could take a look at recent PRs that added optimizers (in case your optimizer already has a pip package):

SoheilFM (Author) commented Jul 26, 2024

> Hi @SoheilFM, thanks for taking the initiative to add a new optimizer!
>
> It would be great to have a description, benchmarks, and links for your optimizer, so we can better understand the motivation for adding it. Is there a published paper with benchmarks? Is there a working implementation available as a pip package?
>
> Later on, if the feature is approved, you could take a look at recent PRs that added optimizers (in case your optimizer already has a pip package):

Hello @qubvel,

Thank you for your reply.

We are currently working on the optimizer described in our research paper, but unfortunately I cannot share it publicly at this time. The general concept is similar to the Lion optimizer that was added to transformers.

The algorithm is written in pure Python and has not been published on PyPI or any other package manager. It has been tested on standard benchmark functions such as Sphere, Rastrigin, Rosenbrock, and Ackley, showing promising results.
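
For reference, here is a minimal sketch of the standard textbook definitions of those benchmark functions; the NumPy helper functions below are illustrative only and are not taken from this thread or from the optimizer in question.

# Standard d-dimensional test functions; each has a known global minimum.
import numpy as np

def sphere(x):
    # unimodal; minimum 0 at x = 0
    return np.sum(x ** 2)

def rastrigin(x):
    # highly multimodal; minimum 0 at x = 0
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def rosenbrock(x):
    # narrow curved valley; minimum 0 at x = 1
    return np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)

def ackley(x):
    # multimodal; minimum 0 at x = 0
    return (-20 * np.exp(-0.2 * np.sqrt(np.mean(x ** 2)))
            - np.exp(np.mean(np.cos(2 * np.pi * x))) + 20 + np.e)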

Our next step is to implement this algorithm in the transformers library, train some models, and run benchmarks to determine its effectiveness as an optimizer.

I have read the two PRs you mentioned, but they require numerous file changes and include functions and parameters that I do not fully understand. Different developers have different approaches, possibly based on their use cases.

Could you please let me know if there are any specific issues or documentation that would help us implement a new optimizer in the transformers library, or should we rely on our own knowledge and work our way through the various modules and code?

Best regards,

SoheilFM (Author) commented Aug 5, 2024

Any news?

qubvel (Member) commented Aug 5, 2024

Hi @SoheilFM,

If you just need to run experiments with your custom optimizer, you don't need to integrate it into the transformers library. Trainer has an option to accept an external optimizer and LR scheduler for training, see here:

optimizer = MyOptimizer(model.parameters(), ...)

trainer = Trainer(
   ...
   optimizers=(optimizer, scheduler),
   ...
)
trainer.train()
...

Let me know if that's something you are looking for.
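
As a minimal, self-contained sketch of that approach (the toy MySignSGD class is a placeholder for the custom optimizer, and model and train_dataset are assumed to have been created elsewhere; only the optimizers=(optimizer, scheduler) argument of Trainer is the actual API being illustrated):

# Hypothetical sketch: plugging a custom optimizer into Trainer.
# "MySignSGD" stands in for the user's own optimizer; model and
# train_dataset are assumed to exist already (e.g. from
# AutoModelForSequenceClassification and a tokenized dataset).
import torch
from torch.optim import Optimizer
from torch.optim.lr_scheduler import LambdaLR
from transformers import Trainer, TrainingArguments

class MySignSGD(Optimizer):
    """Toy sign-SGD optimizer, used only to illustrate the plumbing."""
    def __init__(self, params, lr=1e-3):
        super().__init__(params, defaults={"lr": lr})

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    # update each parameter by the sign of its gradient
                    p.add_(p.grad.sign(), alpha=-group["lr"])
        return loss

optimizer = MySignSGD(model.parameters(), lr=1e-4)
scheduler = LambdaLR(optimizer, lr_lambda=lambda step: 1.0)  # constant LR

args = TrainingArguments(output_dir="out", num_train_epochs=1)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    optimizers=(optimizer, scheduler),  # external optimizer + scheduler
)
trainer.train()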
