==============================================
Support for Learning Rate Schedulers in skorch
==============================================

skorch, a library for training PyTorch models with a scikit-learn compatible interface, integrates seamlessly with PyTorch learning rate schedulers. Learning rate schedulers adapt the learning rate during training, which can lead to faster convergence and better model performance. In this section, we explore how to use learning rate schedulers with skorch to fine-tune your neural network training process.

What is a Learning Rate Scheduler?
----------------------------------

A learning rate scheduler dynamically adjusts the learning rate during training. It can be a crucial component of your training pipeline, enabling you to control the step size used to update the model's weights as training progresses.
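To make this concrete, here is a small, plain-PyTorch sketch (independent of skorch) showing how torch.optim.lr_scheduler.StepLR halves the learning rate every 10 epochs:

.. code:: python

    import torch
    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    # A single dummy parameter is enough to illustrate the schedule.
    param = torch.nn.Parameter(torch.zeros(1))
    optimizer = SGD([param], lr=0.1)
    scheduler = StepLR(optimizer, step_size=10, gamma=0.5)

    for epoch in range(30):
        # ... one epoch of training would go here ...
        optimizer.step()
        scheduler.step()

    # After 30 epochs the learning rate has been halved three times: 0.1 -> 0.0125
    print(optimizer.param_groups[0]['lr'])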
Using Learning Rate Schedulers in skorch
----------------------------------------

skorch allows you to integrate PyTorch learning rate schedulers seamlessly into your training process. Here's a step-by-step guide on how to use them:
1. Create Your Neural Network Model

Before you can use a learning rate scheduler, you need to define your neural network model using PyTorch. For example:
.. code:: python

    import torch
    import torch.nn as nn

    class YourModel(nn.Module):
        def __init__(self):
            super().__init__()
            # A simple feed-forward classifier: 20 input features, 2 output
            # classes, matching the toy dataset used in step 3 below.
            self.dense = nn.Linear(20, 10)
            self.output = nn.Linear(10, 2)

        def forward(self, X):
            X = torch.relu(self.dense(X))
            return self.output(X)
2. Create Your skorch NeuralNet

Now, create a skorch NeuralNet that wraps your PyTorch model. Specify the optimizer in the NeuralNet constructor and attach the learning rate scheduler through the LRScheduler callback. Below is an example using the StepLR learning rate scheduler:
.. code:: python

    from skorch import NeuralNet
    from skorch.callbacks import LRScheduler

    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    net = NeuralNet(
        YourModel,
        criterion=nn.CrossEntropyLoss,
        optimizer=SGD,
        optimizer__lr=0.01,
        optimizer__momentum=0.9,
        # train for more than step_size epochs so the decay takes effect
        max_epochs=20,
        iterator_train__shuffle=True,
        callbacks=[
            ('scheduler', LRScheduler(StepLR, step_size=10, gamma=0.5)),
        ],
    )
In the example above, we set the optimizer to Stochastic Gradient Descent (SGD) and attach a StepLR learning rate scheduler with a step size of 10 and a decay factor of 0.5. You can customize the scheduler parameters to suit your needs. As with any skorch callback, the parameters of the LRScheduler can also be optimized via hyper-parameter search, for example to find the best step_size or gamma for your problem.
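Below is a sketch of how such a search could look; it is an illustration rather than part of the example above. The parameter names are routed to the LRScheduler through the name 'scheduler' it was registered under, and because the plain NeuralNet used here defines no default score method, a small custom scorer is supplied (with NeuralNetClassifier you could simply pass scoring='accuracy'):

.. code:: python

    from sklearn.metrics import accuracy_score, make_scorer
    from sklearn.model_selection import GridSearchCV

    def logits_accuracy(y_true, logits):
        # NeuralNet.predict returns the raw module output (logits) here,
        # so take the argmax to recover class labels before scoring.
        return accuracy_score(y_true, logits.argmax(axis=1))

    # 'callbacks__scheduler__<param>' addresses the LRScheduler registered
    # under the name 'scheduler' in the callbacks list above.
    param_grid = {
        'callbacks__scheduler__step_size': [5, 10, 20],
        'callbacks__scheduler__gamma': [0.1, 0.5],
    }
    search = GridSearchCV(net, param_grid, cv=3, scoring=make_scorer(logits_accuracy))
    # search.fit(X_train, y_train)  # with training data such as that in step 3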
3. Train Your Model

With your skorch NeuralNet defined and the learning rate scheduler attached, you can train your model just as you would any scikit-learn estimator. A small synthetic dataset is enough to make the example runnable:
.. code:: python

    import numpy as np
    from sklearn.datasets import make_classification

    # A toy binary classification problem with 20 features, matching the model above.
    X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
    X_train = X_train.astype(np.float32)
    y_train = y_train.astype(np.int64)

    net.fit(X_train, y_train)
The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule.
4. Monitor Training Progress

skorch records the progress of training in net.history and prints it epoch by epoch during fitting, so you can observe how the learning rate schedule affects the training and validation losses. The current learning rate can be read from the optimizer at any time, and, depending on your skorch version, the LRScheduler callback additionally logs the learning rate to the history under its event_name (by default 'event_lr') each time it steps.
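For instance, after fitting you could inspect the learning rate like this (a sketch; the 'event_lr' history entry assumes the default event_name of the LRScheduler callback in recent skorch releases):

.. code:: python

    # Current learning rate as seen by the (fitted) optimizer:
    print(net.optimizer_.param_groups[0]['lr'])

    # Learning rate recorded in the history at every scheduler step
    # (one entry per epoch for the StepLR example above):
    print(net.history[:, 'event_lr'])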
Learning rate schedulers are a valuable tool for fine-tuning neural network training, and skorch makes it easy to integrate them into your training pipeline. Experiment with different schedulers and parameters to find the strategy that works best for your task.