Fixes #275 #1025
base: master
Conversation
Thanks for working on this long-standing issue. I don't have time this week; I will give it a review next week.
Thank you very much for working on this long-standing issue. Overall, this already looks quite good. I have suggested some small changes here and there, please take a look.

Apart from those, I have some more general points:

- There is no entry in docs/index.rst, which means that users cannot find the page. Could you please add one?
- We generally don't capitalize "skorch".
- The LRScheduler class in skorch also has a handy method called simulate, which allows you to simulate the learning rate changes. It would be nice to document that too, but it's not a must-have.

Btw., it is not difficult to build the documentation locally and check it yourself (install the doc requirements and follow the instructions in the docs folder).
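For reference, a minimal sketch of what documenting simulate could look like, assuming the simulate(steps, initial_lr) signature from recent skorch versions:

    from skorch.callbacks import LRScheduler
    from torch.optim.lr_scheduler import StepLR

    # Preview how StepLR would change the learning rate over 20 epochs,
    # without running any training; simulate returns the learning rate
    # used at each step.
    scheduler = LRScheduler(policy=StepLR, step_size=10, gamma=0.5)
    lrs = scheduler.simulate(steps=20, initial_lr=0.1)
    print(lrs)  # 0.1 for the first 10 steps, then 0.05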
from torch.optim.lr_scheduler import StepLR

net = NeuralNet(
    YourModel,
Suggested change:
-     YourModel,
+     MyModule,
Also, how about setting max_epochs=20. This way, the effect of the LR scheduler can be more clearly seen (since it steps at epoch 10).
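Put together, the configuration being described might look roughly like this (a sketch; the learning rate, criterion, and optimizer choices are illustrative, not part of the suggestion):

    import torch
    from skorch import NeuralNet
    from skorch.callbacks import LRScheduler
    from torch.optim.lr_scheduler import StepLR

    net = NeuralNet(
        MyModule,
        # the module returns raw logits, so use CrossEntropyLoss
        criterion=torch.nn.CrossEntropyLoss,
        max_epochs=20,  # long enough to see the LR step at epoch 10
        lr=0.1,
        optimizer=torch.optim.SGD,
        callbacks=[
            # halve the learning rate every 10 epochs
            LRScheduler(policy=StepLR, step_size=10, gamma=0.5),
        ],
    )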
A learning rate scheduler dynamically adjusts the learning rate during training. It can be a crucial component of your training pipeline, enabling you to control the step size for updating the model's weights as the training progresses.

Using Learning Rate Schedulers in Skorch
Should this be a section header?
=======
Support For Learning Rate Schedulers in Skorch
=======
Skorch, a powerful library for training PyTorch models, offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
Suggested change:
- Skorch, a powerful library for training PyTorch models, offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
+ Skorch offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
This is very flattering but not necessary ;)
.. code:: python

net.fit(X_train, y_train)
If we add a tiny amount of code, the example you show can actually be run:

import numpy as np
from sklearn.datasets import make_classification

X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
X_train = X_train.astype(np.float32)
y_train = y_train.astype(np.int64)
class YourModel(nn.Module):
    def __init__(self):
        super(YourModel, self).__init__()
        # Define your layers here
Suggested change:
- class YourModel(nn.Module):
-     def __init__(self):
-         super(YourModel, self).__init__()
-         # Define your layers here
+ class MyModule(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.lin = torch.nn.Linear(20, 2)
+
+     def forward(self, x):
+         return self.lin(x)
Let's use a working example; it's not much longer. Also, the naming is more consistent with the other skorch examples.
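Assembled from the snippets in this thread, a complete runnable script would look roughly like this (a sketch; the NeuralNetClassifier variant and the explicit CrossEntropyLoss criterion are choices made here so the logits-only module trains correctly):

    import numpy as np
    import torch
    from torch import nn
    from sklearn.datasets import make_classification
    from skorch import NeuralNetClassifier
    from skorch.callbacks import LRScheduler
    from torch.optim.lr_scheduler import StepLR

    # Synthetic data, cast to the dtypes PyTorch expects.
    X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
    X_train = X_train.astype(np.float32)
    y_train = y_train.astype(np.int64)

    class MyModule(nn.Module):
        def __init__(self):
            super().__init__()
            self.lin = nn.Linear(20, 2)

        def forward(self, x):
            return self.lin(x)

    net = NeuralNetClassifier(
        MyModule,
        criterion=nn.CrossEntropyLoss,  # module outputs raw logits
        max_epochs=20,
        lr=0.1,
        optimizer=torch.optim.SGD,
        callbacks=[LRScheduler(policy=StepLR, step_size=10, gamma=0.5)],
    )
    net.fit(X_train, y_train)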
The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule.

4. Monitor Training Progress
During training, Skorch will automatically keep you informed about the learning rate changes, allowing you to monitor the effect of the learning rate scheduler on your model's performance.
I think this needs to be extended a bit, as it doesn't explain how that works.
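For instance, the extended text could point out that the LRScheduler callback records the current learning rate in the net's history under its event_name parameter (which defaults to 'event_lr', assuming recent skorch versions), so it can be read back after training:

    # After net.fit(...), read the learning rate active in each epoch;
    # 'event_lr' is LRScheduler's default event_name.
    lrs = net.history[:, 'event_lr']
    print(lrs)  # with StepLR(step_size=10, gamma=0.5): 0.1 x10, then 0.05 x10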
# Define your layers here

2. Create Your Skorch NeuralNet
Suggested change:
- 2. Create Your Skorch NeuralNet
+ 2. Create Your Skorch NeuralNet
Using Learning Rate Schedulers in Skorch
Skorch allows you to integrate PyTorch learning rate schedulers seamlessly into your training process. Here's a step-by-step guide on how to use them:

1. Create Your Neural Network Model
Suggested change:
- 1. Create Your Neural Network Model
+ 1. Create Your Neural Network Model
In the example above, we set the optimizer to Stochastic Gradient Descent (SGD) and attach a StepLR learning rate scheduler with a step size of 10 and a decay factor of 0.5. You can customize the scheduler parameters to suit your needs.

3. Train Your Model
Suggested change:
- 3. Train Your Model
+ 3. Train Your Model
The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule.

4. Monitor Training Progress
Suggested change:
- 4. Monitor Training Progress
+ 4. Monitor Training Progress
@devyanic11 Do you still plan on working on this PR?