
Fixes #275 #1025

Open · wants to merge 2 commits into master

Conversation

devyanic11

No description provided.

@BenjaminBossan (Collaborator)

Thanks for working on this long-standing issue. I don't have time this week; I will give a review next week.

@BenjaminBossan (Collaborator) left a comment

Thank you very much for working on this long-standing issue. Overall, this already looks quite good. I have suggested some small changes here and there; please take a look.

Apart from those, I have some more general points:

  1. There is no entry in docs/index.rst, which means that users cannot find the page. Could you please add one?
  2. We generally don't capitalize "skorch".
  3. The LRScheduler class in skorch also has a handy method called simulate, which allows you to simulate the learning rate changes. It would be nice to document that too, but it's not a must-have (see the sketch after this comment).

Btw, it is not difficult to build the documentation locally and check it yourself (install the doc requirements and follow the instructions in the docs folder).
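
For reference, a minimal sketch of how simulate could be shown in the docs (assuming the simulate(steps, initial_lr) signature of skorch.callbacks.LRScheduler):

from skorch.callbacks import LRScheduler
from torch.optim.lr_scheduler import StepLR

# Simulate 30 steps of a StepLR policy starting from lr=0.1; the learning
# rate should halve every 10 steps. simulate() only returns the simulated
# learning rates, it does not touch a real optimizer or net.
scheduler = LRScheduler(policy=StepLR, step_size=10, gamma=0.5)
lrs = scheduler.simulate(steps=30, initial_lr=0.1)
print(lrs)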

from torch.optim.lr_scheduler import StepLR

net = NeuralNet(
    YourModel,
@BenjaminBossan (Collaborator):

Suggested change
    YourModel,
    MyModule,

Also, how about setting max_epochs=20? This way, the effect of the LR scheduler can be seen more clearly (since it steps at epoch 10).
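
For illustration, a hedged sketch of how the call might look with both suggestions applied (MyModule is the module suggested further down; the criterion and callbacks arguments are assumptions, since the diff excerpt above cuts off after the module argument):

import torch
from skorch import NeuralNet
from skorch.callbacks import LRScheduler
from torch.optim.lr_scheduler import StepLR

net = NeuralNet(
    MyModule,                              # renamed from YourModel, as suggested
    criterion=torch.nn.CrossEntropyLoss,
    optimizer=torch.optim.SGD,
    lr=0.1,
    max_epochs=20,                         # so the LR step at epoch 10 is visible
    callbacks=[LRScheduler(policy=StepLR, step_size=10, gamma=0.5)],
)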


A learning rate scheduler dynamically adjusts the learning rate during training. It can be a crucial component of your training pipeline, enabling you to control the step size for updating the model's weights as the training progresses.

Using Learning Rate Schedulers in Skorch
@BenjaminBossan (Collaborator):

Should this be a section header?

==============================================
Support For Learning Rate Schedulers in Skorch
==============================================
Skorch, a powerful library for training PyTorch models, offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
@BenjaminBossan (Collaborator):

Suggested change
Skorch, a powerful library for training PyTorch models, offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
Skorch offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.

This is very flattering but not necessary ;)


.. code:: python

    net.fit(X_train, y_train)
@BenjaminBossan (Collaborator):

If we add a tiny amount of code, the example you show can actually be run:

import numpy as np
from sklearn.datasets import make_classification

X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
X_train = X_train.astype(np.float32)
y_train = y_train.astype(np.int64)

Comment on lines +22 to +25
class YourModel(nn.Module):
    def __init__(self):
        super(YourModel, self).__init__()
        # Define your layers here
@BenjaminBossan (Collaborator):

Suggested change
class YourModel(nn.Module):
    def __init__(self):
        super(YourModel, self).__init__()
        # Define your layers here
class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = torch.nn.Linear(20, 2)

    def forward(self, x):
        return self.lin(x)

Let's use a working example; it's not much longer. Also, the naming is more consistent with the other skorch examples.
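
Putting the suggestions from this review together, a runnable end-to-end sketch might look as follows (the NeuralNet arguments are assumptions where the diff excerpts above don't show them):

import numpy as np
import torch
from sklearn.datasets import make_classification
from skorch import NeuralNet
from skorch.callbacks import LRScheduler
from torch.optim.lr_scheduler import StepLR

class MyModule(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = torch.nn.Linear(20, 2)

    def forward(self, x):
        return self.lin(x)

# toy classification data, as suggested above
X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
X_train = X_train.astype(np.float32)
y_train = y_train.astype(np.int64)

net = NeuralNet(
    MyModule,
    criterion=torch.nn.CrossEntropyLoss,
    optimizer=torch.optim.SGD,
    lr=0.1,
    max_epochs=20,
    # StepLR halves the learning rate every 10 epochs
    callbacks=[LRScheduler(policy=StepLR, step_size=10, gamma=0.5)],
)
net.fit(X_train, y_train)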

The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule.

4. Monitor Training Progress
During training, Skorch will automatically keep you informed about the learning rate changes, allowing you to monitor the effect of the learning rate scheduler on your model's performance.
@BenjaminBossan (Collaborator):

I think this needs to be extended a bit, as it doesn't explain how that works.
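
One way the section could show how this works (a sketch that assumes the LRScheduler callback logs the current learning rate to the history under its default event_name, 'event_lr'):

# After fitting, the learning rate recorded at each epoch can be read from
# the history; with the StepLR settings above it should halve at epoch 10.
net.fit(X_train, y_train)
print(net.history[:, 'event_lr'])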

# Define your layers here


2. Create Your Skorch NeuralNet
@BenjaminBossan (Collaborator):

Suggested change
2. Create Your Skorch NeuralNet
2. Create Your Skorch NeuralNet

Using Learning Rate Schedulers in Skorch
Skorch allows you to integrate PyTorch learning rate schedulers seamlessly into your training process. Here's a step-by-step guide on how to use them:

1. Create Your Neural Network Model
@BenjaminBossan (Collaborator):

Suggested change
1. Create Your Neural Network Model
1. Create Your Neural Network Model


In the example above, we set the optimizer to Stochastic Gradient Descent (SGD) and attach a StepLR learning rate scheduler with a step size of 10 and a decay factor of 0.5. You can customize the scheduler parameters to suit your needs.
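
To illustrate that last sentence, a short sketch of swapping in a different policy; any keyword arguments are simply forwarded to the underlying torch scheduler (e.g. T_max for CosineAnnealingLR):

from skorch.callbacks import LRScheduler
from torch.optim.lr_scheduler import CosineAnnealingLR

# keyword arguments are passed through to the chosen torch scheduler
lr_callback = LRScheduler(policy=CosineAnnealingLR, T_max=20)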

3. Train Your Model
@BenjaminBossan (Collaborator):

Suggested change
3. Train Your Model
3. Train Your Model


The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule.

4. Monitor Training Progress
@BenjaminBossan (Collaborator):

Suggested change
4. Monitor Training Progress
4. Monitor Training Progress

@BenjaminBossan (Collaborator)

@devyanic11 Do you still plan on working on this PR?
