This repository has been archived by the owner on Feb 1, 2024. It is now read-only.

build(deps): bump pytorch-lightning from 2.0.9 to 2.1.0 #51

Draft
wants to merge 2 commits into main from dependabot-pip-pytorch-lightning-2.1.0

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Nov 1, 2023

Bumps pytorch-lightning from 2.0.9 to 2.1.0.

Release notes

Sourced from pytorch-lightning's releases.

Lightning 2.1: Train Bigger, Better, Faster

Lightning AI is excited to announce the release of Lightning 2.1 ⚡ It's the culmination of work from 79 contributors who have worked on features, bug-fixes, and documentation for a total of over 750 commits since v2.0.

The theme of 2.1 is "bigger, better, faster": bigger, because training large multi-billion-parameter models has gotten even more efficient thanks to FSDP, efficient initialization, and sharded-checkpointing improvements; better, because it's easier than ever to scale models without making substantial code changes or installing third-party packages; and faster, because it leverages the latest hardware features to speed up training in low-bit precision thanks to new precision plugins like bitsandbytes and transformer engine. And of course, as the name implies, this release fully leverages the latest features in PyTorch 2.1 🎉
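As one illustration of the low-bit precision support mentioned above, here is a minimal, hypothetical sketch using the bitsandbytes precision plugin with the Trainer; the exact import path and the `mode`/`dtype` arguments are assumptions and should be checked against the 2.1 documentation:

```python
import torch
import lightning as L
from lightning.pytorch.plugins import BitsandbytesPrecision  # assumed import path

# Assumed API: quantize Linear layers to 4-bit NF4, computing in bfloat16
precision = BitsandbytesPrecision(mode="nf4", dtype=torch.bfloat16)
trainer = L.Trainer(devices=1, plugins=precision)
```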

Highlights

Improvements To Large-Scale Training With FSDP

The FSDP strategy for training large billion-parameter models gets substantial improvements and new features in Lightning 2.1, both in Trainer and Fabric (in case you didn't know, Fabric is the latest addition to the Lightning family of tools to scale models without the boilerplate code). FSDP is now more user-friendly to configure, has memory management and speed improvements, and we have a brand new end-to-end user guide with best practices (Trainer, Fabric).
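For context, a minimal sketch of selecting the FSDP strategy in the Trainer (the device count is illustrative):

```python
import lightning as L
from lightning.pytorch.strategies import FSDPStrategy

# Shard model parameters, gradients, and optimizer state across 8 GPUs
trainer = L.Trainer(accelerator="cuda", devices=8, strategy=FSDPStrategy())
```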

Efficient Saving and Loading of Large Checkpoints

When training large billion-parameter models with FSDP, saving and resuming training, or even just loading model parameters for finetuning, can be challenging, as users are often plagued by out-of-memory errors and speed bottlenecks.

In 2.1, we made several improvements. Starting with saving checkpoints, we added support for distributed/sharded checkpoints, enabled by setting state_dict_type in the strategy (#18364, #18358):

Trainer:

```python
import lightning as L
from lightning.pytorch.strategies import FSDPStrategy

# Default used by the strategy
strategy = FSDPStrategy(state_dict_type="full")

# Enable saving distributed checkpoints
strategy = FSDPStrategy(state_dict_type="sharded")
```

... (truncated)
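The truncated portion of the release notes also covers the Fabric side of this feature. A minimal sketch, assuming Fabric's FSDPStrategy accepts the same state_dict_type argument:

```python
import lightning as L
from lightning.fabric.strategies import FSDPStrategy

# Enable saving distributed (sharded) checkpoints from Fabric as well (assumed to mirror the Trainer API)
strategy = FSDPStrategy(state_dict_type="sharded")
fabric = L.Fabric(devices=8, strategy=strategy)
```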

Commits

Dependabot compatibility score

You can trigger a rebase of this PR by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Note
Automatic rebases have been disabled on this pull request as it has been open for over 30 days.

@dependabot dependabot bot requested a review from Borda as a code owner November 1, 2023 08:39
Contributor Author

dependabot bot commented on behalf of github Nov 1, 2023

Dependabot tried to add @Lightning-AI/core-lightning as a reviewer to this PR, but received the following error from GitHub:

POST https://api.github.com/repos/Lightning-AI/lightning-Graphcore/pulls/51/requested_reviewers: 422 - Reviews may only be requested from collaborators. One or more of the teams you specified is not a collaborator of the Lightning-AI/lightning-Graphcore repository. // See: https://docs.github.com/rest/pulls/review-requests#request-reviewers-for-a-pull-request

@dependabot dependabot bot added the ci / tests label Nov 1, 2023
Bumps [pytorch-lightning](https://github.com/Lightning-AI/lightning) from 2.0.9 to 2.1.0.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](https://github.com/Lightning-AI/pytorch-lightning/compare/2.0.9...2.1.0)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot force-pushed the dependabot-pip-pytorch-lightning-2.1.0 branch from 250fdda to 593dbf7 Compare November 1, 2023 12:46
@Borda Borda enabled auto-merge (squash) November 1, 2023 12:47
@Borda Borda disabled auto-merge November 1, 2023 12:48
@Borda Borda enabled auto-merge (squash) November 1, 2023 12:48
@Borda Borda mentioned this pull request Nov 8, 2023
Contributor Author

dependabot bot commented on behalf of github Dec 1, 2023

A newer version of pytorch-lightning exists, but since this PR has been edited by someone other than Dependabot I haven't updated it. You'll get a PR for the updated version as normal once this PR is merged.

@Borda
Member

Borda commented Dec 1, 2023

@dependabot rebase

Contributor Author

dependabot bot commented on behalf of github Dec 1, 2023

Looks like this PR has been edited by someone other than Dependabot. That means Dependabot can't rebase it - sorry!

If you're happy for Dependabot to recreate it from scratch, overwriting any edits, you can request @dependabot recreate.

@Borda Borda marked this pull request as draft December 1, 2023 10:33
auto-merge was automatically disabled December 1, 2023 10:33

Pull request was converted to draft
