
Multi-dimensional learning rate (sigma0) for AdamW #462

Merged
BradyPlanden merged 8 commits into develop from update-adamW-multi-dimensional-lr on Sep 10, 2024

Conversation

BradyPlanden
Member

Description

Adds support for multi-dimensional sigma0 values in AdamW, updates the integration tests to cover multi-dimensional sigma0, and updates the default sigma0 value in GaussianLogLikelihood.
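For context, the sketch below illustrates what a multi-dimensional sigma0 means for an AdamW-style update: each parameter gets its own step size rather than a single scalar learning rate. This is an illustrative NumPy example, not the PR's actual code; the function name, signature, and default coefficients are assumptions for demonstration only.

```python
# Illustrative sketch (not PyBOP's implementation): an AdamW-style update in
# which the step size `sigma0` may be a per-parameter vector.
import numpy as np

def adamw_step(x, grad, m, v, t, sigma0, b1=0.9, b2=0.999, eps=1e-8, lam=0.01):
    """One AdamW update; `sigma0` may be a scalar or a per-parameter array."""
    sigma0 = np.broadcast_to(np.asarray(sigma0, dtype=float), x.shape)
    m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2     # second-moment (uncentred variance) estimate
    m_hat = m / (1 - b1**t)             # bias-corrected moments
    v_hat = v / (1 - b2**t)
    # Decoupled weight decay with a per-dimension step size
    x = x - sigma0 * (m_hat / (np.sqrt(v_hat) + eps) + lam * x)
    return x, m, v

# Example: two parameters with different initial step sizes
x = np.array([0.5, 2.0])
m, v = np.zeros_like(x), np.zeros_like(x)
grad = np.array([0.1, -0.3])
x, m, v = adamw_step(x, grad, m, v, t=1, sigma0=[0.01, 0.05])
```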

Issue reference

Fixes #461

Review

Before you mark your PR as ready for review, please ensure that you've considered the following:

  • Updated the CHANGELOG.md in reverse chronological order (newest at the top) with a concise description of the changes, including the PR number.
  • Noted any breaking changes, including details on how it might impact existing functionality.

Type of change

  • New Feature: A non-breaking change that adds new functionality.
  • Optimization: A code change that improves performance.
  • Examples: A change to existing or additional examples.
  • Bug Fix: A non-breaking change that addresses an issue.
  • Documentation: Updates to documentation or new documentation for new features.
  • Refactoring: Non-functional changes that improve the codebase.
  • Style: Non-functional changes related to code style (formatting, naming, etc).
  • Testing: Additional tests to improve coverage or confirm functionality.
  • Other: (Insert description of change)

Key checklist:

  • No style issues: $ pre-commit run (or $ nox -s pre-commit) (see CONTRIBUTING.md for how to set this up to run automatically when committing locally, in just two lines of code)
  • All unit tests pass: $ nox -s tests
  • The documentation builds: $ nox -s doctest

You can run integration tests, unit tests, and doctests together using $ nox -s quick.

Further checks:

  • Code is well-commented, especially in complex or unclear areas.
  • Added tests that prove my fix is effective or that my feature works.
  • Checked that coverage is maintained or improved, adding tests where necessary.

Thank you for contributing to our project! Your efforts help us to deliver great software.


codecov bot commented Aug 19, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.03%. Comparing base (76de36a) to head (20b62ed).
Report is 28 commits behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #462      +/-   ##
===========================================
- Coverage    99.04%   99.03%   -0.01%     
===========================================
  Files           52       52              
  Lines         3545     3527      -18     
===========================================
- Hits          3511     3493      -18     
  Misses          34       34              


@BradyPlanden BradyPlanden marked this pull request as ready for review September 9, 2024 12:56
Contributor

@martinjrobins martinjrobins left a comment


Looks good to me. I see that you fixed a few of the cost functions' gradient calculations as well.

# Conflicts:
#	pybop/costs/_likelihoods.py
#	tests/integration/test_eis_parameterisation.py
#	tests/integration/test_spm_parameterisations.py
@BradyPlanden BradyPlanden merged commit e0b8d29 into develop Sep 10, 2024
30 of 31 checks passed
@BradyPlanden BradyPlanden deleted the update-adamW-multi-dimensional-lr branch September 10, 2024 09:30
Development

Successfully merging this pull request may close these issues.

Update AdamW for multi-dimensional sigma0