[BugFix] Allow expanding TensorDictPrimer transforms shape with parent batch size #2521

Merged: 6 commits into pytorch:main on Nov 5, 2024

Conversation

@albertbou92 (Contributor) commented Oct 28, 2024

Description

Often, when defining TensorDictPrimer transforms during environment creation, the batch size is unknown. For example, this is the case when the transform is created with the torchrl.modules.utils.get_primers_from_module method.

TensorDictPrimer assumes that if the parent environment is batch-locked, the specs already carry the appropriate leading shape. However, this can lead to issues when the batch size is unknown at instantiation time.

Maybe I am missing something here, but this PR makes sure the primer matches the parent batch size during reset, expanding its specs if necessary.
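
A minimal sketch of the scenario, assuming a gym backend is available; the spec class (Unbounded) and the choice of env are illustrative and may differ across TorchRL versions:

```python
from torchrl.data import Unbounded
from torchrl.envs import ParallelEnv, TransformedEnv
from torchrl.envs.libs.gym import GymEnv
from torchrl.envs.transforms import TensorDictPrimer

# The primer is defined without knowing the batch size, much like
# get_primers_from_module would build it from a recurrent module.
primer = TensorDictPrimer(hidden=Unbounded(shape=(4,)))

# The parent environment, however, is batched: batch_size = [2].
env = TransformedEnv(
    ParallelEnv(2, lambda: GymEnv("Pendulum-v1")),
    primer,
)

# With this fix, reset() expands the primer's specs to the parent
# batch size, so the primed entry comes out with shape [2, 4].
td = env.reset()
print(td["hidden"].shape)  # torch.Size([2, 4])
```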

Motivation and Context

Why is this change required? What problem does it solve?
If it fixes an open issue, please link to the issue here.
You can use the syntax close #15213 if this solves the issue #15213

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of examples)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.


pytorch-bot bot commented Oct 28, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/rl/2521

Note: Links to docs will display an error until the docs builds have been completed.

❌ 16 New Failures, 5 Unrelated Failures

As of commit 82c12e4 with merge base a70b258:

NEW FAILURES - The following jobs have failed:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Oct 28, 2024
@vmoens added the bug label Oct 29, 2024
@vmoens (Contributor) left a comment

LGTM, just a couple of things:
Since this will affect the transform in-place, I think it should be documented in the docstrings of the class. Also, we should quickly test that this is indeed the behaviour, to make sure we don't modify it inadvertently in the future.
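
A hedged sketch of the kind of regression test being requested here; the test name and the env are illustrative, not the actual test added in this PR:

```python
import torch
from torchrl.data import Unbounded
from torchrl.envs import ParallelEnv, TransformedEnv
from torchrl.envs.libs.gym import GymEnv
from torchrl.envs.transforms import TensorDictPrimer


def test_primer_expands_to_parent_batch_size():
    # Primer defined without a batch dimension in its spec.
    primer = TensorDictPrimer(hidden=Unbounded(shape=(4,)))
    env = TransformedEnv(
        ParallelEnv(2, lambda: GymEnv("Pendulum-v1")), primer
    )
    td = env.reset()
    # The primer's spec should have been expanded in-place during
    # reset to match the parent batch size.
    assert td["hidden"].shape == torch.Size([2, 4])
```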

@albertbou92 (Contributor, Author) commented

I have removed the line where I first tried to set the primer batch_size to the parent batch_size. I could see edge cases where the batch_size is set but the specs still need to be expanded with an extra dimension, for example in the test I added. A sketch of that edge case is below.
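
An illustrative sketch of that edge case (shapes are hypothetical): even when the primer's spec already carries a leading dimension of its own, e.g. (num_layers, hidden_size) for a recurrent state, it must still gain an extra leading dimension under a batched parent:

```python
import torch
from torchrl.data import Unbounded
from torchrl.envs import ParallelEnv, TransformedEnv
from torchrl.envs.libs.gym import GymEnv
from torchrl.envs.transforms import TensorDictPrimer

# The spec already has a leading dim of its own: (num_layers=1, hidden=4).
primer = TensorDictPrimer(hidden=Unbounded(shape=(1, 4)))
env = TransformedEnv(
    ParallelEnv(2, lambda: GymEnv("Pendulum-v1")), primer
)

# Setting primer.batch_size alone would not cover this case: the spec
# still needs an extra leading dimension for the parent batch at reset.
td = env.reset()
assert td["hidden"].shape == torch.Size([2, 1, 4])
```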

@albertbou92 requested a review from vmoens on November 1, 2024
@vmoens merged commit 98b45a6 into pytorch:main on Nov 5, 2024
59 of 80 checks passed
@vmoens deleted the primer_batch branch on November 5, 2024
vmoens pushed a commit that referenced this pull request on Nov 14, 2024