[BugFix] Allow expanding TensorDictPrimer transforms shape with parent batch size #2521
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/rl/2521

Note: Links to docs will display an error until the docs builds have been completed.

❌ 16 New Failures, 5 Unrelated Failures as of commit 82c12e4 with merge base a70b258.

NEW FAILURES - The following jobs have failed:
BROKEN TRUNK - The following jobs failed but were present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.
LGTM, just a couple of things:
Since this modifies the transform in-place, I think it should be documented in the class docstring. We should also add a quick test for this behaviour, to make sure we don't change it inadvertently in the future.

I have removed the line where I first tried to set the primer `batch_size` to the parent `batch_size`. I could see edge cases where the `batch_size` could be set but the specs would still need to be expanded with an extra dimension, for example in the test I added (see the sketch below).
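As a hedged illustration of that edge case (the key name `hidden`, the spec class, and the use of `SerialEnv` over Pendulum are my own illustrative choices, not taken from the PR diff), such a test could look roughly like this:

```python
import torch
from torchrl.data import UnboundedContinuousTensorSpec
from torchrl.envs import SerialEnv, TransformedEnv
from torchrl.envs.libs.gym import GymEnv
from torchrl.envs.transforms import TensorDictPrimer

# Primer spec deliberately created without the parent batch dimension.
primer = TensorDictPrimer(hidden=UnboundedContinuousTensorSpec(shape=(4,)))

# Batch-locked parent with batch_size=[2]; with this fix the primer spec is
# expanded to the parent batch size at reset time.
env = TransformedEnv(SerialEnv(2, lambda: GymEnv("Pendulum-v1")), primer)
td = env.reset()
assert td["hidden"].shape == torch.Size([2, 4])
```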
Description
Often, when defining `TensorDictPrimer` transforms during environment creation, the batch size is unknown, for example when the transform is created with the `torchrl.modules.utils.get_primers_from_module` method.

In `TensorDictPrimer`, we assume that if the parent environment is batch-locked, the specs will already have the appropriate leading shape. However, this can lead to issues if the batch size is unknown at instantiation. Maybe I am missing something here, but this PR makes sure the primer matches the parent batch size during reset, expanding it if necessary.
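For context, here is a minimal sketch of the scenario described above (the module choice, sizes, and the Pendulum env are illustrative assumptions, not taken from this PR):

```python
from torchrl.envs import GymEnv, SerialEnv, TransformedEnv
from torchrl.modules import LSTMModule
from torchrl.modules.utils import get_primers_from_module

# Build a recurrent module; at this point the batch size is unknown, so the
# primer specs produced for its recurrent state carry no batch dimension.
lstm = LSTMModule(input_size=3, hidden_size=8, in_key="observation", out_key="features")
primer = get_primers_from_module(lstm)

# The parent env is batch-locked with batch_size=[2]. With this change, the
# primer specs are expanded to match the parent batch size during reset.
env = TransformedEnv(SerialEnv(2, lambda: GymEnv("Pendulum-v1")), primer)
print(env.reset())  # recurrent-state entries should have a leading dim of 2
```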
Motivation and Context
Why is this change required? What problem does it solve?
If it fixes an open issue, please link to the issue here.
You can use the syntax `close #15213` if this solves the issue #15213.

Types of changes

What types of changes does your code introduce? Remove all that do not apply:
Checklist
Go over all the following points, and put an `x` in all the boxes that apply. If you are unsure about any of these, don't hesitate to ask. We are here to help!