[BugFix] Allow expanding TensorDictPrimer transforms shape with parent batch size #2521

Merged (6 commits) on Nov 5, 2024
Changes from 3 commits
15 changes: 12 additions & 3 deletions torchrl/envs/transforms/transforms.py
```diff
@@ -4760,7 +4760,7 @@ def transform_observation_spec(self, observation_spec: Composite) -> Composite:
             # We try to set the primer shape to the observation spec shape
             self.primers.shape = observation_spec.shape
         except ValueError:
-            # If we fail, we expnad them to that shape
+            # If we fail, we expand them to that shape
             self.primers = self._expand_shape(self.primers)
         device = observation_spec.device
         observation_spec.update(self.primers.clone().to(device))
```
```diff
@@ -4827,12 +4827,21 @@ def _reset(
     ) -> TensorDictBase:
         """Sets the default values in the input tensordict.

-        If the parent is batch-locked, we assume that the specs have the appropriate leading
+        If the parent is batch-locked, we make sure the specs have the appropriate leading
         shape. We allow for execution when the parent is missing, in which case the
         spec shape is assumed to match the tensordict's.

         """
         _reset = _get_reset(self.reset_key, tensordict)
+        if (
+            self.parent
+            and self.primers.shape[: len(tensordict.shape)] != self.parent.batch_size
+        ):
+            try:
+                # We try to set the primer shape to the parent shape
+                self.primers.shape = self.parent.batch_size
+            except ValueError:
+                # If we fail, we expand them to parent batch size
+                self.primers = self._expand_shape(self.primers)
         if _reset.any():
             for key, spec in self.primers.items(True, True):
                 if self.random:
```
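The pattern the PR adds to `_reset` mirrors the one already used in `transform_observation_spec`: first attempt to set the primer spec's shape to the parent batch size directly, and only if the spec rejects that (raising `ValueError`) fall back to expanding the spec with the batch dimensions prepended. The sketch below illustrates this fallback logic in plain Python with a toy `Spec` class; it is not torchrl's real spec API (real specs, `_expand_shape`, and the batch-size semantics differ), just a minimal model of the control flow.

```python
from math import prod


class Spec:
    """Toy stand-in for a tensor spec: a shape that can only be
    reassigned if the total number of elements is preserved."""

    def __init__(self, shape):
        self._shape = tuple(shape)

    @property
    def shape(self):
        return self._shape

    @shape.setter
    def shape(self, new_shape):
        # Reject reshapes that change the element count, the way a real
        # spec would reject an incompatible shape with a ValueError.
        if prod(new_shape) != prod(self._shape):
            raise ValueError("incompatible shape")
        self._shape = tuple(new_shape)


def expand_to_batch(spec, batch_size):
    """Prepend the batch dimensions, loosely modelling _expand_shape."""
    return Spec(tuple(batch_size) + spec.shape)


def match_parent_batch(spec, batch_size):
    """Set-or-expand fallback, as in the PR's _reset change."""
    batch_size = tuple(batch_size)
    if spec.shape[: len(batch_size)] != batch_size:
        try:
            # Try to set the spec shape to the parent batch size directly.
            spec.shape = batch_size
        except ValueError:
            # If that fails, expand the spec to the parent batch size.
            spec = expand_to_batch(spec, batch_size)
    return spec


# A primer with feature shape (3,) under a parent batched over 4 envs
# cannot take shape (4,), so it is expanded to (4, 3) instead.
print(match_parent_batch(Spec((3,)), (4,)).shape)  # (4, 3)
# A spec whose leading dims already match is left untouched.
print(match_parent_batch(Spec((4, 3)), (4,)).shape)  # (4, 3)
```

The key design point, both here and in the PR, is that direct shape assignment is preferred (cheap, no new spec object) and expansion is only the recovery path once the spec signals incompatibility.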