BatchSubSampler

class torchrl.trainers.BatchSubSampler(batch_size: int, sub_traj_len: int = 0, min_sub_traj_len: int = 0)[source]

Data subsampler for online RL training scripts (such as TorchRL's sota-implementations).

This class subsamples a portion of a batch of data that has just been collected from the environment.

Parameters:
  • batch_size (int) – sub-batch size to collect. The provided batch size must be equal to the total number of items in the output tensordict, which will have size [batch_size // sub_traj_len, sub_traj_len].

  • sub_traj_len (int, optional) – length that sub-sampled trajectories must have in online settings. Default is 0 (i.e. takes the full length of each trajectory).

  • min_sub_traj_len (int, optional) – minimum value of sub_traj_len, in case some elements of the batch contain few steps. Default is 0 (i.e. no minimum value).
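
The relation between these two arguments is purely arithmetic; a quick sketch with hypothetical values (256 and 8 are placeholders, not defaults):

>>> batch_size, sub_traj_len = 8, 4
>>> # 8 items total, split into sub-trajectories of 4 steps each:
>>> (batch_size // sub_traj_len, sub_traj_len)  # output tensordict batch size
(2, 4)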

Examples

>>> import torch
>>> from tensordict import TensorDict
>>> td = TensorDict(
...     {
...         "key1": torch.stack([torch.arange(0, 10), torch.arange(10, 20)], 0),
...         "key2": torch.stack([torch.arange(0, 10), torch.arange(10, 20)], 0),
...     },
...     [2, 10],
... )
>>> trainer.register_op(
...     "process_optim_batch",
...     BatchSubSampler(batch_size=batch_size, sub_traj_len=sub_traj_len),
... )
>>> td_out = trainer._process_optim_batch_hook(td)
>>> assert td_out.shape == torch.Size([batch_size // sub_traj_len, sub_traj_len])
register(trainer: Trainer, name: str = 'batch_subsampler')[source]

Registers the hook in the trainer at a default location.

Parameters:
  • trainer (Trainer) – the trainer where the hook must be registered.

  • name (str) – the name of the hook.

Note

To register the hook at a location other than the default, use register_op(); see the sketch below.
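
As a minimal sketch, assuming an existing Trainer instance named trainer (the argument values are hypothetical), the default registration and its explicit register_op() equivalent look like this:

>>> subsampler = BatchSubSampler(batch_size=256, sub_traj_len=8)
>>> # Either of the two calls below registers the subsampler; register() uses
>>> # the default location, while register_op() lets you pick it explicitly.
>>> subsampler.register(trainer)  # default: "process_optim_batch"
>>> trainer.register_op("process_optim_batch", subsampler)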
