OptimizerHook

class torchrl.trainers.OptimizerHook(optimizer: Optimizer, loss_components: Optional[Sequence[str]] = None)[source]

Add an optimizer for one or more loss components.

Parameters:
  • optimizer (optim.Optimizer) – An optimizer to apply to the loss_components.

  • loss_components (Sequence[str], optional) – The keys in the loss TensorDict whose values the optimizer should be applied to. If omitted, the optimizer is applied to all components whose names start with loss_.

Examples

>>> optimizer_hook = OptimizerHook(optimizer, ["loss_actor"])
>>> trainer.register_op("optimizer", optimizer_hook)
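
A more complete sketch, assuming a trainer whose loss module returns a TensorDict with loss_actor and loss_critic entries; the networks, learning rates, and hook names below are illustrative, not part of the API:

>>> import torch.nn as nn
>>> from torch import optim
>>> from torchrl.trainers import OptimizerHook
>>> # hypothetical actor and critic networks (stand-ins for the modules
>>> # wrapped by the loss module)
>>> actor_net = nn.Linear(4, 2)
>>> critic_net = nn.Linear(4, 1)
>>> actor_optimizer = optim.Adam(actor_net.parameters(), lr=3e-4)
>>> critic_optimizer = optim.Adam(critic_net.parameters(), lr=1e-3)
>>> # each hook steps its own optimizer, but only for the listed loss keys
>>> actor_hook = OptimizerHook(actor_optimizer, loss_components=["loss_actor"])
>>> critic_hook = OptimizerHook(critic_optimizer, loss_components=["loss_critic"])
>>> trainer.register_op("optimizer", actor_hook)
>>> trainer.register_op("optimizer", critic_hook)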
register(trainer, name='optimizer') → None[source]

Registers the hook in the trainer at a default location.

Parameters:
  • trainer (Trainer) – the trainer where the hook must be registered.

  • name (str) – the name of the hook.

Note

To register the hook at a location other than the default, use register_op().
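
For example, a minimal sketch, assuming optimizer_hook is the OptimizerHook built above:

>>> # default registration at the "optimizer" location
>>> optimizer_hook.register(trainer)
>>> # explicit equivalent via register_op, which also accepts other locations
>>> trainer.register_op("optimizer", optimizer_hook)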
