LogScalar

class torchrl.trainers.LogScalar(logname='r_training', log_pbar: bool = False, reward_key: Optional[Union[str, tuple]] = None)[source]

Reward logger hook.

Parameters:
  • logname (str, optional) – name under which the reward will be logged. Default is "r_training".

  • log_pbar (bool, optional) – if True, the reward value will be displayed on the progress bar. Default is False.

  • reward_key (str or tuple, optional) – the key at which the reward can be found in the input batch. Defaults to ("next", "reward").

Examples

>>> log_reward = LogScalar(reward_key=("next", "reward"))
>>> trainer.register_op("pre_steps_log", log_reward)
register(trainer: Trainer, name: str = 'log_reward')[source]

Registers the hook in the trainer at a default location.

Parameters:
  • trainer (Trainer) – the trainer where the hook must be registered.

  • name (str) – the name of the hook.

Note

To register the hook at a location other than the default, use register_op().
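
A fuller usage sketch follows, assuming a Trainer instance named trainer has already been constructed elsewhere; the argument values below are illustrative, not library defaults (except logname, whose default is "r_training"):

>>> from torchrl.trainers import LogScalar
>>> log_reward = LogScalar(
...     logname="r_training",           # name under which the value is logged
...     log_pbar=True,                  # also display the value on the progress bar
...     reward_key=("next", "reward"),  # where the reward lives in the input batch
... )
>>> # Default registration, at the "pre_steps_log" location under the name "log_reward":
>>> log_reward.register(trainer, name="log_reward")
>>> # Equivalent manual registration at an explicit location (do not do both):
>>> # trainer.register_op("pre_steps_log", log_reward)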
