
ClearCudaCache

class torchrl.trainers.ClearCudaCache(interval: int)

Clears the CUDA cache at a given interval: once every interval calls, the hook invokes torch.cuda.empty_cache().

Examples

>>> from torchrl.trainers import ClearCudaCache
>>> clear_cuda = ClearCudaCache(100)
>>> trainer.register_op("pre_optim_steps", clear_cuda)

abstract register(trainer: Trainer, name: str)

Registers the hook in the trainer at a default location.

Parameters:
  • trainer (Trainer) – the trainer where the hook must be registered.

  • name (str) – the name of the hook.

Note

To register the hook at a location other than the default one, use register_op().
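
A minimal sketch of what the note describes, assuming an already-constructed Trainer instance named trainer and using the "post_optim" destination as an illustrative non-default hook point:

>>> from torchrl.trainers import ClearCudaCache
>>> # `trainer` is assumed to be an existing torchrl.trainers.Trainer
>>> clear_cuda = ClearCudaCache(interval=1000)
>>> # attach the hook after each optimization step rather than at the default location
>>> trainer.register_op("post_optim", clear_cuda)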
