torch.optim.Optimizer.register_state_dict_pre_hook

Optimizer.register_state_dict_pre_hook(hook, prepend=False)[source]

Register a state dict pre-hook which will be called before state_dict() is called.

It should have the following signature:

hook(optimizer) -> None

The optimizer argument is the optimizer instance being used. The hook is called with the optimizer itself as its only argument, before state_dict is invoked on it. The registered hook can be used to perform pre-processing before the state_dict call is made.

Parameters
  • hook (Callable) – The user defined hook to be registered.

  • prepend (bool) – If True, the provided pre-hook will be fired before all pre-hooks already registered on state_dict. Otherwise, it will be fired after all the already registered pre-hooks. (default: False)

Returns

a handle that can be used to remove the added hook by calling handle.remove()

Return type

torch.utils.hooks.RemovableHandle
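
Example

A minimal sketch of registering, ordering, and removing such a hook. The model, hook bodies, and printed messages here are illustrative assumptions, not part of the API:

import torch

# Illustrative model and optimizer; any optimizer supports this hook.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def log_hook(optimizer):
    # Runs just before state_dict() is computed; must return None.
    print(f"saving {len(optimizer.param_groups)} param group(s)")

def first_hook(optimizer):
    # Registered with prepend=True, so it fires before log_hook.
    print("runs first")

handle = optimizer.register_state_dict_pre_hook(log_hook)
optimizer.register_state_dict_pre_hook(first_hook, prepend=True)

sd = optimizer.state_dict()  # prints "runs first", then "saving 1 param group(s)"

handle.remove()              # log_hook no longer fires
sd = optimizer.state_dict()  # prints only "runs first"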
