
torch.optim.Optimizer.step

Optimizer.step(closure: None = None) → None [source]
Optimizer.step(closure: Callable[[], float]) → float

Perform a single optimization step to update the parameters.

Parameters

closure (Callable) – A closure that reevaluates the model and returns the loss. Optional for most optimizers.

Note

Unless otherwise specified, this function should not modify the .grad field of the parameters.
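A minimal usage sketch follows. The model, tensors, and loss function names (model, input, target, loss_fn) are illustrative assumptions, not part of this API entry; it shows the argument-free call used by most optimizers and the closure form required by optimizers such as LBFGS that reevaluate the loss multiple times per step.

import torch

# Illustrative setup (assumed names, not part of this API entry).
model = torch.nn.Linear(10, 1)
input = torch.randn(4, 10)
target = torch.randn(4, 1)
loss_fn = torch.nn.MSELoss()

# Most optimizers: call step() with no arguments after backward().
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
optimizer.zero_grad()
loss_fn(model(input), target).backward()
optimizer.step()

# Optimizers that need to reevaluate the model (e.g. LBFGS) take a
# closure that clears gradients, recomputes the loss, calls backward(),
# and returns the loss.
optimizer = torch.optim.LBFGS(model.parameters())

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)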
