torch.nn.utils.clip_grad_norm

torch.nn.utils.clip_grad_norm(parameters, max_norm, norm_type=2.0, error_if_nonfinite=False, foreach=None)[source]

Clip the gradient norm of an iterable of parameters. The norm is computed over all gradients together, as if they were concatenated into a single vector, and gradients are modified in place.

Warning

This method is now deprecated in favor of torch.nn.utils.clip_grad_norm_().

Return type

Tensor
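A minimal sketch of the recommended replacement, torch.nn.utils.clip_grad_norm_() (note the trailing underscore), using an assumed toy model; the clipping call is typically placed between backward() and the optimizer step:

```python
import torch
import torch.nn as nn

# Hypothetical toy model and loss, used only to produce gradients.
model = nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).pow(2).mean()
loss.backward()

# Rescale gradients in place so their total 2-norm is at most max_norm;
# returns the total norm of the gradients (before clipping) as a Tensor.
total_norm = nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
```

After this call, the combined 2-norm of all parameter gradients is at most max_norm (up to floating-point error), so a following optimizer.step() cannot take an arbitrarily large update.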
