
torch.nn.utils.clip_grads_with_norm_

torch.nn.utils.clip_grads_with_norm_(parameters, max_norm, total_norm, foreach=None)[source]

Scale the gradients of an iterable of parameters given a pre-calculated total norm and desired max norm.

The gradients will be scaled by the following calculation:

grad = grad * \frac{max\_norm}{total\_norm + 1e\text{-}6}
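For instance (illustrative values, not from the source), with max\_norm = 1.0 and total\_norm = 10.0 the scale factor is \frac{1.0}{10.0 + 1e\text{-}6} \approx 0.1, so every gradient is multiplied by roughly 0.1.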

Gradients are modified in-place.

This function is equivalent to torch.nn.utils.clip_grad_norm_() with a pre-calculated total norm.

Parameters
  • parameters (Iterable[Tensor] or Tensor) – an iterable of Tensors or a single Tensor that will have gradients normalized

  • max_norm (float) – max norm of the gradients

  • total_norm (Tensor) – total norm of the gradients to use for clipping

  • foreach (bool) – use the faster foreach-based implementation. If None, use the foreach implementation for CUDA and CPU native tensors and silently fall back to the slow implementation for other device types. Default: None

Returns

None

Return type

None
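Example

A minimal usage sketch. It assumes a toy model with populated .grad fields and uses torch.nn.utils.get_total_norm to pre-compute the total norm; the model, max_norm value, and batch shapes are illustrative, not taken from this page.

    import torch
    import torch.nn as nn
    from torch.nn.utils import clip_grads_with_norm_, get_total_norm

    # Toy model and a single backward pass (illustrative values).
    model = nn.Linear(10, 2)
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()

    # Pre-compute the total gradient norm once, then scale all gradients
    # in-place using that norm and the desired max norm.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    total_norm = get_total_norm(grads)
    clip_grads_with_norm_(model.parameters(), max_norm=1.0, total_norm=total_norm)

Computing the norm separately is useful when the same total_norm is reused elsewhere (e.g. for logging or gradient-norm monitoring), avoiding the duplicate norm computation that a plain torch.nn.utils.clip_grad_norm_() call would perform.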
