
Typedef torch::AutoGradMode

Typedef Documentation

using torch::AutoGradMode = at::AutoGradMode

A RAII, thread-local guard that sets gradient calculation to on or off.

AutoGradMode enables or disables gradient computation based on its `enabled` argument.

The guard is thread-local; it does not affect computation in other threads.

Example:

auto x = torch::tensor({1.}, torch::requires_grad());
{
  torch::AutoGradMode enable_grad(true);
  auto y = x * 2;
  std::cout << std::boolalpha << y.requires_grad() << std::endl; // prints `true`
}
{
  torch::AutoGradMode enable_grad(false);
  auto y = x * 2;
  std::cout << std::boolalpha << y.requires_grad() << std::endl; // prints `false`
}

Param enabled

Whether to enable (`true`) or disable (`false`) gradient computation. Because the flag is a runtime value, it can be used to conditionally enable gradients.
