
Function torch::autograd::create_gradient_edge

Function Documentation

inline void torch::autograd::create_gradient_edge(Variable &variable, std::shared_ptr<Node> function)

Creates an Edge between the given variable and the function, which is assumed to be the gradient function of the variable (i.e. the function through which the variable is backpropagated during the backward pass). This sets the grad_fn property of the variable. The function assumes that the Variable is a new input to the gradient function, so its input_nr is equal to function->num_inputs(); accordingly, it increments the Node's number of inputs by one. It is approximately equivalent to variable.set_gradient_edge(function, function->add_input_metadata(variable.dispatch_type(), variable.sizes())). If you do not want the Node's num_inputs to be incremented, use set_gradient_edge directly.
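A minimal usage sketch, under the assumption that the C++ frontend headers torch/torch.h, torch/csrc/autograd/function.h, and torch/csrc/autograd/variable.h are available. IdentityBackward is a hypothetical Node subclass defined here only for illustration; the sketch attaches it as the grad_fn of a fresh tensor and observes that the Node's input count is incremented by the call.

#include <iostream>
#include <memory>
#include <utility>

#include <torch/torch.h>
#include <torch/csrc/autograd/function.h>
#include <torch/csrc/autograd/variable.h>

using torch::autograd::Node;
using torch::autograd::Variable;
using torch::autograd::variable_list;

// Hypothetical gradient function used only for illustration: it passes
// incoming gradients through unchanged.
struct IdentityBackward : public Node {
  variable_list apply(variable_list&& grads) override {
    return std::move(grads);
  }
};

int main() {
  Variable x = torch::ones({2, 2});
  auto fn = std::make_shared<IdentityBackward>();

  // Before the call, fn has no registered inputs.
  std::cout << "num_inputs before: " << fn->num_inputs() << std::endl;  // 0

  // Registers x as a new input of fn: sets x's grad_fn to fn, assigns it
  // input_nr == fn->num_inputs(), and increments fn's input count by one.
  torch::autograd::create_gradient_edge(x, fn);

  std::cout << "num_inputs after:  " << fn->num_inputs() << std::endl;  // 1
  std::cout << "grad_fn set: " << std::boolalpha
            << (x.grad_fn() != nullptr) << std::endl;  // true
  return 0;
}

If the Node's input metadata is managed elsewhere and num_inputs should not change, call set_gradient_edge directly instead, as noted above.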
