jacobian(func, inputs, create_graph=False, strict=False, vectorize=False)
Function that computes the Jacobian of a given function.
- Parameters
func (function) – a Python function that takes Tensor inputs and returns a tuple of Tensors or a Tensor.
inputs (tuple of Tensors or Tensor) – inputs to the function func.
create_graph (bool, optional) – If True, the Jacobian will be computed in a differentiable manner. Note that when strict is False, the result may not require gradients or may be disconnected from the inputs. Defaults to False. (A sketch of backpropagating through the Jacobian appears after the examples below.)
strict (bool, optional) – If True, an error will be raised when we detect that there exists an input such that all the outputs are independent of it. If False, we return a Tensor of zeros as the jacobian for said inputs, which is the expected mathematical value (see the sketch after this parameter list). Defaults to False.
vectorize (bool, optional) – This feature is experimental; please use at your own risk. When computing the jacobian, we usually invoke autograd.grad once per row of the jacobian. If this flag is True, we use the vmap prototype feature as the backend to vectorize calls to autograd.grad so we only invoke it once instead of once per row. This should lead to performance improvements in many use cases; however, because this feature is still incomplete, there may be performance cliffs. Please use torch._C._debug_only_display_vmap_fallback_warnings(True) to show any performance warnings, and file issues with us if warnings exist for your use case. Defaults to False. A comparison against the default backend is sketched after the examples below.
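The effect of strict is easiest to see on a function that ignores its input. A minimal sketch (ignores_input is an illustrative name, not part of the API):

>>> def ignores_input(x):
...     return torch.ones(2)  # output is independent of x
>>> jacobian(ignores_input, torch.rand(2))  # strict=False: mathematical zeros
tensor([[0., 0.],
        [0., 0.]])
>>> jacobian(ignores_input, torch.rand(2), strict=True)  # raises instead
Traceback (most recent call last):
    ...
RuntimeError: ...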
- Returns
If there is a single input and output, this will be a single Tensor containing the Jacobian for the linearized inputs and output. If one of the two is a tuple, then the Jacobian will be a tuple of Tensors. If both of them are tuples, then the Jacobian will be a tuple of tuples of Tensors, where Jacobian[i][j] contains the Jacobian of the ith output with respect to the jth input. Each Jacobian[i][j] has a size equal to the concatenation of the sizes of the corresponding output and input, and the same dtype and device as the corresponding input (see the shape check below).
- Return type
Jacobian (Tensor or nested tuple of Tensors)
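As a quick shape check on the rules above, a sketch with a two-output, two-input function (two_outputs is an illustrative name):

>>> def two_outputs(x, y):
...     return x.exp(), 3 * y
>>> jac = jacobian(two_outputs, (torch.rand(2), torch.rand(3)))
>>> # jac[i][j] is the Jacobian of output i with respect to input j
>>> [[block.shape for block in row] for row in jac]
[[torch.Size([2, 2]), torch.Size([2, 3])], [torch.Size([3, 2]), torch.Size([3, 3])]]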
- Example
>>> def exp_reducer(x):
...     return x.exp().sum(dim=1)
>>> inputs = torch.rand(2, 2)
>>> jacobian(exp_reducer, inputs)
tensor([[[1.4917, 2.4352],
         [0.0000, 0.0000]],
        [[0.0000, 0.0000],
         [2.4369, 2.3799]]])
>>> jacobian(exp_reducer, inputs, create_graph=True)
tensor([[[1.4917, 2.4352],
         [0.0000, 0.0000]],
        [[0.0000, 0.0000],
         [2.4369, 2.3799]]], grad_fn=<ViewBackward>)
>>> def exp_adder(x, y):
...     return 2 * x.exp() + 3 * y
>>> inputs = (torch.rand(2), torch.rand(2))
>>> jacobian(exp_adder, inputs)
(tensor([[2.8052, 0.0000],
         [0.0000, 3.3963]]),
 tensor([[3., 0.],
         [0., 3.]]))
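Because create_graph=True returns a differentiable Jacobian, the result can itself be backpropagated through. A minimal sketch reusing exp_reducer from above:

>>> x = torch.rand(2, 2, requires_grad=True)
>>> jac = jacobian(exp_reducer, x, create_graph=True)
>>> jac.sum().backward()  # a second-order quantity: gradient of the Jacobian's sum
>>> x.grad.shape
torch.Size([2, 2])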
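Since vectorize=True only changes how the rows are computed, its output can be compared against the default backend. A sketch (the feature is experimental, so treat this as illustrative):

>>> inputs = torch.rand(2, 2)
>>> torch.allclose(jacobian(exp_reducer, inputs),
...                jacobian(exp_reducer, inputs, vectorize=True))
True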