GaussianNLLLoss(*, full=False, eps=1e-06, reduction='mean')
Gaussian negative log likelihood loss.
The targets are treated as samples from Gaussian distributions with expectations and variances predicted by the neural network. For a D-dimensional target tensor modelled as having heteroscedastic Gaussian distributions with a D-dimensional tensor of expectations input and a D-dimensional tensor of positive variances var, the loss is:

\text{loss} = \frac{1}{2}\sum_{i=1}^{D}\left(\log\left(\max(\text{var}_i,\ \text{eps})\right) + \frac{\left(\text{input}_i - \text{target}_i\right)^2}{\max(\text{var}_i,\ \text{eps})}\right) + \text{const.}

where eps is used for stability. By default, the constant term of the loss function is omitted unless full is True. If var is a scalar (implying the target tensor has homoscedastic Gaussian distributions), it is broadcast to be the same size as the input.
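Spelled out in code, the formula above corresponds to the following sketch (the helper name gaussian_nll is illustrative and not part of the library; a 2-D (N, D) input and the default 'mean' reduction are assumed):

import math
import torch

def gaussian_nll(input, target, var, eps=1e-6, full=False):
    # Clamp the variance for numerical stability, as controlled by eps.
    # (Unlike the module, this clamp is visible to autograd; see the note below.)
    var = var.clamp(min=eps)
    # 0.5 * (log(var) + (input - target)^2 / var) per term.
    loss = 0.5 * (torch.log(var) + (input - target) ** 2 / var)
    if full:
        # Constant term of the Gaussian negative log likelihood: 0.5 * log(2*pi) per term.
        loss = loss + 0.5 * math.log(2 * math.pi)
    # Sum over the D terms of each target, then average the batch member losses.
    return loss.sum(dim=-1).mean()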
full (bool, optional) – include the constant term in the loss calculation. Default: False.
eps (float, optional) – value used to clamp var (see note below), for stability. Default: 1e-6.
reduction (string, optional) – specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'.
'none': no reduction will be applied,
'mean': the output is the average of all batch member losses,
'sum': the output is the sum of all batch member losses. Default: 'mean'.
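As a quick illustration of the reduction modes (shapes below chosen arbitrarily): 'none' returns the unreduced losses, and 'mean'/'sum' are their average and total:

>>> import torch
>>> import torch.nn as nn
>>> input = torch.randn(8, 4)
>>> target = torch.randn(8, 4)
>>> var = torch.ones(8, 4)
>>> unreduced = nn.GaussianNLLLoss(reduction='none')(input, target, var)
>>> torch.allclose(nn.GaussianNLLLoss(reduction='mean')(input, target, var), unreduced.mean())
True
>>> torch.allclose(nn.GaussianNLLLoss(reduction='sum')(input, target, var), unreduced.sum())
True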
Input: (N, *) where * means any number of additional dimensions
Target: (N, *), same shape as the input
Var: (N, *) or (N, 1), same shape as the input
Output: scalar if reduction is 'mean' (default) or 'sum'. If reduction is 'none', then (N), one loss per batch member.
>>> loss = nn.GaussianNLLLoss()
>>> input = torch.randn(5, 2, requires_grad=True)
>>> target = torch.randn(5, 2)
>>> var = torch.ones(5, 2, requires_grad=True)  # heteroscedastic
>>> output = loss(input, target, var)
>>> output.backward()
>>> loss = nn.GaussianNLLLoss()
>>> input = torch.randn(5, 2, requires_grad=True)
>>> target = torch.randn(5, 2)
>>> var = torch.ones(5, 1, requires_grad=True)  # homoscedastic
>>> output = loss(input, target, var)
>>> output.backward()
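In practice both the expectations and the variances are typically outputs of the same network; the two-headed module below is an illustrative sketch of that pattern (the class name, layer sizes, and the softplus used to keep the variances positive are assumptions, not part of the library):

import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanVarianceNet(nn.Module):
    # Illustrative regressor that predicts a mean and a positive variance per output.
    def __init__(self, in_features, out_features):
        super().__init__()
        self.backbone = nn.Linear(in_features, 32)
        self.mean_head = nn.Linear(32, out_features)
        self.var_head = nn.Linear(32, out_features)

    def forward(self, x):
        h = F.relu(self.backbone(x))
        mean = self.mean_head(h)
        var = F.softplus(self.var_head(h))  # softplus keeps the predicted variances positive
        return mean, var

model = MeanVarianceNet(10, 2)
loss_fn = nn.GaussianNLLLoss()
x, target = torch.randn(5, 10), torch.randn(5, 2)
mean, var = model(x)
loss = loss_fn(mean, target, var)
loss.backward()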
The clamping of var is ignored with respect to autograd, and so the gradients are unaffected by it.
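One way to see this behaviour (a small demonstration sketch, not the library's internal code): clamping a clone in-place under torch.no_grad() changes the values used in the forward pass while leaving the clamp itself unrecorded, so the gradient flows back to var unchanged:

>>> import torch
>>> var = torch.tensor([1e-8, 0.5], requires_grad=True)
>>> clamped = var.clone()
>>> with torch.no_grad():
...     clamped.clamp_(min=1e-6)
>>> torch.log(clamped).sum().backward()
>>> var.grad  # 1 / clamped values, roughly [1e6, 2.0], passed straight through the ignored clamp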
Nix, D. A. and Weigend, A. S., “Estimating the mean and variance of the target probability distribution”, Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN’94), Orlando, FL, USA, 1994, pp. 55-60 vol.1, doi: 10.1109/ICNN.1994.374138.