KLDivLoss(size_average=None, reduce=None, reduction: str = 'mean', log_target: bool = False)
The Kullback-Leibler divergence loss measure
Kullback-Leibler divergence is a useful distance measure for continuous distributions and is often useful when performing direct regression over the space of (discretely sampled) continuous output distributions.
As with NLLLoss, the input given is expected to contain log-probabilities and is not restricted to a 2D Tensor. The targets are interpreted as probabilities by default, but could be considered as log-probabilities with log_target set to True.
This criterion expects a target Tensor of the same size as the input Tensor.
The unreduced (i.e. with reduction set to 'none') loss can be described as:

l(x, y) = L := \{ l_1, \dots, l_N \}, \quad l_n = y_n \cdot (\log y_n - x_n)

where the index N spans all dimensions of input and L has the same shape as input.
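As a sanity check, this pointwise formula can be reproduced by hand. The following is a minimal sketch (the shapes and random values are illustrative, not part of the API):

    import torch
    import torch.nn.functional as F

    x = F.log_softmax(torch.randn(3, 5), dim=1)  # input: log-probabilities
    y = F.softmax(torch.randn(3, 5), dim=1)      # target: probabilities

    loss = torch.nn.KLDivLoss(reduction='none')(x, y)
    manual = y * (y.log() - x)                   # l_n = y_n * (log y_n - x_n)
    assert torch.allclose(loss, manual)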
In default reduction mode 'mean', the losses are averaged for each minibatch over observations as well as over dimensions. 'batchmean' mode gives the correct KL divergence, where losses are averaged over the batch dimension only. 'mean' mode's behavior will be changed to the same as 'batchmean' in the next major release.
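The difference between the two modes can be seen directly. A minimal sketch, with an illustrative batch of 4 distributions over 10 classes:

    import torch
    import torch.nn.functional as F

    x = F.log_softmax(torch.randn(4, 10), dim=1)
    y = F.softmax(torch.randn(4, 10), dim=1)

    unreduced = F.kl_div(x, y, reduction='none')
    # 'mean' divides the summed loss by all 4 * 10 elements...
    assert torch.allclose(F.kl_div(x, y, reduction='mean'), unreduced.mean())
    # ...while 'batchmean' divides by the batch size only, which is the
    # mathematically correct KL divergence between the distributions.
    assert torch.allclose(F.kl_div(x, y, reduction='batchmean'),
                          unreduced.sum() / x.size(0))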
size_average (bool, optional) – Deprecated (see reduction). By default, the losses are averaged over each loss element in the batch. Note that for some losses, there are multiple elements per sample. If the field size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True
reduce (bool, optional) – Deprecated (see reduction). By default, the losses are averaged or summed over observations for each minibatch depending on size_average. When reduce is False, returns a loss per batch element instead and ignores size_average. Default: True
reduction (string, optional) – Specifies the reduction to apply to the output: 'none' | 'batchmean' | 'sum' | 'mean'.
'none': no reduction will be applied.
'batchmean': the sum of the output will be divided by the batch size.
'sum': the output will be summed.
'mean': the output will be divided by the number of elements in the output. Default: 'mean'
log_target (bool, optional) – Specifies whether target is passed in the log space. Default: False
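As an illustration of log_target, passing the target already in log space should produce the same loss as passing it as probabilities. A minimal sketch with illustrative shapes:

    import torch
    import torch.nn.functional as F

    x = F.log_softmax(torch.randn(2, 6), dim=1)
    y = F.softmax(torch.randn(2, 6), dim=1)

    loss_prob = torch.nn.KLDivLoss(reduction='batchmean')(x, y)
    loss_log = torch.nn.KLDivLoss(reduction='batchmean', log_target=True)(x, y.log())
    assert torch.allclose(loss_prob, loss_log)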
Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction.
'mean'doesn’t return the true kl divergence value, please use
'batchmean'which aligns with KL math definition. In the next major release,
'mean'will be changed to be the same as
Shape:
Input: (N, *), where * means any number of additional dimensions
Target: (N, *), same shape as the input
Output: scalar by default. If reduction is 'none', then (N, *), the same shape as the input
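A minimal sketch of this shape contract, using an illustrative (8, 3, 7) input:

    import torch
    import torch.nn.functional as F

    x = F.log_softmax(torch.randn(8, 3, 7), dim=-1)  # (N, *) log-probabilities
    y = F.softmax(torch.randn(8, 3, 7), dim=-1)      # same shape as the input

    assert torch.nn.KLDivLoss(reduction='mean')(x, y).dim() == 0  # scalar
    assert torch.nn.KLDivLoss(reduction='none')(x, y).shape == x.shape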