- class torch.nn.Softmax(dim=None)
Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.
Softmax is defined as:

Softmax(x_i) = exp(x_i) / Σ_j exp(x_j)
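The definition above can be checked numerically: a sketch that computes the formula by hand and compares it against the built-in `torch.softmax` (the comparison along `dim=-1` is an illustrative choice, not part of the original text).

```python
import torch

# Compute Softmax(x_i) = exp(x_i) / sum_j exp(x_j) by hand
# along the last dimension, then compare with the built-in.
x = torch.tensor([[1.0, 2.0, 3.0]])
manual = torch.exp(x) / torch.exp(x).sum(dim=-1, keepdim=True)
builtin = torch.softmax(x, dim=-1)

# The two agree up to floating-point tolerance, and each
# row of the result sums to 1.
print(torch.allclose(manual, builtin))
print(manual.sum(dim=-1))
```

Note that the manual version can overflow for large inputs; the built-in subtracts the per-slice maximum internally for numerical stability.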
When the input Tensor is a sparse tensor then the unspecified values are treated as -inf.
- Shape:
Input: (*), where * means any number of additional dimensions
Output: (*), same shape as the input
- Returns:
a Tensor of the same dimension and shape as the input with values in the range [0, 1]
- Parameters:
dim (int) – A dimension along which Softmax will be computed (so every slice along dim will sum to 1).
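To make the role of `dim` concrete, a short sketch contrasting `dim=1` (each row sums to 1) with `dim=0` (each column sums to 1); the tensor shape here is an arbitrary example.

```python
import torch
import torch.nn as nn

t = torch.randn(2, 3)

# dim=1: softmax is taken across each row, so every row sums to 1
rows = nn.Softmax(dim=1)(t)

# dim=0: softmax is taken down each column, so every column sums to 1
cols = nn.Softmax(dim=0)(t)

print(rows.sum(dim=1))  # each of the 2 row sums is ~1
print(cols.sum(dim=0))  # each of the 3 column sums is ~1
```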
- Return type:
Tensor
This module doesn’t work directly with NLLLoss, which expects the Log to be computed between the Softmax and itself. Use LogSoftmax instead (it’s faster and has better numerical properties).
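The pairing the note describes can be sketched as follows: `NLLLoss` consumes log-probabilities, so it composes with `LogSoftmax` rather than `Softmax`, and the combination matches `CrossEntropyLoss` applied to the raw logits (the shapes and targets below are illustrative).

```python
import torch
import torch.nn as nn

logits = torch.randn(2, 3)          # raw scores for 2 samples, 3 classes
target = torch.tensor([0, 2])       # class index per sample

# NLLLoss expects log-probabilities, so pair it with LogSoftmax...
log_probs = nn.LogSoftmax(dim=1)(logits)
loss = nn.NLLLoss()(log_probs, target)

# ...which is equivalent to CrossEntropyLoss on the raw logits.
ce = nn.CrossEntropyLoss()(logits, target)
print(torch.allclose(loss, ce))
```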
- Examples:
>>> m = nn.Softmax(dim=1)
>>> input = torch.randn(2, 3)
>>> output = m(input)