# Softplus

class torch.nn.Softplus(beta=1, threshold=20)

Applies the Softplus function $\text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))$ element-wise.

Softplus is a smooth approximation of the ReLU function and can be used to constrain the output of a model to always be positive.

For numerical stability, the implementation reverts to the linear function when $\text{input} \times \beta > \text{threshold}$.
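As a rough sketch of this thresholding (using a hypothetical helper, softplus_sketch, not the library's actual implementation), one can clamp the argument of exp before evaluating the logarithm and select the linear branch where it applies:

>>> import torch
>>> def softplus_sketch(x, beta=1.0, threshold=20.0):
...     # Hypothetical helper: clamp the exp argument for stability and
...     # revert to the identity where beta * x exceeds the threshold.
...     scaled = beta * x
...     safe = torch.clamp(scaled, max=threshold)
...     return torch.where(scaled > threshold, x, torch.log1p(torch.exp(safe)) / beta)
>>> x = torch.tensor([-2.0, 0.0, 100.0])
>>> bool(torch.allclose(softplus_sketch(x), torch.nn.functional.softplus(x)))
True

Without the clamp, $\exp(\beta \times x)$ would overflow to inf for large inputs even though the linear branch is ultimately selected.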

Parameters
• beta – the $\beta$ value for the Softplus formulation. Default: 1

• threshold – values above this revert to a linear function. Default: 20
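For example, a larger beta sharpens the curve toward ReLU, while threshold controls where the linear shortcut takes over; a small usage sketch with non-default arguments:

>>> import torch
>>> from torch import nn
>>> m = nn.Softplus(beta=2, threshold=15)
>>> m(torch.tensor([8.0]))  # beta * x = 16 > 15, so the output is the input itself
tensor([8.])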

Shape:
• Input: $(*)$, where $*$ means any number of dimensions.

• Output: $(*)$, same shape as the input.
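Concretely, the function is applied element-wise, so any input shape is preserved:

>>> import torch
>>> from torch import nn
>>> nn.Softplus()(torch.randn(3, 4, 5)).shape
torch.Size([3, 4, 5])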

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.Softplus()
>>> input = torch.randn(2)
>>> output = m(input)
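As a quick check of the positivity claim above (assuming only standard torch ops, continuing from the imports in the previous example):

>>> x = torch.linspace(-5.0, 5.0, steps=11)
>>> sp = nn.Softplus()(x)
>>> bool(torch.all(sp > 0))
True
>>> bool(torch.all(sp >= torch.relu(x)))  # Softplus smoothly upper-bounds ReLU
True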