
torch.nn.functional.softplus

torch.nn.functional.softplus(input, beta=1, threshold=20) → Tensor

Applies, element-wise, the function $\text{Softplus}(x) = \frac{1}{\beta} \log(1 + \exp(\beta x))$.

For numerical stability, the implementation reverts to the linear function when $input \times \beta > threshold$.

See Softplus for more details.
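A minimal sketch of the behavior described above: for moderate inputs the function computes $\frac{1}{\beta}\log(1 + \exp(\beta x))$, while inputs beyond the threshold pass through unchanged. The input values here are illustrative, not taken from the documentation.

```python
import math
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0, 25.0])
y = F.softplus(x, beta=1, threshold=20)

# Softplus(0) = log(1 + exp(0)) = log(2)
print(y[1].item())  # ~0.6931

# 25.0 * beta exceeds the threshold of 20, so the output
# falls back to the identity (linear) function: softplus(25) == 25
print(y[3].item())
```

Note that `softplus` is a smooth approximation of ReLU: it is strictly positive everywhere and approaches `max(0, x)` as `beta` grows.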
