# GELU¶

class torch.nn.GELU[source]

Applies the Gaussian Error Linear Units function:

$\text{GELU}(x) = x * \Phi(x)$

where $\Phi(x)$ is the Cumulative Distribution Function of the standard Gaussian distribution, i.e. $\Phi(x) = \frac{1}{2}\left(1 + \operatorname{erf}\left(x / \sqrt{2}\right)\right)$.
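As a quick sanity check, the module's output can be compared against the closed-form definition above, computing $\Phi(x)$ directly with `torch.erf`:

```python
import torch

# GELU(x) = x * Phi(x), where Phi is the standard normal CDF:
# Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
x = torch.linspace(-3.0, 3.0, steps=7)

gelu = torch.nn.GELU()
manual = x * 0.5 * (1.0 + torch.erf(x / (2.0 ** 0.5)))

print(torch.allclose(gelu(x), manual, atol=1e-6))
```

This agreement holds for the exact (erf-based) form of GELU; a tanh-based approximation of the same function would differ slightly.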

Shape:
• Input: $(*)$, where $*$ means any number of dimensions.

• Output: $(*)$, same shape as the input.

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.GELU()
>>> input = torch.randn(2)
>>> output = m(input)