# GELU

class torch.nn.GELU

Applies the Gaussian Error Linear Units function:

$\text{GELU}(x) = x * \Phi(x)$

where $\Phi(x)$ is the cumulative distribution function of the standard Gaussian distribution.

Shape:
• Input: $(N, *)$ where $*$ means any number of additional dimensions

• Output: $(N, *)$, same shape as the input

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.GELU()
>>> input = torch.randn(2)
>>> output = m(input)
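Since $\Phi(x)$ can be written with the error function as $\frac{1}{2}\left(1 + \operatorname{erf}(x/\sqrt{2})\right)$, the activation is easy to verify by hand. A minimal pure-Python sketch (the helper name `gelu` is ours, not part of the PyTorch API):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # expressed via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# GELU(0) is exactly 0; for large positive x the output approaches x,
# and for large negative x it approaches 0.
print(gelu(0.0))
print(gelu(3.0))
print(gelu(-3.0))
```

Values produced this way should match `nn.GELU()` elementwise (up to floating-point precision), which makes the scalar helper a handy sanity check.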