GELU

class torch.nn.GELU[source]

Applies the Gaussian Error Linear Units function:

GELU(x) = x * Φ(x)

where Φ(x) is the cumulative distribution function of the standard Gaussian distribution.
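Since Φ(x) for a standard normal variable equals 0.5 * (1 + erf(x / √2)), the definition above can be checked numerically against torch.nn.functional.gelu; a minimal sketch, assuming the default (exact, non-approximate) implementation:

>>> import torch
>>> x = torch.randn(5)
>>> phi = 0.5 * (1.0 + torch.erf(x / 2 ** 0.5))  # Φ(x) via the error function
>>> manual = x * phi                             # GELU(x) = x * Φ(x)
>>> torch.allclose(manual, torch.nn.functional.gelu(x))
True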

Shape:
  • Input: (*), where * means any number of dimensions.

  • Output: (*), same shape as the input.

[Figure: plot of the GELU activation function]

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.GELU()
>>> input = torch.randn(2)
>>> output = m(input)
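Per the shape contract above, the module also accepts inputs of any rank; a short sketch with a 3-D tensor:

>>> m = nn.GELU()
>>> x = torch.randn(4, 3, 5)
>>> m(x).shape  # applied elementwise, so the shape is preserved
torch.Size([4, 3, 5])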
