torch.nn.functional.glu

torch.nn.functional.glu(input, dim=-1) → Tensor

The gated linear unit. Computes:

\text{GLU}(a, b) = a \otimes \sigma(b)

where input is split in half along dim to form a and b, \sigma is the sigmoid function, and \otimes is the element-wise product between matrices.

See Language Modeling with Gated Convolutional Networks.

Parameters
  • input (Tensor) – input tensor

  • dim (int) – dimension on which to split the input. Default: -1

Return type

Tensor
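To make the split-and-gate computation concrete, here is a minimal pure-Python sketch of the same math on a flat list — not the real API, just an illustration of what torch.nn.functional.glu does along the chosen dimension:

```python
import math

def glu(values):
    # Sketch of the GLU computation on a flat list: split the input in
    # half to form a and b, then gate a element-wise with sigmoid(b).
    # torch.nn.functional.glu applies this along an arbitrary tensor dim.
    n = len(values)
    assert n % 2 == 0, "input size must be even along the split dimension"
    a, b = values[:n // 2], values[n // 2:]
    return [x * (1.0 / (1.0 + math.exp(-g))) for x, g in zip(a, b)]

out = glu([1.0, 2.0, 0.0, 0.0])
# sigmoid(0) = 0.5, so each element of a is halved
print(out)  # [0.5, 1.0]
```

Note that the output has half the size of the input along the split dimension, so that dimension must have an even size.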
