torch.nn.functional.silu
- torch.nn.functional.silu(input, inplace=False)[source]
Apply the Sigmoid Linear Unit (SiLU) function, element-wise.
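The SiLU function is defined as silu(x) = x ∗ σ(x), where σ(x) is the logistic sigmoid.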
The SiLU function is also known as the swish function.
Note
See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was later experimented with.
See SiLU for more details.
- Return type
  Tensor
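Example (a minimal sketch, not part of the original page, assuming only the public torch and torch.nn.functional APIs): it checks that silu(input) agrees with input * sigmoid(input) and shows the inplace variant.

import torch
import torch.nn.functional as F

x = torch.randn(3)
y = F.silu(x)                          # element-wise x * sigmoid(x)
assert torch.allclose(y, x * torch.sigmoid(x))

F.silu(x, inplace=True)                # inplace=True overwrites x with the result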