torch.nn.functional.feature_alpha_dropout
- torch.nn.functional.feature_alpha_dropout(input, p=0.5, training=False, inplace=False)[source]
Randomly masks out entire channels (a channel is a feature map).
For example, the j-th channel of the i-th sample in the batched input is a tensor input[i, j] of the input tensor. Instead of setting activations to zero, as in regular Dropout, the activations are set to the negative saturation value of the SELU activation function.
Each element will be masked independently on every forward call with probability p, using samples from a Bernoulli distribution. The elements to be masked are re-randomized on every forward call, and the outputs are scaled and shifted to maintain zero mean and unit variance. See FeatureAlphaDropout for details.
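Example (a minimal usage sketch; the input shape and dropout probability below are illustrative choices, not prescribed by the API):

>>> import torch
>>> import torch.nn.functional as F
>>> x = torch.randn(4, 16, 8, 8)  # (batch, channels, H, W): each channel is a feature map
>>> out = F.feature_alpha_dropout(x, p=0.2, training=True)  # masks entire channels per sample
>>> out.shape
torch.Size([4, 16, 8, 8])
>>> torch.equal(F.feature_alpha_dropout(x, p=0.2, training=False), x)  # no-op when training=False
True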