# torch.nn.functional.feature_alpha_dropout¶

torch.nn.functional.feature_alpha_dropout(input, p=0.5, training=False, inplace=False)[source]

Randomly masks out entire channels of the input tensor (a channel is a feature map; e.g., the $j$-th channel of the $i$-th sample in the batched input is the tensor $\text{input}[i, j]$). Instead of setting activations to zero, as in regular Dropout, the activations are set to the negative saturation value of the SELU activation function.

Each channel is masked independently on every forward call with probability p, using samples from a Bernoulli distribution. The channels to be masked are randomized on every forward call, and the output is scaled and shifted to maintain zero mean and unit variance.

See FeatureAlphaDropout for details.

Parameters
• p – probability of a channel being masked. Default: 0.5

• training – apply dropout if True. Default: False

• inplace – If set to True, will do this operation in-place. Default: False
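A minimal usage sketch. Note that, per the signature above, the functional form defaults to training=False, so dropout must be enabled explicitly; the input shape (4, 8, 16, 16) is an arbitrary example of a batched feature-map tensor:

```python
import torch
import torch.nn.functional as F

# Batched input of shape (batch, channels, height, width);
# feature_alpha_dropout masks entire channels, not individual elements.
x = torch.randn(4, 8, 16, 16)

# Enable dropout explicitly: training defaults to False in the functional form.
out = F.feature_alpha_dropout(x, p=0.5, training=True)
assert out.shape == x.shape

# With training=False (the default), the call returns the input unchanged.
same = F.feature_alpha_dropout(x, p=0.5)
assert torch.equal(same, x)
```

For use inside an nn.Module, the module form FeatureAlphaDropout tracks train/eval mode automatically, which avoids passing training by hand.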