torch.nn.functional.feature_alpha_dropout(input, p=0.5, training=False, inplace=False)
Randomly masks out entire channels (a channel is a feature map, e.g. the j-th channel of the i-th sample in the batched input is a tensor input[i, j]) of the input tensor. Instead of setting activations to zero, as in regular Dropout, the activations are set to the negative saturation value of the SELU activation function.
Each channel will be masked independently on every forward call with probability p, using samples from a Bernoulli distribution. The channels to be masked are randomized on every forward call, and the output is scaled and shifted to maintain zero mean and unit variance.
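The channel-wise masking and the zero-mean/unit-variance property can be checked empirically. A minimal sketch (shapes and seed are arbitrary); it relies on the fact that a masked channel becomes a single constant value after the affine correction:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1000, 32, 8, 8)  # standard-normal input: zero mean, unit variance
out = F.feature_alpha_dropout(x, p=0.5, training=True)

# A dropped channel is replaced wholesale, so every element of that
# feature map holds the same constant value after the scale-and-shift.
flat = out.flatten(2)                          # (batch, channels, H*W)
masked = (flat == flat[..., :1]).all(dim=-1)   # channels that are constant
print(f"fraction of masked channels: {masked.float().mean().item():.2f}")  # roughly p

# The scale-and-shift keeps the output distribution near zero mean, unit variance.
print(round(out.mean().item(), 2), round(out.var().item(), 2))
```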
p – dropout probability of a channel to be zeroed. Default: 0.5
training – apply dropout if set to True. Default: False
inplace – If set to True, will do this operation in-place. Default: False
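A minimal usage sketch. Note that training defaults to False, so the call is a no-op unless dropout is explicitly requested:

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 8, 5, 5)  # 4 samples, 8 feature-map channels of size 5x5

# Default training=False: the input passes through unchanged.
eval_out = F.feature_alpha_dropout(x, p=0.5)
print(torch.equal(eval_out, x))  # True

# training=True: channels are randomly masked; the shape is preserved.
train_out = F.feature_alpha_dropout(x, p=0.5, training=True)
print(train_out.shape)  # torch.Size([4, 8, 5, 5])
```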