torch.nn.functional.feature_alpha_dropout

torch.nn.functional.feature_alpha_dropout(input, p=0.5, training=False, inplace=False)[source]

Randomly masks out entire channels (a channel is a feature map).

For example, the j-th channel of the i-th sample in the batched input is the tensor input[i, j]. Instead of setting activations to zero, as in regular Dropout, the activations are set to the negative saturation value of the SELU activation function.

Each channel will be masked independently on every forward call with probability p using samples from a Bernoulli distribution. The channels to be masked are randomized on every forward call, and the output is scaled and shifted to maintain zero mean and unit variance.

See FeatureAlphaDropout for details.
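A minimal sketch of the channel-wise behavior (the tensor shape, seed, and detection loop below are illustrative choices, not part of the API): a masked channel comes out filled with one shared constant derived from the SELU negative saturation, while surviving channels are merely scaled and shifted.

import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Batch of 2 samples, 3 channels, 4x4 feature maps (illustrative shape).
x = torch.randn(2, 3, 4, 4)

# training=True must be passed explicitly; the functional default is False.
out = F.feature_alpha_dropout(x, p=0.5, training=True)

# A masked channel is constant-valued: every element equals the
# affine-transformed SELU negative saturation value, not zero.
for i in range(out.size(0)):
    for j in range(out.size(1)):
        ch = out[i, j]
        if torch.all(ch == ch.flatten()[0]):
            print(f"sample {i}, channel {j} masked -> {ch.flatten()[0]:.4f}")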

Parameters
  • input (Tensor) – the input tensor

  • p (float) – probability of a channel to be masked. Default: 0.5

  • training (bool) – apply dropout if True. Default: False

  • inplace (bool) – If set to True, will do this operation in-place. Default: False

Return type

Tensor
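A quick empirical check of the zero-mean, unit-variance claim, assuming a roughly standard-normal input (the shape, seed, and number of trials are arbitrary): aggregating output statistics over many forward calls should land near mean 0 and standard deviation 1.

import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Standard-normal input: ~zero mean, unit variance by construction.
x = torch.randn(64, 32, 8, 8)

# Aggregate statistics over many independent forward calls.
outs = torch.stack([F.feature_alpha_dropout(x, p=0.5, training=True)
                    for _ in range(100)])
print(f"mean ~ {outs.mean().item():.3f}, std ~ {outs.std().item():.3f}")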
