
LazyBatchNorm1d

class torch.nn.LazyBatchNorm1d(eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, device=None, dtype=None)[source]

A torch.nn.BatchNorm1d module with lazy initialization.

The num_features argument of the underlying BatchNorm1d is inferred lazily from input.size(1) on the first forward pass, so it does not have to be given at construction time. The attributes that will be lazily initialized are weight, bias, running_mean and running_var.

Check the torch.nn.modules.lazy.LazyModuleMixin for further documentation on lazy modules and their limitations.
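A minimal sketch of the lazy shape inference (tensor shapes and variable names here are illustrative, not prescribed by the API):

import torch
import torch.nn as nn

# num_features is omitted; it is inferred from input.size(1) on the first forward pass.
bn = nn.LazyBatchNorm1d()
print(bn.weight)               # <UninitializedParameter>: no shape yet

x = torch.randn(8, 16)         # 8 samples with 16 features (shapes chosen arbitrarily)
out = bn(x)                    # num_features is inferred as 16 here

print(bn.weight.shape)         # torch.Size([16])
print(bn.running_mean.shape)   # torch.Size([16])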

Parameters
  • eps (float) – a value added to the denominator for numerical stability. Default: 1e-5

  • momentum (Optional[float]) – the value used for the running_mean and running_var computation. Can be set to None for cumulative moving average (i.e. simple average). Default: 0.1

  • affine (bool) – a boolean value that, when set to True, gives this module learnable affine parameters. Default: True

  • track_running_stats (bool) – a boolean value that, when set to True, makes this module track the running mean and variance. When set to False, no such statistics are tracked and the buffers running_mean and running_var are initialized as None; in that case the module always uses batch statistics, in both training and eval modes (see the sketch after this list). Default: True
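A rough illustration of the track_running_stats flag, with batch shapes chosen arbitrarily:

import torch
import torch.nn as nn

# With track_running_stats=False no running statistics are kept, and batch
# statistics are used for normalization even in eval mode.
bn = nn.LazyBatchNorm1d(track_running_stats=False)
bn(torch.randn(8, 4))          # num_features inferred as 4
print(bn.running_mean)         # None: no running statistics buffer
bn.eval()
out = bn(torch.randn(8, 4))    # still normalized with this batch's statistics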

cls_to_become[source]

alias of BatchNorm1d
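The alias reflects that, like other lazy modules, this module replaces its own class with cls_to_become once its parameters have been materialized. A minimal sketch of that behavior (input shape chosen arbitrarily):

import torch
import torch.nn as nn

bn = nn.LazyBatchNorm1d()
print(type(bn).__name__)       # LazyBatchNorm1d

bn(torch.randn(4, 10))         # first forward pass materializes the parameters
print(type(bn).__name__)       # BatchNorm1d: the module now behaves as its cls_to_become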
